The science fiction writer Isaac Asimov once proposed three laws of robotics, starting with this clear directive: “A robot may not injure a human being.”
A question for society right now is whether that rule should apply to algorithms, that is to say, robot software. Should artificial intelligence be allowed to injure a human being? This pressing question applies, perhaps surprisingly, to the expanding gig economy.
The documentary we’re watching this month, “The Gig Is Up,” written and directed by Canadian filmmaker Shannon Walsh, is a global exploration of the opportunity, and sometimes the heartbreaking cost, of app-driven work. There’s the overriding question of whether the pay for driving for a car service or delivering restaurant food is appropriate or fair. There are significant health and safety issues. And here’s a piece that demands consideration: the algorithms that rate a gig worker’s performance. How do they work? And what is the recourse if a robotic performance reviewer decides a gig worker is no good?
There’s a scene in the film showing people who are directed by app to ferry fast food to customers by bicycle. Consider just one bane of these delivery cyclists’ existence: cups with lids that pop open when jostled. The goal is to get KFC or McDonald’s through traffic on two wheels, which inevitably leads to spillage of Coca-Cola, which leads, in turn, to poor customer reviews. These digital demerits, processed by the delivery platform’s algorithm, can add up to the point that a delivery person gets suspended or loses the gig entirely. One lesson here is that customers should think more carefully before knocking someone by using a single star — the problem may not be the delivery person’s incompetence, but the system as designed.
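The film doesn’t reveal any platform’s actual code, but the mechanism it describes, where customer ratings feed an automated threshold that can suspend a worker, can be sketched in a few lines. Everything below is an assumption for illustration: the window size, the cutoff, and the five-star scale are invented, not drawn from any real delivery app.

```python
# Hypothetical illustration only: gig platforms do not publish their rating logic.
# This sketch assumes a simple rule: average a courier's most recent ratings
# and flag the account for suspension when that average drops below a cutoff.
from collections import deque

WINDOW = 20          # assumed number of recent ratings considered
SUSPEND_BELOW = 4.0  # assumed cutoff on a 1-to-5-star scale

def should_suspend(ratings, window=WINDOW, cutoff=SUSPEND_BELOW):
    """Return True if the rolling average of recent ratings falls below the cutoff."""
    recent = deque(ratings, maxlen=window)  # keep only the last `window` ratings
    if not recent:
        return False
    return sum(recent) / len(recent) < cutoff

# A short run of one-star reviews over spilled drinks can tip an otherwise
# strong record: 15 five-star deliveries followed by 6 spillage complaints.
history = [5] * 15 + [1] * 6
print(should_suspend(history))  # the rolling average has slipped below 4.0
```

Under these made-up numbers, the courier is punished for a packaging problem (the pop-open lid) rather than for anything within their control, which is exactly the worry the film raises.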
Algorithms may be good, but they cannot be perfect. Worker Info Exchange, an advocacy group for gig workers based in the U.K., just published a report, “Managed by Bots: Data-Driven Exploitation in the Gig Economy.” It contains accounts from workers in car service, food delivery and other fields who say they fell into a Kafkaesque hellscape when they tried to dispute bans they believed were erroneous.
The European Union requires gig platforms to make sure that systems used to judge and discipline workers are open and transparent. But The Financial Times notes that legal cases in Italy, the Netherlands and beyond suggest companies do not always follow the rules. This month, Worker Info Exchange and London-based Privacy International started a public campaign to push some big gig platforms to reveal how decisions made by artificial intelligence can be questioned and reviewed. Some companies have said they enlist humans to make final decisions.
There’s a much-discussed episode of the dystopian TV series “Black Mirror” (“Nosedive”) set in a near future where big data on individuals is aggregated into a single numerical score. That score can guide everything from credit ratings to whether you’re allowed a certain seat on a plane.
If science fiction doesn’t get your attention, consider reality. We’ve reported that China has developed a social credit system that dings people for poor credit, yes, but also for showing up at a political protest or crossing the street against the light. A lower social credit score can even hurt a person’s chances of finding an online date.
It may be that we are all headed into what could be called the algorithm economy.
“The Gig Is Up” is available on a variety of streaming platforms. If you’ve watched it, let the Econ Extra Credit team know what you thought: email@example.com. We’ll feature a selection of responses in an upcoming newsletter.