Sneaky Algorithms – How to Pay Gig Workers Less
“algorithmic wage discrimination”
One of the most troubling aspects of using algorithms to make economic or demographic decisions, common in all sorts of online and mobile apps, is that racial, ethnic and gender biases, inherent and perhaps subconscious in the minds of the programmers, often sneak into the software design. Even the vocabulary selected to express options on a website or mobile app carries that inherent bias. But what happens when those biases are intentional? When cost-cutting and efficiency may be the stated goals, but the resulting metrics put those biases on steroids?
In the gig economy, those biased programs are working overtime to cut corners. According to new research (to be published in the Columbia Law Review) from Veena Dubal, a law professor at the University of California (Hastings) in San Francisco, “Uber Eats and Postmates use black-box algorithms to determine how drivers get paid. As a result, the amount they earn for a delivery is going to be different each time and will differ from what co-workers earn,” writes Brian Merchant in the April 11th Los Angeles Times. What’s even more remarkable, those algorithms constantly update themselves… so that the same delivery by the same individual gig worker just might result in very different net pay rates at different times.
Companies use algorithms to analyze consumer behavior. Netflix, for example, has a set of ultra-secret algorithms it uses to determine what programming to produce, with metrics on the individual actors, the writers, the genre, the directors, etc. While no decision is made entirely by a computer, those variables are definitely taken into consideration when decisions are made. The order patterns and viewing habits of individual members of a household constantly update those analytics. That metadata is exceptionally valuable. Tracking consumer behavior, your behavior, is a massively lucrative industry.
But if you are a gig worker in the delivery or ride-share business, what you get paid is not remotely within your control. “Dubal calls this ‘algorithmic wage discrimination,’ and it’s a pernicious trend that has flown under the radar for too long. It’s a phenomenon that, she says, can reduce your pay, undermine efforts to organize your workplace, and exacerbate racial and gender discrimination. And it stands to be supercharged by the rise of AI.
“In her paper… Dubal details this new kind of wage discrimination and what it looks like in practice. It starts with data collection… Companies such as Uber, Instacart and Amazon are constantly collecting reams of granular data about the contract workers who use their platforms — where they live and work, what times of day and for how long they tend to work, what their earnings targets are and which kinds of jobs they are willing to accept. Dubal said these companies ‘use that data to personalize and differentiate wages for workers in ways unknown to them.’
“In most cases, workers are given only two choices for each job they’re offered on a platform — accept or decline — and they have no power to negotiate their rates. With the asymmetric information advantage all on their side, companies are able to use the data they’ve gathered to ‘calculate the exact wage rates necessary to incentivize desired behaviors.’
“One of those desired behaviors is staying on the road as long as possible, so workers might be available to meet the always-fluctuating levels of demand. As such, Dubal writes, the companies are motivated ‘to elongate the time between sending fares to any one driver’ — just as long as they don’t get so impatient waiting for a ride they end their shift. Remember, Uber drivers are not paid for any time they are not ‘engaged,’ which is often as much as 40% of a shift, and they have no say in when they get offered rides, either. ‘The company’s machine-learning technologies may even predict the amount of time a specific driver is willing to wait for a fare,’ Dubal writes.
“If the algorithm can predict that one worker in the region with a higher acceptance rate will take that sushi delivery for $4 instead of $5 — they’ve been waiting for what seems like forever at this point — it may, according to the research, offer them a lower rate. If the algorithm can predict that a given worker will keep going until he or she hits a daily goal of $200, Dubal says, it might lower rates on offer, making that goal harder to hit, to keep them working longer… This is algorithmic wage discrimination… ‘It’s basically variable pay that’s personalized to individuals based on what is really, really a lot of data that’s accumulated on those workers while they’re working,’ Dubal says.” LA Times.
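How might such a system work mechanically? What follows is a minimal, purely hypothetical Python sketch. Every feature, weight, and threshold in it is invented, since none of these companies disclose their actual models; it only illustrates the mechanism Dubal describes: use behavioral data to estimate the lowest pay a given worker is still predicted to accept, then quote each worker a different price for the same job.

```python
# Purely illustrative toy model of "algorithmic wage discrimination."
# Nothing here reflects any real platform's code: the features, weights,
# and logistic model are invented to show the mechanism Dubal describes.
from dataclasses import dataclass
import math


@dataclass
class WorkerProfile:
    # The kinds of behavioral data the paper says platforms collect.
    acceptance_rate: float   # share of past offers accepted (0 to 1)
    minutes_idle: float      # how long they have been waiting for a job
    daily_target: float      # inferred earnings goal, in dollars
    earned_today: float      # what they have made so far, in dollars


def accept_probability(w: WorkerProfile, offer: float, base_rate: float) -> float:
    """Hypothetical logistic model: a high acceptance rate, a long wait,
    and an unmet daily goal all predict a worker will take a lower offer."""
    eagerness = (1.5 * w.acceptance_rate
                 + 0.03 * w.minutes_idle
                 + 2.0 * max(0.0, 1 - w.earned_today / w.daily_target))
    score = 8.0 * (offer / base_rate - 1.0) + eagerness
    return 1 / (1 + math.exp(-score))


def personalized_offer(w: WorkerProfile, base_rate: float,
                       target_accept: float = 0.9) -> float:
    """Walk the offer down from the base rate in $0.25 steps, stopping at
    the cheapest price the model still expects this worker to accept."""
    offer = base_rate
    while offer > 0.5 * base_rate:  # arbitrary floor for the sketch
        if accept_probability(w, offer - 0.25, base_rate) < target_accept:
            break
        offer -= 0.25
    return offer


# Same delivery with a $5.00 base rate, two workers, two different offers.
patient = WorkerProfile(acceptance_rate=0.4, minutes_idle=5,
                        daily_target=200, earned_today=180)
eager = WorkerProfile(acceptance_rate=0.95, minutes_idle=45,
                      daily_target=200, earned_today=40)
print(personalized_offer(patient, base_rate=5.0))  # 5.0  -- stays at base
print(personalized_offer(eager, base_rate=5.0))    # 3.75 -- pushed below $4
```

Even in this toy version, the worker with the higher acceptance rate, the longer wait and the unmet daily goal is quoted a lower price for the identical delivery, which is precisely the dynamic the paper labels algorithmic wage discrimination.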
As we grow increasingly concerned that artificial intelligence will replace even high-level professional services, it is interesting (in a bad way) to understand these little intrusions into our daily lives. But what is increasingly troubling is how these variables operate in relative secrecy and are based on data generated by peering into the back windows of our online lives. Metadata is bought and sold by the ton… and its buyers hold a great deal of personal information on each and every one of us… and even on our children.
I’m Peter Dekom, and technology is so far beyond our legislative ability to control it that those holding this personal information have more power over us than we could possibly imagine.