Opinion

Algorithmic management in the workplace isn’t all that new, but regulators have been too slow to respond to it. Agustin Marcarian/Reuters

Vass Bednar is a contributing columnist for The Globe and Mail and host of the new podcast, Lately. She is the executive director of McMaster University’s master of public policy program and co-author of the forthcoming book The Big Fix: How Companies Capture Markets and Harm Canadians.

It can be exhilarating to pull the handle on a slot machine and see where the dials land. But that same uncertainty has become a defining feature of app-based work, which is increasingly managed by opaque and unaccountable algorithms that constantly calibrate the price of work. The micropayments for the various tasks completed through these platforms are gamified and randomized. It’s not fun.

Imagine doing the same type of work every day, but dealing with modest, unexplainable discrepancies in your pay for totally identical tasks. It’s never clear how much you will earn for that draft, that e-mail, that presentation or closing out a complex task in the company’s project management software. While it’s somewhat normal for work to be stressful, it’s unfair and exploitative for remuneration to be so unpredictable. But the larger injustice is that we’re not really doing anything about this practice embedded in the gig economy – and our inaction could let it spill into white-collar work.

This isn’t about, say, paying Uber drivers more or paying them less based on time of day, location or distance travelled. A recent video from the U.S.-based non-profit More Perfect Union showed how gig platforms such as Uber and Lyft are paying different drivers slightly different amounts for the exact same ride.

Algorithmic management in the workplace isn’t all that new, but regulators have been too slow to respond to it. And this is a problem.

While this form of work isn’t for everyone, gig platforms offer individuals flexibility and have low barriers to entry that allow people to start earning quickly – desirable qualities that shouldn’t come at the cost of a wage floor and payment predictability.

And the rest of us keep using the apps to get food delivered, take rides and schedule home cleanings, perpetuating the manipulation of workers who use Handy, TaskRabbit and other programs to get connected to jobs.

Veena Dubal, a law professor at the University of California, Irvine School of Law, calls this algorithmic wage discrimination – the practice of determining individual workers’ earnings through dynamic formulas that continuously adjust based on detailed data such as location, personal behaviour, demand, supply and other variables – even for broadly similar work.

What happens when this practice finally migrates outside of the gig economy? It is the next step for data-driven exploitative practices being adopted in the absence of policy direction. Wages risk becoming ruthlessly randomized, unless we step up and protect workers.

In its logistics and warehouse operations, Amazon already uses algorithms to set wage incentives, using data-driven performance metrics that can influence hourly wages, bonuses and even job retention. It’s dynamic pricing, but for work.
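To make the mechanism concrete, here is a purely hypothetical sketch of what a dynamic-pay formula could look like – the function name, weights and inputs are all invented for illustration and do not reflect any platform’s actual algorithm. The point it demonstrates is the one above: once pay depends on market conditions plus a worker-specific behavioural signal plus random variation, two people completing an identical task can be offered different amounts.

```python
import random

def offered_pay(base_rate, demand, supply, accept_rate):
    """Toy dynamic-pay formula (illustrative only): scales a base
    rate by market conditions and a worker-specific behavioural
    signal. Every weight here is an invented assumption."""
    market_multiplier = demand / max(supply, 1)
    # A worker who historically accepts most offers can be offered
    # slightly less -- the opaque "personalization" step.
    personal_discount = 0.9 if accept_rate > 0.8 else 1.0
    # Randomized micro-variation, like the slot machine above.
    noise = random.uniform(0.95, 1.05)
    return round(base_rate * market_multiplier * personal_discount * noise, 2)

# Two workers, identical task, identical market conditions --
# only their behavioural history differs:
eager = offered_pay(base_rate=10.0, demand=120, supply=100, accept_rate=0.9)
choosy = offered_pay(base_rate=10.0, demand=120, supply=100, accept_rate=0.4)
```

In this toy model, the worker who reliably accepts offers is systematically offered less for the same task, and neither worker can see why – which is the information asymmetry the column describes.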

While salaries and wages may always have some natural variance owing to seniority and training, software programs that monitor the availability and allocation of work can create different incentives for various work tasks. This allows firms to differentiate wages for workers in ways that are unknown or unclear to them, making payment unpredictable and opaque.

It’s normal for firms to pursue profit maximization, and we should expect companies to squeeze as much as they can from workers and consumers. But the information asymmetry between software programs and the people using them to work perpetuates an unreasonable power imbalance.

Jurisdictions such as Italy have introduced fines for companies that deploy this technology, faulting firms for not adequately explaining how their algorithm works and for not providing workers with mechanisms to challenge decisions made by algorithms. The EU’s Platform Work Directive introduces greater transparency over algorithmic management systems.

But that’s about the extent of such measures. In the absence of meaningful policy action to level the playing field, workers are left guessing. In 2021, more than 26,000 DoorDash workers in the U.S. organized on Facebook to decline all low-paying offers in an effort to “game” the algorithm that determines their payment. These workers have said the effort increased their average pay per order.

The federal government has completed a consultation on developing greater protections for gig workers, which addressed the need to improve fairness and transparency on digital platforms. Meanwhile, Ontario has spent the past two years working to establish a general minimum wage for digital-platform workers.

We have a massive opportunity to innovate and modernize labour standards to protect workers from the inevitable creep of this algorithmic micro-management and make this kind of work more humane. The least we can do is make these invisible strings perceptible and predictable, before we’re all tangled up in them. Time for a wild bet.
