What if your boss was an algorithm? What would you do if your employer suddenly fired you or cut your pay without telling you why, and refused to give you a reason when you asked for one?
This is not science fiction or some far-fetched future. Millions of people worldwide are working in the gig economy for companies like Uber, Deliveroo, Bolt, and Just Eat. And this could be the future of work for people outside the gig economy too, as surveillance technologies are creeping into the workplace – and the ‘work-from-home place’ in particular.
Who we are working with
To counter the surveillance that employers are subjecting workers to, and the power imbalance that workers face, we have partnered with Worker Info Exchange and the App Drivers and Couriers Union, who have been working on these issues and fighting to protect the rights of gig economy workers.
Worker Info Exchange (WIE) is a London-based non-profit organisation founded by James Farrar, an activist for worker rights in the gig economy who took Uber to the UK Supreme Court and won. The win forced Uber to recognise drivers as workers instead of self-employed contractors. Farrar understands that data is power, especially as Uber reportedly keeps workers in the dark about how the company makes decisions about work allocation, identity verification, and firing.
The App Drivers and Couriers Union (ADCU) was set up in 2013 and is the UK’s largest trade union for licensed private hire drivers and couriers. The union is concerned that drivers are mistreated by their employers and aims to protect their rights, demand change, and ensure that their collective voice is heard.
To understand the issues faced by gig economy workers, it is important to first understand how companies collect and use data to make decisions about their workers, including how work is allocated, how much money drivers are able to earn, and more.
The research undertaken so far
To challenge algorithmic management and demonstrate its impact, WIE has been using data subject access requests (DSARs) as a method of redress.
Algorithmic management can be defined as a set of technological tools and techniques used to remotely manage workforces, relying on data collection and surveillance of workers to enable automated or semi-automated decision-making.
A DSAR is a tool provided by European data protection laws that allows individuals to obtain personal data as well as a series of other information held on them by companies or organisations. You can find out more about subject access requests and how to submit one by exploring our guide here
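To make the mechanics concrete, here is a minimal sketch of how a subject access request letter could be assembled. The company name, sender, data categories, and all the wording are illustrative placeholders – this is not WIE’s template, a legal form, or legal advice; it simply shows the kind of information (identity, data categories, and a question about automated decision-making under Article 15(1)(h) GDPR) such a request typically contains.

```python
from datetime import date

def draft_dsar(company: str, requester: str, data_categories: list[str]) -> str:
    """Assemble a plain-text subject access request under Article 15 GDPR.

    Illustrative wording only -- not legal advice and not WIE's actual template.
    """
    # One bullet per category of personal data being requested
    categories = "\n".join(f"- {c}" for c in data_categories)
    return (
        f"Date: {date.today().isoformat()}\n"
        f"To: Data Protection Officer, {company}\n\n"
        "Subject: Data Subject Access Request (Article 15 GDPR)\n\n"
        "I am writing to request access to the personal data you hold about me,\n"
        "including the following categories:\n"
        f"{categories}\n\n"
        "Please also confirm the purposes of processing, the recipients of my\n"
        "data, and the existence of any automated decision-making, including\n"
        "profiling, as provided for by Article 15(1)(h) GDPR.\n\n"
        f"Kind regards,\n{requester}\n"
    )

# Example: a driver asking a hypothetical platform for core work-related data
letter = draft_dsar(
    "ExamplePlatform Ltd",
    "A. Driver",
    ["GPS/location history", "trip allocation records", "profile and rating scores"],
)
print(letter)
```

In practice a request like this is sent by email or through a company’s privacy portal, and the controller generally has one month to respond.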
WIE filed over 500 DSARs to seven companies to enable gig economy workers to better understand what type of information their employers hold on them and whether their employers are willing to comply with data protection and privacy obligations when it comes to data access. These companies were Amazon Flex, Bolt, Deliveroo, Free Now, Just Eat, Ola, and Uber.
PI has partnered with WIE and the ADCU to voice the concerns stemming from their research. To do this, we interviewed and filmed Uber drivers. Together, we tell the stories of drivers’ experiences when trying to understand why their employers took certain decisions about their employment. Some of these decisions had serious consequences.
We have reached out to Uber about our interviews and you can read Uber’s response below. In its response, Uber emphasised that the aim of the technology is to keep everyone, both riders and drivers, safe.
Key findings from WIE’s research
Worker Info Exchange has published its latest report in a unique context: courts across Europe – in Italy, Spain, and the UK – are ruling against gig economy companies and in favour of workers, recognising their status and granting them more rights. However, while courts in Europe are waking up to the reality of algorithmic management, these recent gains still do little to protect workers against its harms.
The picture that WIE’s research paints is one where algorithms shape the work experience of drivers, offering them limited visibility or avenues for redress when a decision about them is made. We are concerned that the employers highlighted in the report seemed to be hiding behind the decisions that opaque and automated systems make, potentially avoiding full accountability for them.
According to WIE’s report, several gig economy employers seem reluctant to fully comply with their data protection obligations. Some companies did not provide all of the data requested: WIE was unable to obtain information about how algorithms calculate the scores used to prioritise the dispatch of journeys to drivers, and some companies also failed to provide the guidance documents or location data gathered by their monitoring systems. Further, to obtain data from some companies, WIE had to go to great lengths, even filing numerous complaints with the data protection authority before the companies acted upon its requests.
These findings depict a worrying landscape: they suggest that the process of receiving data from the gig-economy platforms may often prove time consuming and resource intensive, even though the legal framework is designed to facilitate such requests and make them an easy, accessible tool for everyone to use.
It is concerning that drivers may be unable to find out what information is collected about them or how that information may later be used to make decisions, sometimes negative ones, about their employment, with little ability to seek redress. For more details about WIE’s findings, you can read their full report here
Finally, WIE’s report demonstrates that the surveillance drivers experience involves not just vast data collection but also the use of more invasive technologies. The report provides specific examples where facial recognition technology locked drivers out of their accounts due to potential identity verification failures.
How you can be involved
Worker Info Exchange’s latest report and our interviews with private hire drivers paint a worrying picture. We are concerned that Uber and other companies might not be fully transparent about the data they collect or how they use it. This situation results in “management by bots”: workers are suspended because a computer said so, and drivers have little to no opportunity for redress. Employers appear to use algorithms in a way that shifts the burden onto workers to prove their innocence, instead of management having to justify its decisions. Employers need to take responsibility for the decisions their algorithms make and be accountable for them.
This isn’t solely the fate of gig economy workers: in fact, especially with the rise of remote working during the Covid-19 pandemic, many of these practices – including the use of facial recognition – could become the reality for workers across various sectors. That’s why we are calling on employers to do better than managing by bots.
Let’s stand together and make sure no one is subjected to dehumanising management by bots.
*Via Privacy International*