Ex-Uber driver takes legal action over ‘racist’ face-recognition software
An Uber driver who lost his job when automated face-scanning software failed to recognise him is accusing the firm of indirect race discrimination in a legal test case.
The black driver, who worked on the Uber platform from 2016 until April 2021, has filed an employment tribunal claim alleging his account was illegally deactivated when facial-verification software used to log drivers on to the ride-hailing app decided he was not who he said he was.
The Independent Workers’ Union of Great Britain (IWGB), which is backing the action, claimed at least 35 other drivers had had their registration with Uber terminated as a result of alleged mistakes with the software since the start of the pandemic. It is calling for Uber to scrap the “racist algorithm” and reinstate terminated drivers.
Uber said it “strongly refutes the completely unfounded claims” and that it was “committed to fighting racism and being a champion for equality – both inside and outside our company”. A spokesperson said the checks were “designed to protect the safety and security of everyone who uses the app by ensuring the correct driver is using their account”. Drivers can choose human verification of their picture, she said, and when technology is chosen “there is always a minimum of two human expert reviews prior to any decision to remove a driver”.
Uber has used the software since April 2020. In 2019 Microsoft, which makes the software, conceded facial recognition software did not work as well for people of colour and could fail to recognise them.
Studies of several facial recognition software packages have shown higher error rates when recognising people with darker skin than lighter-skinned people, although Microsoft and others have been improving performance. Uber said its software did not rely on scanning large numbers of faces, a practice that has been blamed for introducing error. Rather, it verified an already uploaded picture of the driver against their freshly submitted selfie.
In London, nine out of 10 private hire drivers are black or black British, Asian or Asian British, or of mixed race, according to a recent survey by Transport for London (TfL).
“Uber’s continued use of a facial recognition algorithm that is ineffective on people of colour is discriminatory,” said Henry Chango Lopez, general secretary of the IWGB. “Hundreds of drivers and couriers who served through the pandemic have lost their jobs without any due process or evidence of wrongdoing.”
A Nigerian driver, who worked on the Uber Eats platform in Manchester until he was locked out in March after several failed attempts to pass the facial verification check, said his family had faced “serious suffering” as a result.
Abiodun Ogunyemi, a married father of three, said he had run up debts so large he could not afford his son’s bus fare to get to school. He says the photo on Uber’s records did not feature the longer hair or beard he currently has, but he has a distinctive scar over one eye and the rest of his face is visible. “I feel the algorithm is discriminatory to people of colour,” he said. “I know about five black people the same thing has happened to.”
Uber said anyone removed from the platform could appeal against the decision, with an additional human review.
On 10 April the driver in the test case, who asked not to be named, tried to log on for work by submitting a photo through the app, but received a message from Uber saying he had failed to verify his identity, and he was locked out of the system for 24 hours. He submitted a second photo after that period, which did not work either.
According to his claim, four days later his account was deactivated and he was sent a message stating: “Our team conducted a thorough investigation and the decision to end the partnership has been made on a permanent basis. The matter is not subject to further review.”
His case is also being backed by the Black Lives Matter organisation, which said: “The gig economy, which already creates immense precarity for Black key workers, is now further exacerbated by this software.”
Microsoft declined to comment on an ongoing legal case.