Uber has been accused of using automated algorithms to fire drivers, a practice which, if true, would be a gross violation of the GDPR.
The App Drivers and Couriers Union (ADCU) is taking Uber to court with the goal of overturning the algorithmic terminations that have cost over 1,000 drivers their jobs.
Uber claims that its termination decisions are made by humans, but if this is proved false, Uber will have to explain why it did not give drivers the chance to object to automated decisions.
Anton Ekker, the privacy lawyer heading the case, said: “If it is automated decision-making, then the GDPR says they must have legal grounds to use such technology, and they must give drivers the possibility to object to an automated decision, which they clearly did not do.”
His intention is to seek a ruling from the Dutch courts, which, if successful, would then make it possible to bring a class action lawsuit against Uber.
The alleged infraction concerns Article 22 of the EU GDPR:
1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
2. Paragraph 1 shall not apply if the decision:
(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;
(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
(c) is based on the data subject’s explicit consent.
3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.
4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place.
Paragraph 3 makes clear that Uber is required to allow the data subject (the driver) to obtain human intervention when contesting a decision.
The driver experience
An anonymous former Uber driver said that he had been driving with Uber for about two years and had a customer rating of 4.94 when he was suddenly terminated from the app.
“The day it happened, I went to work and on my app, it said I wasn’t allowed to log in. The app said to call customer support. I rang customer support and I was told that my account was deactivated because I had been engaging in fraudulent activities.”
He claimed to have contacted Uber more than 50 times without ever being told what he had done that was fraudulent.
Customer Support told him that a “specialised team” was dealing with the issue, but he never received any information from said team.
“I was pleading with them in my emails repeatedly. I even asked if I could have a face-to-face meeting with the specialised team. I was willing to travel to another country to meet them. I have a family to feed. I’m not a fraudster or a criminal.”
The ADCU’s general secretary, James Farrar, further explained the problem faced by drivers on the receiving end of these unexplained Uber terminations:
“For any private hire operator in London, if they fire someone, there is a requirement where they have to report the driver to Transport for London (TfL). This is putting drivers in a Kafkaesque situation where they may be called in by TfL, they’re given 14 days to explain the situation and why they should keep their licence. Our drivers are in a terrible position because they don’t know what the issue is, Uber hasn’t told them.”
Farrar also claims that Uber, citing security reasons, would not provide TfL with additional details as to why drivers were dismissed.
An Uber spokeswoman stated that the company does provide “personal data and information that individuals are entitled to,” further adding:
“We will give explanations when we cannot provide certain data, such as when it doesn’t exist or disclosing it would infringe on the rights of another person under GDPR. As part of our regular processes, the drivers in this case were only deactivated after manual reviews by our specialist team.”
Professor Lilian Edwards, chair of Law, Innovation and Society at Newcastle University, claimed that ADCU’s legal challenge could set a precedent with the European Court of Justice.
“This is probably the biggest case we’ve had so far on Article 22 of the GDPR that’s ever gotten to the courts. Article 22 is really important because this is the provision that arguably gives you the right to an explanation about why an automated decision was made about you. There’s been huge debate for years about whether the law could give people some rights over it, and this is a way for us to get some control over it and to be able to challenge it if it’s wrong. So this is really big news.”