Automated Decision Making by Facial Recognition Software

At Keller Lenkner UK, we support clients who have experienced GDPR violations because of facial recognition software and algorithmic and automated decision-making processes.

Have you experienced harm due to automated decision making and facial recognition software? If so, we can help.

There are several ethical and privacy concerns over the use of facial recognition software – especially when it comes to profiling and automated decision-making about individuals.

In particular, women and people from Black, Asian and minority ethnic (BAME) backgrounds are being discriminated against by companies that use this technology. This is because leading facial-recognition software packages are far worse at identifying women and people of colour than at identifying white, male faces.

Under the General Data Protection Regulation (GDPR), the processing of biometric data (such as images of a person’s face) and the use of automated decision-making, including profiling, are only permitted in limited, explicitly defined circumstances. Technology that discriminates against individuals and automatically makes decisions that harm them is therefore not GDPR compliant.

At Keller Lenkner UK, we make sure our clients are compensated for any GDPR violations that impact their legal rights. And if you have been harmed because of facial recognition software and algorithmic and automated decision-making processes, we can help.

Why claim GDPR violation compensation?

Hold organisations to account for failing to protect or misusing your private information.

Receive financial compensation for your losses.

Force organisations to implement better data processes.

Examples of Automated Decision Making by Facial Recognition Software


Uber and Uber Eats Drivers

Drivers working for Uber and Uber Eats must use facial identification software to access the Uber system. But some BAME drivers have claimed that the facial recognition technology is costing them their livelihoods because the software is incapable of recognising their faces.

After a failed ID check, some drivers have been threatened with termination, had their accounts frozen (leaving them unable to work), or even been permanently dismissed. The drivers also allege that the process is fully automated and that they are left without any right to appeal.

Software similar to that used by Uber has shown a 20.8% failure rate for darker-skinned female faces, 6% for darker-skinned male faces, and 0% for white male faces.


Facial Recognition Security Systems

Facial recognition technology can send instant alerts when “subjects of interest” enter premises or an area. However, Big Brother Watch – a British civil liberties and privacy campaigning organisation – described how one black schoolboy was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”.


Latest news
Latest news

Uber loses judgment over algorithmic firings

In February 2021, Uber lost a judgment in the Netherlands, where it was challenged over the alleged ‘robo-firings’ of drivers. The Court of Amsterdam ordered Uber to reinstate six drivers who claim they were unfairly terminated “by algorithmic means”, and to pay the dismissed drivers compensation. Uber had until 29 March to comply with the order, but failed to do so.