
Facial Recognition, AI & Algorithms

At Keller Lenkner UK, we support clients who have experienced GDPR violations because of facial recognition software and algorithmic and automated decision-making processes.

Have you experienced harm due to automated decision making and facial recognition software? If so, we can help.

There are several ethical and privacy concerns over the use of facial recognition software – especially when it comes to profiling and automated decision-making about individuals.

In particular, women and people from BAME groups are being discriminated against by companies that use such tech. This is because leading facial-recognition software packages are much worse at identifying women and people of colour than white, male faces.

Such technology is also being used by some of the biggest companies in the gig economy sector – often to the detriment of workers. 

Under the General Data Protection Regulation (GDPR), the processing of biometric data (such as images of a person's face) and the use of automated decision-making, including profiling, are only permitted in limited, explicitly defined circumstances. Technology that discriminates against individuals and automatically makes decisions that harm them is not GDPR compliant.

At Keller Lenkner UK, we make sure our clients are compensated for any GDPR violations that impact their legal rights. If you have been harmed by facial recognition software or by algorithmic and automated decision-making processes, we can help.

Why claim GDPR violation compensation?

Hold organisations to account for failing to protect or misusing your private information.

Receive financial compensation for your losses.

Force organisations to implement better data processes.

Management by Bots Harms Workers

Today, many people are seeing increased surveillance from their employers. Monitoring systems are being used due to an increase in home-working, and big-tech companies are using dehumanising surveillance tools to track their gig-economy workers.  

Because of failures in algorithmic management tools, some gig economy workers have been locked out and left unable to work, or even fired – all through no fault of their own. And with automated processes in place, many are left without the right of appeal.  

At Keller Lenkner UK, we are standing up to these companies by taking action against the use of AI, algorithms and facial recognition software in the gig economy workplace. 

Examples of Automated Decision Making by Facial Recognition Software

Uber

Drivers working for Uber and Uber Eats must use facial identification software to access the Uber system. But some BAME drivers have claimed that the facial recognition technology is costing them their livelihoods as the software is incapable of recognising their faces.

After a failed ID test, some drivers have been threatened with termination, had their accounts frozen and been left unable to work, or even been permanently fired. The drivers also allege that the process is automated and that they are left without any right of appeal.

Similar software to that used by Uber has shown a 20.8% failure rate for darker-skinned female faces, 6% for darker-skinned males, and 0% for white men.


Facial Recognition Security Systems

Facial recognition technology can send instant alerts when “subjects of interest” enter a premises or area. However, Big Brother Watch – a British civil liberties and privacy campaigning organisation – described how one black schoolboy was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”.

START YOUR NO-WIN-NO-FEE GDPR BREACH CLAIM TODAY

Why use Keller Lenkner UK to make a data breach, GDPR violation, or cybercrime claim?

Latest news


Uber loses judgment over algorithmic firings

In February 2021, Uber lost a judgment in the Netherlands, where it had been challenged over the alleged ‘robo-firing’ of drivers. The Court of Amsterdam ordered Uber to reinstate six drivers who claimed they were unfairly terminated “by algorithmic means”, and to pay the fired drivers compensation. Uber had until March 29 to comply with the order, but failed to do so.

