There are several ethical and privacy concerns about the use of facial-recognition software, especially when it comes to profiling and automated decision-making about individuals.
In particular, women and people from BAME groups face discrimination by companies that use such technology. This is because leading facial-recognition software packages are significantly less accurate at identifying women and people of colour than at classifying white, male faces.
Such technology is also being used by some of the biggest companies in the gig economy sector – often to the detriment of workers.
Under the General Data Protection Regulation (GDPR), the processing of biometric data (such as images of a person’s face) is permitted only in strictly limited circumstances (Article 9), and individuals have the right not to be subject to decisions based solely on automated processing, including profiling (Article 22). By discriminating against individuals and automatically making decisions that harm them, such technology is not GDPR compliant.
At Keller Lenkner UK, we make sure our clients are compensated for any GDPR violations that impact their legal rights. So if you have been harmed by facial-recognition software or by algorithmic and automated decision-making processes, we can help.