Elizabeth Denham, the UK Information Commissioner, has said she is “deeply concerned” about the use of live facial recognition (LFR). In a blog post addressing privacy worries over the use of LFR technology in public places, she said that when “technology and its algorithms are used to scan people’s faces in real time and in more public contexts, the risks to people’s privacy increases”. She added that she was concerned LFR might be used “inappropriately, excessively or even recklessly”.
“I am deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly. When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant.
“We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take.
“Unlike CCTV, LFR and its algorithms can automatically identify who you are and infer sensitive details about you. It can be used to instantly profile you to serve up personalised adverts or match your image against known shoplifters as you do your weekly grocery shop.
“In future, there’s the potential to overlay CCTV cameras with LFR, and even to combine it with social media data or other ‘big data’ systems – LFR is supercharged CCTV.
“It is not my role to endorse or ban a technology but, while this technology is developing and not widely deployed, we have an opportunity to ensure it does not expand without due regard for data protection.”
Elizabeth Denham, UK Information Commissioner
New guidance on LFR technology
The post came as the ICO published new guidance for companies and public organisations using LFR technology.
The advice, entitled ‘The use of live facial recognition technology in public places’, explains how data protection law applies to this complex and novel type of data processing. It comes with the revelation that, of the six LFR systems the ICO has investigated, none was fully compliant with data protection law.
The problem with facial recognition software
There are several ethical and privacy concerns over the use of facial recognition software – especially when it comes to profiling and automated decision-making about individuals.
For example, LFR technology can send instant alerts when “subjects of interest” enter premises or an area. Big Brother Watch – a British civil liberties and privacy campaigning organisation – described how one black schoolboy was “swooped by four officers, put up against a wall, fingerprinted, phone taken, before police realised the face recognition had got the wrong guy”.
Indeed, women and people from BAME groups are routinely discriminated against by companies that use such technology. This is because leading facial recognition software packages are much worse at identifying women and people of colour than at identifying white, male faces.
At Keller Lenkner UK, we make sure our clients are compensated for any GDPR violations that impact their legal rights. And if you have been harmed because of facial recognition software and algorithmic and automated decision-making processes, we can help.
For example, we are supporting Uber drivers in England & Wales who have GDPR concerns over Uber’s facial recognition software, algorithmic accountability, and automated decision-making processes.
Commenting on the use of LFR technology, our head of data breach, Kingsley Hayes, said:
“Under the General Data Protection Regulation (GDPR), the processing of biometric data such as images of a person’s face, and the use of automated decision-making, including profiling, are only allowed in very explicit circumstances. By discriminating against individuals and automatically making decisions that harm them, such technology is not GDPR compliant.
“We welcome the opinion of the Information Commissioner on this matter and hope that the new guidance will help to enforce more robust data protection where LFR systems are being used.”