Data Protection and Facial Recognition
Patricia Jones
14/08/2019

The Mayor of London has recently written to the owners of a 67-acre site in Kings Cross, London, regarding their use of facial recognition technology.

Live facial recognition (LFR) involves the processing of personal data: the biometric data of large numbers of people is captured and screened against a database to identify people of interest. It has significant data protection and privacy implications. This is a high-priority area for the ICO, which is currently investigating the use of LFR by the police and has intervened in R (Bridges) v Chief Constable of South Wales Police, a case concerning whether police use of LFR is lawful. The ICO has indicated that once judgment in that case is given, it will report on the findings of its investigation and set out what action needs to be taken. This will also have implications for private companies.

It is not just the police that are the subject of the ICO’s focus; it is also considering the use of LFR in public spaces by private companies, as at Kings Cross. The Information Commissioner has issued guidance for police forces considering LFR, which includes carrying out data protection impact assessments and ensuring that the algorithms within the software do not treat individuals unfairly on the basis of race or sex. The same considerations apply to private companies, which will also have to identify the lawful basis on which they rely to process the personal data. The ICO has stated that it will consider regulatory action where it finds non-compliance. Given that the ICO has the power to fine companies up to 4% of their worldwide annual turnover, and that this is an area under close scrutiny, private companies using LFR, or considering its use, must ensure that they meet their data protection compliance obligations.
