Keep an eye out for xenophobic, privacy-breaching facial recognition technology

09 Dec 2020

Automated Facial Recognition (“AFR”) has become an increasingly contentious and worrisome technological feature of today’s globalised information environment. There have, however, been some encouraging signs of attempts to rein in the deployment of AFR, and some successful outcomes to litigation aimed at curbing its implementation.

Last week saw the tabling of a bill in Massachusetts, US, to restrict the use of facial recognition by public agencies. Should the bill be signed into law, Massachusetts would become the first US state to pass such legislation, joining a list of US cities which have already implemented similar bans. This would not be a blanket ban on AFR, however: the legislation being considered in Massachusetts would require the local police to obtain a warrant before running citizens’ faces through a database. This represents an objective and fair ‘line in the sand’ which many moderates would arguably endorse.

Critics of AFR have applauded the proposed legislation. Criticism of AFR falls broadly into two camps: first, the fear that the technology will be used to stifle civil liberties, and second, the argument that it is inherently discriminatory, and that its implementation could be both racist and sexist. The concerns voiced by critics of AFR in many ways echo the arguments considered and upheld by the Court of Appeal earlier this year in its judgment in R (on the application of Edward Bridges) v The Chief Constable of South Wales Police (where the Secretary of State for the Home Department was an interested party).

The Court considered and endorsed the argument that the use of AFR by South Wales Police breached Article 8 of the European Convention on Human Rights (“ECHR”), which provides for a right to respect for one’s private and family life. The Court held that AFR, like fingerprints and DNA, “enables the extraction of unique information and identifiers about an individual allowing his or her identification with precision in a wide range of circumstances”. Given the private nature of such information, the Court ruled that consent was essential before it could be extracted. Although the police attempted to inform individuals that AFR was in operation in certain areas and at specific times, the Court of Appeal deemed these attempts inadequate: there was no way that everyone potentially affected could have been properly informed, nor could they have given adequate consent to the capture of the private information the police were collecting.

The Court also considered the bias carried by AFR technology, which has been shown to misidentify faces belonging to a number of ethnic groups and has also been documented as failing to accurately identify women. This racial and gender bias has led many activists to criticise the technology and warn against its implementation. Given this inherent bias, and in light of the proven discriminatory nature of predictive policing, AFR must be treated with extreme caution.

In the US, civil rights groups are fighting back against a government that has already recorded the facial features of over 117 million people. Here in the UK, civil rights groups are challenging a Home Office which is sponsoring and seeking to implement AFR technology. With profit-hungry tech corporations keen to sell the data they hold on their users and a Home Office keen to implement AFR, it is more important than ever that the fight against this discriminatory and anti-privacy technology is brought before the courts and drawn to the public’s attention.


The information in this blog is for general information purposes only and does not purport to be comprehensive or to provide legal advice. Whilst every effort is made to ensure the information and law is current as of the date of publication, it should be stressed that, due to the passage of time, this does not necessarily reflect the present legal position. Gherson accepts no responsibility for loss which may arise from accessing or reliance on information contained in this blog. For formal advice on the current law please don’t hesitate to contact Gherson. Legal advice is only provided pursuant to a written agreement, identified as such, and signed by the client and by or on behalf of Gherson.

©Gherson 2020