The Home Office’s pledge to invest more in facial recognition technology for English police forces will see a live trial at the Notting Hill Carnival in London this weekend, despite concerns from civil liberties groups. Among them is the Open Rights Group, which has claimed that the technology presents ‘unique threats’ to human rights.
If successful in its trial period, the recognition system could be used by police to automatically match people’s faces against biometric data in a database of more than 16 million faces – around a quarter of the UK’s population.
Systems labelled ‘facial search’ are already in use across the UK, according to a report published early last year by Her Majesty’s Inspectorate of Constabulary in Scotland (HMICS). Facial search is less precise than the newer recognition systems: when scanning faces, it can only return a list of potential image matches.
‘Lack of transparency’
Civil liberties think-tank the Open Rights Group has condemned the ‘unacceptable lack of transparency’ surrounding experimental recognition tech. In a statement, the Group said: “We do not know, for example, in which public places automated facial recognition is in use; where and how the images captured are stored; how long the images are stored for; how and when the images are deleted; what databases the images are being matched against; … whether any or all of the footage, images, or other data is shared with a third party, … whom the software provider is and how much is paid for the service; how accurate the software is; and whether the software has been tested for accuracy biases.”
Police forces in England and Wales have argued that the newer system will be 95% accurate. Campaign groups have countered that, even if that figure holds, it still implies 50 errors for every 1,000 faces scanned – too large a margin of error. It is unclear whether the new recognition technology will make it to Scotland, but the groundwork is there: Police Scotland’s facial search technologies are already connected to the current UK Police National Database (PND), which contains around 12 million faces.
Facial recognition in Scotland
As of August 2017, the ethical implications of bringing recognition technology to Scotland are being considered. An Independent Advisory Group (IAG) has been created to advise on the future of biometrics in crime-fighting north of the border. IAGs advise Police Scotland on topics which are ‘under-represented’ in ‘normal discussions’ of policing. This new group, chaired by John Scott QC, will consider ‘how biometric data is captured, used, stored and disposed of’.
Mr Scott said: “This is a timely review in an important and fast-developing area. Scottish rules on retention of biometric data have been the subject of positive comment elsewhere, notably from the European Court of Human Rights when it looked at equivalent English rules in 2008. It is appropriate to consider if we are still getting the balance right, especially as there are new types of biometric data being used by our police, courts, and prosecutors.
“I look forward to working with the expert advisory group which includes key individuals, practitioners, and academics, in policing, prosecution, human rights, ethics, and data protection.
“In addition to the use and retention of facial images, we will look at questions which may arise with developing types of biometric data in the hope that we can establish principles informed by relevant ethical and human rights considerations to inform the delicate balancing exercise involved.”
Scotland’s Justice Secretary Michael Matheson said: “At a time when police use of biometric and related technologies is increasing, this work aims to bring certainty to and maintain public confidence in police use of this data to investigate crime and protect the public.
“The group will provide expert advice taking account of the HMICS recommendations on use of facial search technology, making sure we strike the right balance between safeguarding the public and the rights of individuals when we decide how biometric data should be used in future.”
An infringement of rights?
The ORG’s open letter to the Metropolitan Police urged the force to abandon the trials set for this weekend. The ORG claims that the technology is also untested for demographic biases, and that its use at the Notting Hill Carnival – which specifically celebrates the British African Caribbean community – could lead to discriminatory policing.
The ORG said: “If the Met were to repeat use of automated facial recognition at Notting Hill Carnival, this would demonstrate a disregard for democratic scrutiny, a disavowal of the Met’s human rights obligations, and indifference to the serious risk of discrimination posed by this technology.”
“This is not policing by consent. Automated facial recognition is a technology that presents unique threats to human rights and civil liberties. The privacy concerns that this technology gives rise to are on a par with those associated with other forms of biometric surveillance such as DNA databases and ID cards. It is imperative that a serious and meaningful conversation is had involving parliament, civil society and law enforcement regarding plans for this technology before it is trialled or deployed and risks unduly interfering with individuals’ human rights.”
It has also been revealed that armed undercover officers will mingle with the crowd at the Carnival in an effort to prevent terrorism.