Facial Recognition Technology: Dystopia or Hysteria?

UK Facial Recognition

The use of facial recognition technology is subject to intense debate, with civil rights groups branding the technology intrusive and dangerous. 

Earlier this month, civil rights groups heavily criticised the Metropolitan Police Service over a facial recognition pilot scheme on the streets of London.

This trial, campaigners said, was deeply disturbing and intrusive to passers-by. While signage was posted in the vicinity of the vehicle from which the technology was operating, some members of the public appeared concerned by its presence on a British street. One passer-by went so far as to cover his face and was handed a £90 fine following a heated confrontation with police at the scene.

Writing for TIME this month, Big Brother Watch director Silkie Carlo claimed the UK is “adopting surveillance technologies in a style more typical of China than of the West”.

Carlo added: “For centuries, the UK and US have entrenched protections for citizens from arbitrary state interference – we expect the state to identify itself to us, not us to them. We expect state agencies to show a warrant if our privacy is to be invaded. But with live facial recognition, these standards are being surreptitiously occluded under the banner of technological ‘innovation’”.

Bold comments, undeniably. But are privacy rights groups justified in comparing the UK’s adoption of facial recognition technology with China’s?

China’s adoption of this tech is on a scale unparalleled anywhere else on Earth. Chinese authorities are believed to have around 200 million facial recognition cameras installed across the country, used to target citizens for even minor offences.

Furthermore, police services in China have begun the rollout of glasses fitted with the technology. A 2018 report by Reuters revealed that the glasses had been used by police to catch people travelling under false identities. The success of this initial pilot scheme led to an expansion of the trials to police forces operating in and around Beijing.

In the UK, the rollout of facial recognition technology hasn’t quite reached this scale. However, privacy rights advocates insist that the slippery slope of adoption is a critical risk to privacy, as well as the fundamental pillars of British democracy.

Opponents frequently lambaste the technology, dubbing it ‘Orwellian’ and ‘dystopian’ and deploying all the common hyperbole that surrounds the emergence of new technologies, especially where government or police are involved. Meanwhile, law enforcement suggests it could be a crucial tool for fighting crime in years to come. The Home Office has previously described facial recognition as an “invaluable tool” in assisting police and intelligence services.

Whichever side of the argument one falls on, the reality is that facial recognition technology is here to stay, and the debate will continue to permeate the airwaves. The questions being asked over the legality of the technology’s use, however, could define how society functions in years to come.

This technology, privacy rights groups argue, is poorly regulated and requires greater scrutiny and testing. Technology researcher and broadcaster, Dr Stephanie Hare, believes the lack of a clear-cut ethical framework is harming citizens both in the UK and further afield.

The fundamental problems with this tech are its intrusive nature, the scale of data being gathered and the rapid evolution of the underlying technology fuelling this growth – evolution which is largely devoid of oversight, regulation and scrutiny.

“Facial recognition technology is not just about scanning your face. It is about linking you to all other data about you,” Hare explains. “Data held about you by the state, and the data held about you by data brokers and companies.”

“This is incredibly powerful and invasive,” she continues. “Think about it: the police must get a warrant to search our homes, our businesses, our vehicles and our devices, precisely because this is such an exceptional intrusion of our privacy. Our biometrics are even more private data and they deserve protections that are articulated and codified in law.”

Hare asserts that the UK has a “long and rich tradition” of protecting human rights and civil liberties, and that, with regard to facial recognition technology – and biometrics as a whole – it has an opportunity to lead the debate and discussion surrounding this technology.

Increasing Activity

Facial recognition is not a new technology suddenly erupting onto the scene and causing havoc in public life. It has been deployed at the Notting Hill Carnival on several occasions, much to the chagrin of privacy rights activists. At the 2017 Notting Hill Carnival, the Met Police’s system had a staggering false-positive rate in the region of 98%: on more than 100 occasions, officers wrongly believed the system had spotted a suspected criminal.

When Cardiff hosted the Champions League final in 2017, South Wales Police (SWP) deployed automated facial recognition technology. The force was later obliged to defend its use of the technology after it emerged that more than 2,000 people had been wrongly identified as potential criminals.

These repeated failures, Hare asserts, will continue to damage relations between the public and the police. Relations could be further strained when one considers the impact on people of colour: facial recognition has been found to be more likely to misidentify women and people of colour.

“The United States and the United Kingdom have demonstrated repeatedly that facial recognition technology is best at identifying white men, who coincidentally make up the demographic of people who design this technology. It is worst at identifying people of colour – especially women of colour,” she says.

“It is unacceptable to use technology on the whole population that does not work – particularly for something that is so important as identifying someone as a suspected criminal.”

This blanket surveillance has prompted a backlash from citizens in both the US and the UK. Last week, a landmark court case was launched by a man in Wales to challenge the use of facial recognition technology in Cardiff. Ed Bridges crowdfunded the action against SWP, alleging that its use of the technology on him was unlawful and violated his privacy – a claim strongly supported by civil rights group Liberty.

“The police started using this technology against me and thousands of other people in my area without warning or consultation. It’s hard to see how the police could possibly justify such a disproportionate use of such an intrusive surveillance tool like this,” Bridges commented.

Rights group Liberty, which represents Bridges, claims that police have scanned his face at least twice: once while Christmas shopping and again during an anti-arms protest.

These concerns aren’t limited to the UK, either. Last week, the city of San Francisco announced a ban on the use of this technology by city agencies. The city’s board of supervisors, which acts as its legislative body, prohibited the use of facial recognition on the grounds that it is highly intrusive to local residents and could pose a danger to democratic freedoms.

Matthew Rice, Scotland Director at Open Rights Group, believes this move could set a precedent for cities around the world. At the very least, the ban illustrates the different routes being taken on the issue and, crucially, shows that it is not a blanket prohibition.

“What’s really interesting is that they’re not saying we can never use facial recognition,” he says. “What they’re saying is that there is a narrow set of circumstances in which it is appropriate.”

“To me, it’s similar to communications surveillance and capturing messages. The way that is structured is that it’s illegal to intercept someone’s communications unless it’s for these very particular purposes and following strict processes,” he explains.

“I think that could be the path of articulating how you can use facial recognition. You cannot just put it on because it’s useful, it has to follow these very particular processes and clearly articulate why it’s being used and for what purpose,” Rice adds.

Technologies That Work

In an era of rapid technological evolution, police services across the UK are assessing the value of emerging technologies to enhance their capabilities. This rapid change, Hare suggests, has been compounded by increased government pressure on police services, which has, in part, complicated the ethical debate.

“The police have an incredibly challenging job, particularly here in the UK where over a decade of government-led austerity policies have seen cuts to the police in terms of officers and resources,” she says. “They must protect us against terrorism in the real world and online.”

Rice echoes her thoughts on police adoption of emerging technologies and suggests that evolving criminal activity is also fuelling the race to adopt new tools.

“It’s clear that the police are trying to keep up and stay on top of new crimes, or new ways that crimes can be recorded,” he says. “They’re trying to stay on top of these things so they don’t fall behind.”

Police should be equipped with tools and technologies that work, Hare says – tools that can be used with confidence and will help to prevent and prosecute crime. But facial recognition technology, she argues, is not a silver bullet for the lingering problems confronting police services across the country.

“Giving them facial recognition technology that misidentifies people at such an alarmingly high rate, and which has resulted in tens of thousands of innocent people having their facial images stored in the police national database alongside those of convicted criminals, will undermine the public’s trust in the police,” she says.

“That’s unacceptable, and it’s also really unfair to the police – we would not send them to work with bulletproof vests that did not function properly or other inadequate tools,” Hare adds, noting that the same considerations should be made when discussing the deployment of facial recognition or other emerging technologies.


