Facial Recognition Tech is “Not Yet Fit for Purpose”

Automated Facial Recognition

The deployment of facial recognition software on British streets is once again raising concerns over privacy rights, with statistics suggesting that the intrusive tech isn’t as effective as police believe it to be. 

A freedom of information request has shown that facial recognition software deployed by the Metropolitan Police Force produces an alarming number of false positives.

The Met Police’s systems produced 104 alerts based on facial recognition scans, of which only two were confirmed as positive matches, equating to a false positive rate of 98%. Britain’s Biometrics Commissioner has deemed the technology “not yet fit for purpose,” and a number of privacy watchdogs have called for deployments of the technology to be halted.
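For clarity, the 98% figure follows directly from the numbers released under the freedom of information request: 102 of the 104 alerts were not confirmed matches. A minimal sketch of the arithmetic in Python, assuming the false positive rate is simply the share of alerts that were not confirmed (variable names are illustrative only):

    # Figures from the freedom of information request reported above.
    total_alerts = 104
    confirmed_matches = 2

    # False positives: alerts that did not turn out to be genuine matches.
    false_positives = total_alerts - confirmed_matches      # 102
    false_positive_rate = false_positives / total_alerts    # ~0.98

    print(f"False positive rate: {false_positive_rate:.0%}")  # -> 98%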

The Met Police insisted that it did not consider inaccurate matches to be false positives, because alerts – which occur when the recognition software detects a ‘positive’ ID – were checked a second time.

False Positives

A false positive occurs when facial recognition software incorrectly matches a person against a police watch list. Operators review the initial alert and are then faced with a decision: disregard the alert (which police say happens in the majority of cases) or dispatch an intervention team to intercept the person(s). Officers on the ground are then in a position to establish whether the match is correct or incorrect.

If an incorrect match has been made, officers are expected to explain the situation to the individual, provide them with a “Fair Processing Notice” and allow them to see the equipment.
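To make the process easier to follow, here is a hypothetical sketch in Python of the decision points described above; it is not police software, and the function and parameter names are invented for illustration:

    # Hypothetical sketch of the operator workflow described in this article.
    def handle_alert(operator_trusts_alert: bool, match_confirmed_on_ground: bool) -> str:
        if not operator_trusts_alert:
            # Police say most alerts are simply disregarded at this stage.
            return "alert disregarded"
        # Otherwise an intervention team is dispatched to intercept the person.
        if match_confirmed_on_ground:
            return "match confirmed by officers on the ground"
        # Incorrect match: explain the situation, issue a Fair Processing Notice
        # and allow the individual to see the equipment.
        return "false positive: Fair Processing Notice issued"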

According to South Wales Police, “no facial recognition system is 100% accurate under all conditions,” and “technical issues are normal to all face recognition systems which means false positives will continue to be a common problem for the foreseeable future.”

It is this that privacy rights watchdogs and campaigners take issue with: the deployment of a technology that is not yet fully developed on British streets and at public events.

Controversial Deployments

The deployment of facial recognition software by UK police services has been met with criticism from privacy rights groups and has come under intense scrutiny in the House of Lords. The software is currently being trialled by the Metropolitan and South Wales Police forces, and has been used at large public events such as the Notting Hill Carnival, Remembrance Day services and the 2017 Champions League Final in Cardiff.

South Wales Police has been leading trials of facial recognition software in the UK, having deployed the technology 15 times since June 2017 and returned more than 2,400 false positives over that period. The force claims the majority of those came during the Champions League Final, and that fewer than 10% of over 230 alerts were positive matches.

According to South Wales Police, the use of its automated facial recognition technology, Identify, has resulted in 2,000 positive matches and over 450 arrests in the past nine months. Successful convictions include a six-year prison sentence for robbery and four and a half years’ imprisonment for burglary. Another system operated by South Wales Police, Locate, has helped police arrest 24 people since June 2017; this real-time software is the primary source of the false positive figures.

While these statistics suggest the technology is producing results, the issue of oversight and regulation is still being raised. Big Brother Watch has called for trials of facial recognition technology to be halted, while in March members of the House of Lords slammed the deployments as a step too far for a nation already under heavy surveillance.

A number of peers claimed the government has neither the policy ideas nor the legislation in place to implement facial recognition technology in a manner which respects the privacy rights of the British public. South Wales Police say that privacy rights are taken into account when deploying technology such as this, and that the necessary checks and balances are in place to avoid misuse or abuse.

On its website, South Wales Police said:

“Throughout the trial South Wales Police has been very cognisant of concerns about privacy and we have built in checks and balances into our methodology to make sure our approach is justified and balanced. We have had detailed discussions and consultation with all interested regulatory partners.”


