
Invasive Council Algorithms “Discriminating Against Britain’s Poor”

Ross Kelly

Privacy rights campaigners have warned that councils are using invasive software to “mass profile” Brits claiming benefits.

Research published by Big Brother Watch shows that councils across Britain are conducting mass profiling of welfare and social care recipients to “predict” certain behaviours.

According to the privacy rights group, the profiling aims to predict the likelihood of benefit fraud, non-payment of rent and even major life events.

Findings in the report are the result of a long-term investigation which involved over 2,000 Freedom of Information requests to local authorities in Scotland, England and Wales.

The group said that councils often use automated tools without the knowledge of residents and that most of the algorithms uncovered were “unevidenced, incredibly invasive and likely discriminatory”.

“The scale of the profiling, mass data gathering and digital surveillance that millions of people are unwittingly subjected to is truly shocking,” said Jake Hurfurt, head of research and investigations at Big Brother Watch.

“We are deeply concerned that these risk scoring algorithms could be disadvantaging and discriminating against Britain’s poor.”

Invasive council algorithms

Research conducted by the privacy group found that an algorithm developed by tech company Xantura is used by two London councils.

The company claims its Covid OneView system can predict how the coronavirus pandemic may negatively impact residents, or even whether they are likely to break self-isolation rules.

This system is built on thousands of pieces of data held by UK councils, including personal information on people’s sex lives and anger management issues, and whether they own a dog deemed dangerous.

Similarly, algorithms intended to assign “fraud risk scores” to benefit claimants are used by more than 50 councils across the country. More than half a million applicants were processed by these tools before they could access housing benefit or council tax support, the research found.

Documents obtained by Big Brother Watch showed that 30 local authorities have stopped using fraud risk tools in the past four years, largely because the software failed to meet expectations.

Automated benefit cuts

Up to three million housing benefit claimants are profiled by the DWP, the report found, with algorithms predicting which claimants are most likely to require payment cuts.

In fact, the DWP sets the algorithms targets to classify up to a quarter of claimants as ‘medium risk’ and 20% as ‘high risk’.

Sarah Willocks, head of external affairs at Turn2Us, said the research highlighted an “incredibly concerning trend” where people on low incomes are treated with “suspicion, bias and discrimination”.

“A decade of cuts, caps and freezes to our social security system has left cash-strapped councils relying on outsourced algorithms,” she said.

“We urge both the DWP and local authorities to review the findings of this report and ask themselves whether this is an ethical or even practical way to go about their work,” Willocks added.

‘Future criminality’

Perhaps the most concerning finding revealed by Big Brother Watch is the use of automated software to assess the risk of “future criminality” among schoolchildren.

The London Borough of Hillingdon’s ‘Project AXIS’ gathered data from police, schools, social care, missing persons, care homes and even social media without residents’ knowledge.

The council then profiled schoolchildren based on a range of variables, claiming “no piece of information is too small” for the database.

Project AXIS bears similarities to the highly controversial Gangs Matrix database run by the Metropolitan Police Service.

An investigation by the Information Commissioner’s Office found the database was operated unlawfully and held data on people who were unconnected to any gang activity. The database also disproportionately profiled younger black men, the ICO ruled.


Big Brother Watch has called for greater transparency over the use of algorithms, insisting that “secretive systems of digital suspicion should not be used behind closed doors”.

In particular, campaigners have recommended the establishment of a public register of algorithms which authorities can draw upon to conduct privacy and equality assessments before using predictive tools.

This could mitigate the risks of discrimination and the impact upon residents. Assessments such as these, the group found, were rarely conducted.

Lord Clement-Jones, chair of the Select Committee on Artificial Intelligence, said the use of algorithmic decision-making systems is “deeply alarming” and echoed the group’s calls for greater transparency.

“The Government, as Big Brother Watch recommends, need to urgently strengthen regulation over these algorithmic systems, introduce a public register and assert the ethical values of transparency, privacy, freedom from bias and accountability that should govern their use,” he said.

Ross Kelly

Staff Writer
