NSPCC Urges Facebook Not to Encrypt Children’s Messages
Facebook would “lose the ability” to detect grooming on its site if it encrypts children’s messenger accounts, says the NSPCC.
The NSPCC, one of the UK’s leading children’s charities, has warned that Facebook’s plan to encrypt its messaging services will put children at greater risk of being groomed by paedophiles.
The charity said the changes should not go ahead unless the tech giant can prove the move will not put children in danger.
According to the NSPCC, Facebook would be breaking the new statutory duty of care regulations that the Government is looking to impose on global tech firms. Speaking to the Telegraph, Professor Hany Farid, the inventor of software able to detect and block millions of abuse images, said Facebook’s encryption plans were “spectacularly harmful for children” and “morally reprehensible”.
Facebook’s decision to encrypt all its messaging services, which was announced in April, was met with criticism due to safety implications. Facebook boss Mark Zuckerberg said that it “prevents anyone, including even us, from being able to see what you share on our services”.
Home Secretary Priti Patel said the company’s encryption plans could seriously hinder any efforts to purge social media of child abuse and terrorist content. Earlier this year, the Government outlined plans in a white paper to impose a legal duty of care on tech companies to protect their users from harm.
In response to the white paper, the charity said the duty of care measure could be “feasibly met” if messages to and from children’s accounts were not encrypted.
Andy Burrows, the NSPCC’s head of child safety online policy, said: “You absolutely lose the ability to detect what is going on on your site (with encryption) and we know from our own research that Facebook and Facebook-owned apps are the predominant sites where (grooming) offences are being recorded.
“It seems curious at best that a platform, looking at the extent of risks it has on its site, would choose to do something that will at best actively frustrate the ability to see what is going on and protect child users.
“That’s why we have called in the white paper response for children and young people’s accounts not to be encrypted unless and until a platform can demonstrate that in moving to encryption it doesn’t compromise the duty of care.”
PhotoDNA, which Farid developed with Microsoft in 2008, assigns every known abuse image a unique signature to prevent it from being re-uploaded. It was later released to other companies licence-free and has become an industry standard.
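The matching workflow described above can be sketched in a few lines of Python. This is only an illustration: PhotoDNA itself is proprietary and uses a robust perceptual hash that survives resizing and re-encoding, whereas the cryptographic hash below is a simplified stand-in, and all function names are hypothetical.

```python
import hashlib

# Hypothetical blocklist of signatures for known abuse images.
known_signatures = set()

def signature(image_bytes: bytes) -> str:
    """Compute a signature for an image.

    A plain SHA-256 digest is used here as a stand-in; PhotoDNA uses a
    perceptual hash so that altered copies of an image still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def register_known_image(image_bytes: bytes) -> None:
    """Add a known image's signature to the blocklist."""
    known_signatures.add(signature(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Permit an upload only if its signature is not on the blocklist."""
    return signature(image_bytes) not in known_signatures
```

The point the NSPCC and Farid are making is that this check runs on the image content itself; if messages are end-to-end encrypted, the platform never sees the bytes to hash, so the comparison cannot happen server-side.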
Farid said a rollout of encryption across all of Facebook’s messenger services could create a “safe harbour” for paedophiles to trade child abuse images. “It will kill it (PhotoDNA) in its tracks,” he said. “There is no question that this is spectacularly harmful to children and dangerous.
“I find it morally bankrupt that knowing the problem that you have and knowing there is a technological solution and then for a $67 billion company to say ‘it’s not my problem’. This is morally reprehensible and we should not allow this to happen.”
Facebook said that keeping young people safe on its services was its “top priority”. A spokesman for the tech giant said: “In addition to using technology to proactively detect grooming and prevent child sexual exploitation on our platform, we work with child protection experts, including specialist law enforcement teams like CEOP in the UK, to keep young people safe.”