New figures have revealed that more than 90% of the 69 million child sex abuse images reported by US tech companies come from Facebook.
It is believed that the images are sent through the Facebook Messenger app, to which Facebook says it intends to add end-to-end encryption in the future.
Facebook founder and chief executive Mark Zuckerberg says the encryption changes are designed to improve user privacy across all its platforms.
But fears are growing among law enforcement that end-to-end encryption used in this way, such as the system Facebook-owned WhatsApp already has in place, allows abuse to go undetected.
End-to-end encryption allows no one apart from the sender and recipient to read or modify messages sent through a platform.
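Conceptually, the guarantee can be sketched in a few lines. The snippet below is a toy illustration using a one-time-pad-style XOR with a shared key, not the key-agreement and ratcheting protocols messaging apps such as WhatsApp actually use; the function names are hypothetical.

```python
import os

def keygen(length: int) -> bytes:
    # Toy shared secret; real end-to-end systems derive keys via
    # Diffie-Hellman key agreement rather than sharing raw random bytes.
    return os.urandom(length)

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    # Only holders of `key` can recover the plaintext.
    return bytes(k ^ d for k, d in zip(key, data))

message = b"hello"
key = keygen(len(message))               # known only to sender and recipient
ciphertext = xor_cipher(key, message)    # all the platform ever relays or stores
plaintext = xor_cipher(key, ciphertext)  # recipient recovers the message
```

The point of the sketch is that the intermediary sees only `ciphertext`; without `key`, neither the platform nor anyone scanning its servers can read or alter the message, which is exactly why content-based abuse detection stops working.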
Seven countries (the US, Australia, New Zealand, Canada, India, Japan and the UK) published an international statement on Sunday (11 October) on end-to-end encryption and public safety.
The statement says that, although encryption can be a useful tool, it should not be used at the expense of “wholly precluding law enforcement, and the tech industry itself, from being able to act against the most serious illegal content and activity online”.
It also states that some uses of encryption “pose significant challenges” to public safety, including to “highly vulnerable members of our societies like sexually exploited children”.
Tech firms in the US made 16.9 million referrals to the National Center for Missing and Exploited Children (NCMEC) in 2019, including 69 million child abuse images.
Some 94% of the reports, which include the worst category of images, came from Facebook, Home Office officials said.
Robert Jones, the National Crime Agency director responsible for tackling child sexual abuse, commented: “The end-to-end encryption model that’s being proposed takes out of the game one of the most successful ways for us to identify leads, and that layers on more complexity to our investigations, our digital media, our digital forensics, our profiling of individuals and our live intelligence leads, which allow us to identify victims and safeguard them.
“What we risk losing with these content changes is the intelligence leads to pursue those offenders and rescue those children.”
In response, a Facebook spokesperson said: “We have long argued that end-to-end encryption is necessary to protect people’s most private information.
“In all these countries, people prefer end-to-end encrypted messaging on various apps because it keeps their messages safe from hackers, criminals, and foreign interference.
“Facebook has led the industry in developing new ways to prevent, detect, and respond to abuse while maintaining high security, and we will continue to do so.”
In the past, Facebook has taken steps to remove sex abuse images from its platform. In November last year, the firm released figures revealing that 11.6 million pieces of content related to child abuse were taken down between July and September 2019.
But these efforts were called into question after Zuckerberg announced the site’s “pivot to privacy”, the move to end-to-end encryption.
Zuckerberg acknowledged that, in some cases, there would be a “trade-off” that would benefit child sex abusers and other criminals.