Apple Admits Scanning Photos Uploaded to iCloud

Dominique Adams

The tech giant has confirmed that it automatically scans users’ images for illegal content such as child sexual abuse material.

Speaking at the Consumer Electronics Show (CES) 2020 in Las Vegas, Apple’s senior director of global privacy, Jane Horvath, admitted that the firm scans images uploaded to iCloud.

Horvath explained that the company does so to check whether the images contain illegal material such as child sexual abuse imagery.

The admission comes as technology companies face increasing pressure to do more to tackle crime. Apple in particular has come under scrutiny for its past refusal to break encryption on the devices of suspected and confirmed criminals.

Apple has also highlighted its commitment to protecting its customers’ privacy, which has been one of its key market differentiators.

Removing encryption was “not the way we’re solving these issues”, Horvath said. “We are utilising some technologies to help screen for child sexual abuse material,” she added.

She defended the company’s decision to encrypt iPhones, even though it makes them harder for law enforcement to unlock, saying: “End-to-end encryption is critically important to the services we come to rely on…health data, payment data.

“Phones are relatively small – they get lost and stolen. We need to make sure that if you misplace that device you’re not exposing that data.”

The disclaimer on Apple’s website concerning the scanning of images reads: “Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space.

“As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation.

“Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”

The company did not elaborate on what technology it uses to screen the images; however, other tech firms such as Google, Twitter and Facebook use a system called PhotoDNA, and it has been speculated that Apple may be using a similar system.

PhotoDNA works by computing a hash, a kind of digital fingerprint, of each scanned image and comparing it against a database of hashes of previously identified illegal photos.
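To illustrate the general idea only, the minimal sketch below uses a simple “average hash”: the image is shrunk to an 8x8 greyscale thumbnail and each pixel is compared to the mean brightness to produce a 64-bit fingerprint, which can then be matched against a list of known hashes by Hamming distance. This is not the proprietary PhotoDNA algorithm or anything Apple has described; the function names and the matching threshold are illustrative assumptions.

```python
# Illustrative sketch of hash-based image matching; NOT the PhotoDNA algorithm.
# Requires the Pillow library (pip install Pillow).
from PIL import Image


def average_hash(path: str) -> int:
    """Shrink the image to 8x8 greyscale and build a 64-bit fingerprint:
    each bit is 1 if the pixel is brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_database(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag the image if its hash is within `threshold` bits of any known hash.
    The threshold here is an arbitrary illustrative choice."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

A production system such as PhotoDNA reportedly uses a far more robust hash designed to survive resizing, cropping and recompression; the key design point is that only the hashes of known illegal images, not the images themselves, need to be stored and compared.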

Dominique Adams

Staff Writer, DIGIT
