Apple Backs Down on Its Controversial Photo-Scanning Plans

In August, Apple detailed several new features intended to stop the dissemination of child sexual abuse material. The backlash, from cryptographers and privacy advocates to Edward Snowden himself, was near-instantaneous, and largely tied to Apple’s decision not only to scan iCloud photos for CSAM but also to check for matches on your iPhone or iPad. After weeks of sustained outcry, Apple is standing down. At least for now.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple didn’t give any more guidance on what form those improvements might take, or how that input process might work. But privacy advocates and security researchers are cautiously optimistic about the pause.

“I think this is a smart move by Apple,” says Alex Stamos, former chief security officer at Facebook and cofounder of cybersecurity consulting firm Krebs Stamos Group. “There is an incredibly complicated set of trade-offs involved in this problem and it was highly unlikely that Apple was going to figure out an optimal solution without listening to a wide variety of equities.”

CSAM scanners work by generating cryptographic “hashes” of known abusive images—a sort of digital signature—and then combing through huge quantities of data for matches. Lots of companies already do some form of this, including Apple for iCloud Mail. But in its plans to extend that scanning to iCloud photos, the company proposed taking the additional step of checking those hashes on your device, as well, if you have an iCloud account.
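
To make the basic mechanism concrete, here is a minimal Swift sketch of the generic hash-and-match idea. It is not Apple’s actual design, which paired a perceptual hash (NeuralHash) with private set intersection and a match threshold; plain SHA-256 over file bytes is used here only to illustrate “hash the content, then look it up against a known list.” The hash value and function names are hypothetical placeholders.

```swift
import CryptoKit
import Foundation

// Hypothetical hash list standing in for the one a clearinghouse such as
// NCMEC would supply. Real systems use perceptual hashes, not SHA-256.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]

// Hex-encoded SHA-256 digest of an image file's contents.
func digest(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// URLs whose digests appear in the known-hash list.
func matches(in photoURLs: [URL]) -> [URL] {
    photoURLs.filter { url in
        guard let hash = try? digest(of: url) else { return false }
        return knownHashes.contains(hash)
    }
}
```

The controversy was less about this matching step than about where it runs: Apple proposed performing it on the device itself rather than only on its servers.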

The introduction of that ability to compare images on your phone against a set of known CSAM hashes—provided by the National Center for Missing and Exploited Children—immediately raised concerns that the tool could someday be put to other use. “Apple would have deployed to everyone’s phone a CSAM-scanning feature that governments could, and would, subvert into a surveillance tool to make Apple search people’s phones for other material as well,” says Riana Pfefferkorn, research scholar at the Stanford Internet Observatory.

Apple has in the past resisted multiple United States government requests to build a tool that would allow law enforcement to unlock and decrypt iOS devices. But the company has also made concessions to countries like China, where customer data lives on state-owned servers. At a time when legislators around the world have ramped up efforts to undermine encryption more broadly, the introduction of the CSAM tool felt especially fraught.

“They clearly feel this is politically challenging, which I think shows how untenable their ‘Apple will always refuse government pressure’ position is,” says Johns Hopkins University cryptographer Matthew Green. “If they feel they must scan, they should scan unencrypted files on their servers,” which is standard practice at other companies like Facebook, which regularly scans not only for CSAM but also for terrorist content and other disallowed material. Green also suggests that Apple make iCloud storage end-to-end encrypted, so that it can’t view those images even if it wanted to.
