
Monday, August 9, 2021

Apple puts out six-page FAQ on child abuse photo-scanning tech - CNET

(Image: iPhone 12 in purple. Sarah Tew/CNET)

Apple made waves last week when it announced that its upcoming software updates will scan photos stored in the Photos app on people's iPhones, iPads, Mac computers and Apple Watches for child sexual abuse material. On Monday, the company put out a new document hoping to allay privacy concerns.

The six-page document, called "Expanded Protections for Children," is a frequently asked questions guide on the forthcoming feature. 

"At Apple, our goal is to create technology that empowers people and enriches their lives," the company writes in its opening overview, "while helping them stay safe. We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)."

After acknowledging that some people have had concerns about how it will do this, the company says it put together the document to "address these questions and provide more clarity and transparency in the process." 

Apple says that the CSAM protection, which scans photos, "is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images." It adds that even possessing those images is "illegal" in most countries, including the US. 
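To make the "known CSAM images" idea concrete, here is a minimal sketch of checking photos against a database of known hashes. It is purely illustrative and not Apple's implementation: Apple's published technical summary describes a perceptual hash (NeuralHash), so near-duplicate images still match, compared on-device through a private set intersection protocol, whereas this sketch uses a plain SHA-256 set lookup, and every name in it is hypothetical.

```python
# Minimal, hypothetical sketch of "match against known images" -- NOT
# Apple's system, which uses a perceptual hash (NeuralHash) and private
# set intersection rather than a plain hash-set lookup.
import hashlib
from pathlib import Path

# Hypothetical set of hex digests of known images. In the real system
# the database comes from child-safety organizations and is only ever
# present on the device in a blinded, unreadable form.
KNOWN_IMAGE_DIGESTS: set = set()

def image_digest(path: Path) -> str:
    """Hash the raw bytes of one image file (exact-match only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """True only when this exact file is in the known set; photos that
    don't match reveal nothing beyond the fact of a non-match."""
    return image_digest(path) in KNOWN_IMAGE_DIGESTS
```

Note that a cryptographic hash like SHA-256 only catches byte-identical files; a perceptual hash of the kind Apple describes also catches resized or re-encoded copies, which is why the two are not interchangeable in practice.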

The company adds that the feature will only impact those who use iCloud Photos to store their pictures and "does not impact users who have not chosen to use iCloud Photos." 

Apple says that the feature will not have any "impact to any other on-device data" and that it "does not apply to Messages." It also stresses that it will refuse any demands from governments looking to expand the feature to include non-CSAM images. 

"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future," the company writes. 

As for the risk of misidentifying people, Apple says that "the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year," with the company conducting a "human review" before sending any report to the National Center for Missing and Exploited Children (NCMEC). Apple concludes that "system errors or attacks will not result in innocent people being reported to NCMEC."
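The "one in one trillion" figure is an account-level claim, and it becomes plausible when a system requires several independent matches before any account is flagged for human review. Here is a rough back-of-the-envelope model with entirely hypothetical numbers; Apple has not published its per-photo error rate, and real false matches are unlikely to be perfectly independent:

```python
from math import comb

def false_flag_probability(n_photos: int, p_false: float, threshold: int) -> float:
    """P(at least `threshold` false matches among `n_photos`), modeling
    each photo as an independent Bernoulli trial (binomial tail)."""
    return sum(
        comb(n_photos, k) * p_false**k * (1 - p_false)**(n_photos - k)
        for k in range(threshold, n_photos + 1)
    )

# Hypothetical inputs: a 1,000-photo library, a one-in-a-million
# per-photo false-match rate, and 10 matches required before review.
print(false_flag_probability(1_000, 1e-6, 10))  # on the order of 1e-37
```

The point is qualitative rather than the exact number: stacking a multi-match threshold on top of a low per-photo error rate, and then adding a human review step, is what allows such a strong account-level claim.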
