"It's about the children"
-
Apple will secretly scan your photos.
Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.
"Apple's method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind," Apple's announcement said. "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."
Apple provided more detail on the CSAM detection system in a technical summary and said its system uses a threshold "set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
The changes will roll out "later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey," Apple said. Apple will also deploy software that can analyze images in the Messages application for a new system that will "warn children and their parents when receiving or sending sexually explicit photos."
-
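Going by the technical summary, the device-side flow reduces to set membership plus a threshold. Here is a minimal sketch in Python of that shape, not Apple's implementation: the real system uses NeuralHash, a blinded hash database the device cannot read, and threshold secret sharing, and every name and number below is illustrative.

```python
# Sketch of threshold-based hash matching as the technical summary
# describes it. Illustrative only: the real system uses NeuralHash,
# a *blinded* database the device cannot inspect, and threshold
# secret sharing so nothing is learned until the threshold is crossed.

def count_matches(photo_hashes: set[bytes], known_hashes: set[bytes]) -> int:
    """Count photos whose perceptual hash appears in the shipped database."""
    return len(photo_hashes & known_hashes)

def flag_account(photo_hashes: set[bytes],
                 known_hashes: set[bytes],
                 threshold: int = 30) -> bool:
    """Report an account only once matches cross the threshold.
    30 is a placeholder, not a number Apple has published."""
    return count_matches(photo_hashes, known_hashes) >= threshold
```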
Since the system ultimately relies on an external database as its reference for what counts as offending material, one can imagine a state power manipulating that database to influence what gets flagged.
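The mechanics make that concrete: the matcher is indifferent to what the hashes represent, so swapping the database swaps the targets. Continuing the sketch above, with made-up hash values:

```python
photos = {b"h1", b"h2", b"h3"}   # stand-ins for a user's photo hashes

# Same code, different database, different outcome; the device has no
# way to audit what an opaque hash set actually represents.
print(flag_account(photos, {b"h2", b"h3"}, threshold=2))  # True
print(flag_account(photos, {b"h7", b"h8"}, threshold=2))  # False
```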
And a “one in a trillion” false-positive rate sounds like a way of saying “we use 40-bit hashes,” which is neither impressive nor reassuring as far as accuracy goes.
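The arithmetic behind that quip: 2^40 is about 1.1 trillion, so “one in a trillion” is roughly the odds of a uniformly random 40-bit hash hitting a single target.

```python
# 2**40 = 1,099,511,627,776, i.e. about 1.1 trillion; "one in a
# trillion" is roughly 40 bits of entropy against a single target.
print(f"2**40 = {2**40:,}")
```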
-
The Wall Street Journal does an “interview” with Apple software chief Craig Federighi to “clarify” the new features Apple is rolling out to detect child pornography on devices (embedded video):
https://www.wsj.com/video/series/joanna-stern-personal-technology/apples-software-chief-explains-misunderstood-iphone-child-protection-features-exclusive/573D76B3-5ACF-4C87-ACE1-E99CECEFA82C

I put “interview” and “clarify” in quotes because I don’t think the WSJ asks tough enough questions to really sketch out the risks to consumer privacy in what Apple is doing. I would much rather have an Electronic Frontier Foundation lawyer and engineer hash it out with Apple’s system architects and engineers.