Here’s why Apple’s new child safety updates are so controversial

In a briefing last week, Apple announced three changes, to be released later this year, aimed at protecting children from sexual abuse. The changes span iOS, macOS, watchOS, and iMessage, and each targets a different application with a different feature set for detecting potential child abuse imagery. The features will first roll out in the US, with the goal of limiting the spread of CSAM (child sexual abuse material) online while preserving user privacy.

Siri does more than ever and will offer resources on how to report child abuse. iMessage will no longer be a fully sealed message tunnel: it will flag nudes sent or received by children under the age of 13 and alert their parents. In addition, images backed up to iCloud Photos will be matched against a database of known child sexual abuse material (CSAM), and if more than a certain number of images match, the account will be reported to the National Center for Missing and Exploited Children (NCMEC).
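The matching-and-threshold logic described above can be sketched in simplified form. This is an illustration only, not Apple's implementation: Apple's system uses a perceptual hash (NeuralHash) and cryptographic threshold techniques, whereas this sketch substitutes an ordinary SHA-256 hash and a plain counter, and the database entries and threshold value are placeholders.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real perceptual hash matches
    visually similar images; SHA-256 only matches byte-identical ones."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-CSAM hashes (arbitrary placeholder bytes).
KNOWN_HASHES = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

# Illustrative threshold: the account is flagged only after this many matches.
REPORT_THRESHOLD = 2

def should_report(uploaded_images: list[bytes]) -> bool:
    """Count uploads whose hash appears in the known database and
    flag the account only once the threshold is crossed."""
    matches = sum(1 for img in uploaded_images if image_hash(img) in KNOWN_HASHES)
    return matches >= REPORT_THRESHOLD
```

The threshold is the key privacy argument: a single match reveals nothing, and review is triggered only when an account accumulates multiple matches against the known database.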


Although Apple claims the system is designed to be far more private because it scans images on your phone rather than on a server, security researchers and privacy advocates have raised several concerns. Apple's new technology essentially allows the iPhone's operating system to examine your photos and match them against a database of illegal content. That capability cannot be removed, which marks a major reversal for Apple, a company that five years ago refused the FBI's request to unlock a phone and later put up a billboard declaring, "What happens on your iPhone stays on your iPhone." By building a system that can proactively scan your images for illegal material and refer matches to law enforcement, Apple contradicts its own slogan. It also opens avenues for law enforcement agencies and governments. Right now the focus is child safety, but what happens when Apple is pressed by authorities in Europe, China, or the U.S. to expand what the system looks for?
