
Apple automatically analyzes photos uploaded to iCloud for possible child sexual abuse

Image: The Telegraph

Photos in iCloud are encrypted, and in principle no one but the owner of the iCloud account can access them. That does not mean, however, that the service is a safe haven for any kind of content. An automated Apple system analyzes uploaded content to find possible child sexual abuse material in photos stored in iCloud. Everything points to PhotoDNA being used for this, a Microsoft system that other major technology companies use for the same purpose.


According to statements made at CES 2020 by Jane Horvath, Apple's privacy director, Apple scans images uploaded to iCloud from devices such as the iPhone or iPad to check for possible child abuse material. She also spoke about privacy issues related to the access the FBI has requested.

Horvath did not go into much detail about how this check is performed; she only indicated that certain technologies are used to carry it out automatically.

Other major technology companies such as Facebook or Google are known to use PhotoDNA, an automatic detection system capable of identifying known child sexual abuse images. Apple may also use PhotoDNA, although there is nothing official about it.
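PhotoDNA itself is proprietary and its exact algorithm is not public, but the general technique behind it is perceptual hashing: compute a compact fingerprint of each image and compare it against a database of fingerprints of already-identified abuse material. The sketch below illustrates the idea in Python with a simple "average hash"; this is not Microsoft's algorithm, and `KNOWN_HASHES` is a hypothetical stand-in for the hash databases that in reality are maintained by organizations such as NCMEC.

```python
# A minimal sketch of perceptual-hash matching, the general technique behind
# systems like PhotoDNA. NOT Microsoft's actual (proprietary) algorithm;
# a simple "average hash" is used purely for illustration.
# Assumes Pillow is installed (pip install Pillow).

from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink the image to hash_size x hash_size grayscale pixels and
    encode each pixel as 1 if it is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known illegal images.
KNOWN_HASHES: set = set()

def matches_known_content(path: str, threshold: int = 5) -> bool:
    """Flag an image whose hash is within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

The key design property is that matching happens against fingerprints of already-known images, so the system never needs to interpret what a photo depicts, and small alterations such as recompression or resizing still fall within the Hamming-distance threshold.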

What Jane Horvath emphasized is that user privacy is still important (in fact, she made one of her most notable appearances on this subject at last year's CES) and that end-to-end encryption is always present. End-to-end encryption means that only the user and their devices can access the content; everywhere else it remains locked, and the keys to unlock it exist only on the user's own devices.
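As an illustration of what that means in practice, here is a minimal sketch of client-side encryption in Python using the third-party `cryptography` package: the key is generated and kept on the device, so the server only ever stores ciphertext. The names are hypothetical and Apple's actual key management is far more elaborate; this only shows the basic property that whoever lacks the device key cannot read the content.

```python
# A minimal sketch of client-side encryption: the key never leaves the
# device, so the cloud provider stores only ciphertext it cannot decrypt.
# Assumes the `cryptography` package is installed (pip install cryptography).

from cryptography.fernet import Fernet

# Key generated and stored on the user's device only (e.g. in a keychain).
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

plaintext = b"...raw photo bytes..."          # the photo on the device
ciphertext = cipher.encrypt(plaintext)        # what gets uploaded to the cloud

# The server never sees `device_key`, so it cannot recover the plaintext.
# Decryption is only possible on a device that holds the key:
assert cipher.decrypt(ciphertext) == plaintext
```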

The truth is that there should be no surprise here. Although Apple does not say since when it has been scanning images for child sexual abuse material, the practice is covered in its terms of use.


The important thing here is that user privacy continues to be respected. Finding illegal content without violating the privacy and security of all users is certainly not easy, but it seems Apple has a system for it, and one that, at least so far, has worked.
