This is how Apple finds child abuse images in iCloud

While security is a key issue for Apple, it applies not only to its customers but also extends to protocols designed to avoid harming innocent people. As Apple’s privacy director confirmed during CES 2020 in Las Vegas, the company uses detection technology to search for illegal images.

While Apple did not at the time delve into the algorithms it uses to achieve a match that would alert it to such content, a search warrant issued on behalf of Homeland Security Investigations has provided insight into how Apple detects and reports child abuse images uploaded to iCloud or sent through its email servers.


According to information published by 9to5Mac, the company performs an automated review, similar to the one performed by most technology companies: each image is identified with a digital signature, or hash, which is then checked against the hashes of known illegal images.
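The hash-matching step described above can be sketched as a simple lookup: compute a signature for each uploaded image and check it against a database of known signatures. This is only an illustrative sketch, not Apple’s actual implementation; the hash values and the `KNOWN_HASHES` set below are hypothetical, and real systems use perceptual hashes (such as PhotoDNA) that tolerate resizing and re-encoding, whereas a cryptographic hash stands in here for simplicity.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# (This entry is the SHA-256 of the placeholder bytes b"foo".)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_hash(data: bytes) -> str:
    """Compute a digital signature (hash) for an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_image(data: bytes) -> bool:
    """Flag an upload only if its hash matches a known image's hash."""
    return image_hash(data) in KNOWN_HASHES

print(matches_known_image(b"foo"))  # True: hash is in the database
print(matches_known_image(b"bar"))  # False: unknown content is never flagged
```

Because the lookup compares signatures rather than image content, images that match nothing in the database pass through without inspection.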

According to the statement Homeland Security Investigations gave to Forbes, once an image is detected, the next step involves communicating with the appropriate authority, which in the United States is the National Center for Missing and Exploited Children. However, prior to this communication, Apple is known to perform a manual confirmation, and only then does it provide law enforcement agencies with the name, address, and mobile phone number associated with the corresponding Apple ID.

According to an Apple employee who works in the area that evaluates these images, this verification is triggered the first time an iCloud user uploads “several images of suspected child pornography”. He also pointed out that Apple only examines images that have already been matched against the hash of a known image, so the risk of Apple intercepting and viewing innocent images is very low, and there is protection against a hash error.
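The review gate the employee describes can be sketched as two conditions: a human reviewer sees an image only if its hash matched a known image, and only once an account has accumulated several such matches. This is a hypothetical sketch; the threshold value is an assumption (the article says only “several images”), not a figure Apple has disclosed.

```python
# Assumed stand-in for "several images"; Apple's real threshold is not public.
REVIEW_THRESHOLD = 3

def should_queue_for_manual_review(hash_matched: bool, prior_matches: int) -> bool:
    """Gate manual review on a hash match plus an accumulated match count."""
    if not hash_matched:
        # Images that match no known hash are never inspected by a person.
        return False
    return prior_matches + 1 >= REVIEW_THRESHOLD

print(should_queue_for_manual_review(False, 10))  # False: no hash match
print(should_queue_for_manual_review(True, 1))    # False: below threshold
print(should_queue_for_manual_review(True, 2))    # True: threshold reached
```

Gating on both conditions is what keeps innocent images out of human review while still allowing a second check before any report is filed.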

This process shows that Apple is careful with its customers’ personal information: it does not act on the mere presumption of a crime, and it has a defined process to minimize the risk of false accusations.
