
Apple will scan your iPhone and iCloud for child abuse photos

by Vyncent Chan, August 6, 2021

Apple is rolling out new software for iOS and iPadOS that will scan your device for child sexual abuse material and report it to law enforcement. Apple will perform on-device matching of all the images on your device before they get uploaded to iCloud, to identify any child sexual abuse material (CSAM).
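
To picture what "on-device matching" means, here is a minimal Python sketch. It uses a toy average-hash in place of Apple's actual perceptual hashing (Apple calls its algorithm NeuralHash), and the known-hash database is a placeholder; it only illustrates the idea of fingerprinting an image locally and checking it against a list of known hashes before upload.

# A minimal sketch of on-device image matching, not Apple's actual algorithm.
# Apple's system uses a neural perceptual hash (NeuralHash); the toy
# average-hash below only illustrates fingerprinting an image and checking
# that fingerprint against a database of known CSAM hashes before upload.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: downscale, grayscale, threshold at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Hypothetical database of known-bad hashes (placeholder values only).
KNOWN_HASHES = {0x0123456789ABCDEF, 0xFEDCBA9876543210}

def matches_known_material(path: str) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return average_hash(path) in KNOWN_HASHES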

This is powered by a technology called private set intersection, which checks for matches against the database of known CSAM hashes without revealing the result. Your device then creates a cryptographic safety voucher that contains the match result along with other encrypted information about the image, and this voucher gets uploaded to iCloud.
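
As a rough illustration of what a safety voucher looks like in shape, the Python sketch below bundles an encrypted match result with an image identifier. This is not Apple's actual construction: in the real protocol, private set intersection produces the voucher so that even your own device never learns the match result, and the key below is purely a placeholder for whatever the server-side protocol holds.

# A simplified picture of a per-image "safety voucher", not Apple's actual
# construction; the key handling here is a hypothetical placeholder.
import json
from cryptography.fernet import Fernet

psi_derived_key = Fernet.generate_key()   # placeholder, not the real scheme
cipher = Fernet(psi_derived_key)

def make_safety_voucher(image_id: str, match_result: bool) -> bytes:
    """Bundle the encrypted match result and image metadata for upload."""
    payload = json.dumps({"image_id": image_id, "match": match_result})
    return cipher.encrypt(payload.encode())

# The voucher is uploaded to iCloud alongside the photo; it can only be
# opened server-side, and only once the threshold described below is crossed.
voucher = make_safety_voucher("IMG_0042.HEIC", match_result=False)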

Then another technology called “threshold secret sharing” ensures that Apple cannot access the data in the safety vouchers unless the account crosses a threshold of known CSAM content, at which point the account is flagged. Apple will then review the account manually and, if the match is confirmed, disable the account and send a report to the National Center for Missing and Exploited Children (NCMEC). Users can file an appeal if they believe their account has been wrongly flagged.
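
The "threshold" behaviour can be understood through textbook Shamir secret sharing: any single share reveals nothing, but once enough shares are collected the secret can be reconstructed. The Python demo below shows only that underlying concept, not Apple's implementation; the secret here stands in for the key needed to open the safety vouchers.

# Minimal Shamir-style threshold secret sharing demo (textbook concept only).
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

secret = 123456789                              # stand-in for the voucher key
shares = make_shares(secret, threshold=3, count=5)
assert recover(shares[:3]) == secret            # 3 shares: reconstruction works
assert recover(shares[:2]) != secret            # 2 shares: below threshold, fails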

Apple promises that this is entirely secure and that it will not have access to your personal data unless, of course, you cross the threshold of known CSAM content. Below that threshold, Apple cannot access your data at all.

[Image: Apple CSAM Messages]

On top of the CSAM detection for iCloud, Apple will also be adding more safety features to the Messages app, which can warn children and their parents when they receive or send sexually explicit images. This uses on-device machine learning to analyze image attachments and determine whether they are sexually explicit. The feature is coming to accounts set up as families in iCloud on iOS 15, iPadOS 15 and macOS Monterey.
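
A hedged sketch of how such an on-device check might gate an incoming attachment is shown below. The classifier function is a stand-in for Apple's non-public model, and the decision flow simply mirrors the article's description (blur the image, warn the child, and optionally notify the parents).

# A hedged sketch of an on-device check gating a Messages attachment.
# looks_sexually_explicit() is a stand-in for Apple's non-public model.
def looks_sexually_explicit(image_bytes: bytes) -> bool:
    """Placeholder for Apple's on-device machine learning classifier."""
    return False  # stub: the real model runs entirely on the device

def handle_incoming_attachment(image_bytes: bytes, is_child_account: bool,
                               parental_alerts_enabled: bool) -> str:
    """Decide what Messages should do with an incoming image attachment."""
    if is_child_account and looks_sexually_explicit(image_bytes):
        if parental_alerts_enabled:
            return "blur image, warn child, notify parents"
        return "blur image, warn child"
    return "show image normally"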

[Image: Apple CSAM Siri]

Last but not least, Siri and Search will intervene when users search for CSAM-related queries, and will point them to resources to get help. Siri will also be able to provide resources to children and parents who ask how to report CSAM or child exploitation.

Overall, it does seem like Apple is taking a step in the right direction to protect kids. Hopefully Apple really lives up to its promise of privacy protection, as CSAM detection could quite easily be misused when Apple is scanning the images you receive and send in Messages, as well as your iCloud uploads.

Source

Pokdepinion: I guess there’s nothing to worry about if you aren’t in the wrong?

