Apple has announced that it will scan iPhones in the United States to detect Child Sexual Abuse Material (CSAM). As part of this effort, Apple is partnering with government and child-protection bodies and making changes to iCloud, iMessage, Siri, and Search. Will these changes let Apple scan every image on your phone?
To this end, Apple will introduce a feature to detect and report child sexual abuse images on iOS devices such as the iPhone and iPad. Apple says the move is intended to protect minors and curb the spread of such images.
However, security experts say such a feature could open the door to surveillance and data-leakage risks, and some digital-rights organizations have warned that it could be misused by governments and other organizations.
When users upload images to iCloud, the system compares them against a database of known child sexual abuse images maintained by child-protection organizations (the NCMEC database). Apple does not perform this check in the cloud; it runs on your iPhone. Before an image is sent to iCloud storage, an on-device algorithm computes a hash of the image and checks it against the list of known CSAM hashes.
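The on-device check described above can be sketched roughly as follows. This is a minimal illustration, not Apple's implementation: the function names are hypothetical, a plain SHA-256 stands in for Apple's perceptual NeuralHash (which matches visually similar images, not just byte-identical ones), and Apple's real protocol uses cryptographic techniques so the device never learns which images matched.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 only matches
    # byte-identical files, unlike Apple's NeuralHash.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes, known_hashes: set) -> bool:
    # Hypothetical pre-upload check: hash the image on the device
    # and test it against the blocklist of known CSAM hashes.
    return image_hash(image_bytes) in known_hashes

# Illustrative usage with made-up data:
blocklist = {image_hash(b"known-flagged-image")}
print(matches_known_csam(b"known-flagged-image", blocklist))  # True
print(matches_known_csam(b"ordinary-photo", blocklist))       # False
```

In the real system, a match does not immediately trigger a report; Apple said an account is flagged only after a threshold number of matches is reached.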
Note that the scan does not run if iCloud Photos sync is turned off on your iPhone.
In iMessage, Apple scans images and blurs those flagged as sexually explicit. In addition, when a child views such a picture, their parents can be notified so they can take appropriate action.
Does Apple Scan Every Image? How the iPhone Scan for Child Abuse Works
If a child tries to send such a picture, they will be warned, and if they proceed anyway, a notice will be sent to their parents. Note that parents are notified only if the child is under 13 years of age; teens aged 13-17 receive an alert notification only on their own iPhones.
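The age-based flow above can be sketched as a simple decision function. This is an illustrative summary of the rules described in the text, not Apple's code; the function name and return labels are made up, and only the age thresholds come from the announcement.

```python
def imessage_explicit_send_action(age: int, proceeds_anyway: bool) -> str:
    # Hypothetical sketch of the iMessage warning flow.
    if not proceeds_anyway:
        # The child is warned before sending and stops.
        return "warn_child"
    if age < 13:
        # Under 13: parents receive a notice.
        return "notify_parents"
    # Ages 13-17: the alert stays on the child's own device.
    return "alert_on_device_only"

print(imessage_explicit_send_action(10, True))   # notify_parents
print(imessage_explicit_send_action(15, True))   # alert_on_device_only
print(imessage_explicit_send_action(10, False))  # warn_child
```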
Apple will report such images to the National Center for Missing & Exploited Children (NCMEC) in the United States.
But Apple has made it clear that these measures are designed to protect users' privacy.