Just last month, Apple announced a new CSAM detection feature that would scan photos uploaded to iCloud as part of a broader set of child safety features. The announcement proved controversial, and Apple has now put the rollout on hold while it works to improve the feature before releasing it.
Apple Says It Has Taken This Decision Based on Feedback From Various Groups
Apple has provided 9to5Mac with the following statement, saying that feedback from various groups has prompted it to put CSAM detection on hold for the time being.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple’s CSAM detection was set to launch as a feature of iOS 15, iPadOS 15, and macOS Monterey later this year. With the delay, the company has only said that it will continue to improve the feature, without giving a timeline for when the refined version will ship. To bring you up to speed, here is how the feature was designed to work, as reported by 9to5Mac.
“Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
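To make the quoted description a little more concrete, here is a minimal Swift sketch of the on-device flow it outlines: hash the image, check it against the locally stored hash set, and wrap the result in an encrypted voucher that travels with the upload. Everything in it is an assumption for illustration only; the type names are invented, SHA-256 stands in for Apple's perceptual hash, and a plain set lookup stands in for real private set intersection.

```swift
import Foundation
import CryptoKit

// Purely illustrative sketch -- not Apple's implementation. Names, types, and
// crypto choices here are assumptions: Apple's system uses a perceptual hash
// (NeuralHash), a blinded hash database, private set intersection, and
// threshold secret sharing rather than the simplifications below.

/// Stand-in for the encrypted "safety voucher" uploaded alongside each image.
struct SafetyVoucher {
    let encryptedPayload: Data  // match result plus image data, opaque to the device
}

/// Hypothetical on-device matcher holding the transformed hash database.
struct OnDeviceMatcher {
    let knownHashes: Set<Data>  // the "unreadable set of hashes" stored on the device

    /// Produces a voucher for an image before it is stored in iCloud Photos.
    /// In the real protocol, private set intersection ensures the device
    /// never learns whether the image actually matched.
    func makeVoucher(for imageData: Data, serverKey: SymmetricKey) throws -> SafetyVoucher {
        // Placeholder digest; Apple uses a perceptual hash, not SHA-256.
        let imageHash = Data(SHA256.hash(data: imageData))
        let matched = knownHashes.contains(imageHash)

        // Encode the match result and seal it so only the server-side process
        // (and only past a threshold of matches) could ever interpret it.
        let payload = try JSONEncoder().encode(["matched": matched])
        let sealedBox = try AES.GCM.seal(payload, using: serverKey)
        guard let combined = sealedBox.combined else {
            throw CryptoKitError.incorrectParameterSize
        }
        return SafetyVoucher(encryptedPayload: combined)
    }
}
```

Again, this is only meant to make the quoted description easier to follow; the actual cryptography involved is considerably more elaborate.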
The feature was met with severe criticism from privacy advocates, but Apple initially stood firm, arguing that its approach was more privacy-preserving than the technology used by giants such as Google and Facebook. With this latest update, the rollout is on hold indefinitely, and there is no telling when we will see the refined version of CSAM detection. Regardless, we will keep you in the loop, so stay tuned.
News Source: 9to5Mac