Apple on Friday said it will delay the rollout of its plan to commandeer customers’ own devices to scan their iCloud-bound photos for illegal child exploitation imagery, a concession to the broad backlash the initiative provoked.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” the company said in a statement posted to its child safety webpage.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple – rather than actually engaging with the security community and the public – published a list of Frequently Asked Questions and responses to address the concern that censorious governments would demand access to the CSAM scanning system to look for politically objectionable images.
“Could governments force Apple to add non-CSAM images to the hash list?” the company asked in its interview of itself, and then responded, “No. Apple would refuse such demands and our system has been designed to prevent that from happening.”
Apple, however, has not refused government demands in China with regard to VPNs or censorship, nor in Russia with regard to that country’s 2019 law requiring pre-installed Russian apps.
Tech companies uniformly say they comply with all local laws. So if China, Russia, or the US were to pass a law requiring on-device scanning to be adapted to address “national security concerns” or some other plausible cause, Apple’s choice would be to comply or face the consequences – it would no longer be able to say, “We can’t do on-device scanning.”