The European Commission has proposed a controversial new regulation that would require chat apps like WhatsApp and Facebook Messenger to selectively scan users’ private messages for child sexual abuse material (CSAM) and “grooming” behavior. The proposal is similar to plans mooted by Apple last year but, say critics, much more invasive.
After a draft of the regulation leaked earlier this week, privacy experts condemned it in the strongest terms. “This document is the most terrifying thing I’ve ever seen,” tweeted cryptography professor Matthew Green. “It describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration.”
Jan Penfrat of digital advocacy group European Digital Rights (EDRi) echoed the concern, saying, “This looks like a shameful general #surveillance law entirely unfitting for any free democracy.” (A comparison of the PDFs shows differences between the leaked draft and final proposal are cosmetic only.)
The regulation would establish a number of new obligations for “online service providers” — a broad category that includes app stores, hosting companies, and any provider of “interpersonal communications service.”
The most extreme obligations would apply to communications services like WhatsApp, Signal, and Facebook Messenger. If a company in this group receives a “detection order” from the EU, it would be required to scan select users’ messages for known child sexual abuse material as well as previously unseen CSAM and any messages that may constitute “grooming” or the “solicitation of children.” These last two categories of content would require the use of machine vision tools and AI systems to analyze the context of pictures and text messages.
“The proposal creates the possibility for [the orders] to be targeted but doesn’t require it,” Ella Jakubowska, a policy advisor at EDRi, told The Verge. “It completely leaves the door open for much more generalized surveillance.”