
European Commission presents child safety plan

By Chad J. Johnson

May 11, 2022

The European Commission unveiled a new plan to tackle child sexual exploitation material on Wednesday, and it is already drawing a backlash from privacy experts who say it would create an invasive new surveillance regime in Europe.


The proposal would require tech companies operating in Europe to scan their platforms and products for CSAM and report their findings to law enforcement. Many tech companies already do this in one form or another, using hashed versions of known CSAM to automatically block uploads of matching content. But the European plan would go further, allowing EU countries to ask courts to require tech companies to also search for and report new instances of CSAM. The plan also proposes using AI to detect language patterns associated with grooming.
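To make the hash-matching approach described above concrete, here is a minimal sketch in Python. It compares a plain SHA-256 digest of an upload against a hypothetical set of known hashes; real deployments typically rely on perceptual hashing (such as Microsoft's PhotoDNA) so that resized or re-encoded copies still match, and every name and value below is an illustrative assumption, not any platform's actual implementation.

```python
import hashlib

# Hypothetical set of SHA-256 digests of known illegal material. In practice,
# hash lists come from clearinghouses such as NCMEC and use perceptual hashes.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def matches_known_content(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears in the known-content set."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

# A platform could run this check at upload time and block or report matches.
if matches_known_content(b"example upload"):
    print("Match found: block the upload and file a report.")
```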

“We are failing to protect children today,” EU Home Affairs Commissioner Ylva Johansson told a news conference.

But critics say these requirements would risk breaking end-to-end encryption and force companies to scrutinize all users’ personal communications. “This document is the most terrifying thing I’ve ever seen,” Matthew Green, an associate professor at the Johns Hopkins Information Security Institute, tweeted after a draft of the proposal leaked. “Once you open up ‘machines reading your text messages’ for any purpose, there are no limits.”

“Today is the day the European Union declares war on #end-to-end encryption and demands access to everyone’s private messages on any platform in the name of child protection,” tweeted Alec Muffett, a leading security expert and former Facebook software engineer.

What the EU is proposing sounds a bit like Apple’s child safety plan, which the company introduced last summer only to retract a few months later. At the time, Apple said it would scan iMessages for users under 17 and notify them if they were about to send or receive what Apple’s systems deemed “sexually explicit” images. If those children were under 13 and enrolled in family plans, Apple would also have notified their parents or guardians. Apple also planned to scan iCloud content for known CSAM and alert the National Center for Missing and Exploited Children, or NCMEC, when it detected more than a certain threshold of such content in a single account.
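To illustrate the threshold idea in Apple’s iCloud plan, here is a rough Python sketch of the counting logic only. The account store, function names, and the numeric threshold are assumptions for illustration; Apple’s actual design relied on cryptographic techniques (private set intersection with threshold secret sharing) rather than a simple server-side counter.

```python
from collections import defaultdict
from typing import DefaultDict

# Assumption: the article only says "a certain threshold"; this value is illustrative.
REPORT_THRESHOLD = 30

# Hypothetical in-memory count of known-hash matches per account.
match_counts: DefaultDict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one known-hash match for an account.

    Returns True once the account's total crosses the reporting threshold,
    which is the point at which a report to NCMEC would be triggered.
    """
    match_counts[account_id] += 1
    return match_counts[account_id] >= REPORT_THRESHOLD

# Example: only the final match in this run crosses the threshold.
flagged = False
for _ in range(REPORT_THRESHOLD):
    flagged = record_match("account-123")
print("report triggered" if flagged else "below threshold")
```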

But Apple put the plan on hold following fierce opposition from privacy groups, as well as LGBTQ+ youth advocates, who said children could be put at even greater risk if Apple outed them to abusive authority figures.

“It’s Apple again,” Green tweeted.

Parts of the plan may prove less controversial, such as the creation of a European counterpart to NCMEC, which serves as a major repository of known CSAM in the United States. The need to combat CSAM is, after all, urgent and growing, and it is essential that companies coordinate their efforts.

It may take years before a final version of the proposal is approved by member states and the European Parliament. Until then, tech giants and privacy groups are likely to fight it with all they have.