Brussels is bracing for one of its biggest and most emotional tech fights yet as companies face stringent new rules to clamp down on sexual abuse material.
The Commission is expected to release a draft law this week that could require digital companies like Meta Platforms, Google and Apple to detect, remove and report illegal images of abuse to law enforcement under threat of fines.
According to a leak of the proposal obtained by POLITICO on Tuesday, the Commission said voluntary measures taken by some platforms have so far “proven insufficient” to address the misuse of online services for the purposes of child sexual abuse.
The rulebook comes as child protection hotlines report record amounts of disturbing content circulating online during the coronavirus pandemic. Europe is a hot spot for hosting such content, with 62 percent of the world’s illegal images located on European data servers in 2021.
The law has already been delayed by a year due to complex negotiations over a temporary bill clarifying that tech companies can voluntarily check for child abuse material on their platforms. There was also internal pushback within the Commission over concerns about how the legislation would affect privacy.
After months of lobbying, tech companies and children’s rights organizations are anxiously waiting to see how drastic the rules will be.
Home Affairs Commissioner Ylva Johansson has been bullish about finding ways to crack down on abusive online content, including photos, livestreams and conversations where offenders groom kids to film themselves.
“There’s a mountain of undetected suffering underneath and if it’s voluntary, companies can change their policies whenever they like, which is why I want to make detection of child sexual abuse mandatory,” said the Swedish politician on April 25 at an event organized by a child protection group, WeProtect Global Alliance.
Digital rights activists are deeply worried that the rules could severely weaken privacy, at a time when law enforcement and national governments are pushing hard to find ways around encrypted messaging services.
All eyes this week will be on the requirements for digital companies to look for illegal pictures of sexual abuse amid the vast amounts of content circulating on their platforms and in conversations on apps and messaging services.
“On the one hand, we’re excited to hear what the alternatives are that the Commission would like to suggest, but on the other hand, we’re worried about what that actually means,” said Siada El Ramly, director general of Dot Europe, a tech lobby representing tech firms like Apple, Meta Platforms, Google and TikTok.
El Ramly and her members are keen to see how the Commission plans to ensure the rules work with a prohibition on general monitoring, a practice — deemed illegal by the Court of Justice of the European Union — where tech companies would scan every single piece of user-generated content.
“The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented,” said Ella Jakubowska, policy adviser at European Digital Rights (EDRi), a network of 45 nongovernmental organizations.
Beyond the scanning of content, activists as well as tech companies are concerned that the EU executive could seek to create backdoors into end-to-end encrypted messaging services, in which secure conversations cannot be intercepted, a protection relied on by journalists, lawyers and NGOs, especially in authoritarian countries.
“Abusers hide behind the end-to-end encryption; it’s easy to use but nearly impossible to crack, making it difficult for law enforcement to investigate and prosecute crimes,” said Johansson at the April 25 event.
The commissioner said technical solutions existed to keep conversations safe while finding illegal content, something that many cybersecurity experts doubt given the state of technology.
“The EU shouldn’t be proposing things that are technologically impossible,” said Jakubowska.
The stage is now set for a tough battle between privacy advocates and law enforcement hawks. Reacting to the leak of the proposal, centrist Renew Europe MEP Moritz Körner said the Commission’s proposal would mean “the privacy of digital correspondence would be dead.”
Yet, for child protection organizations, the law is sorely needed.
“We’ve seen a 374 percent increase in self-generated indecent images of children in the last two years alone,” said Michael Tunks, senior policy manager at the Internet Watch Foundation, a British nonprofit helping victims take down pictures and videos of abuse.
“Proactive searching and empowering hotlines [has] really got to be a part of this.”
This article has been updated.