Key negotiators in the European Parliament have announced a breakthrough in talks to set MEPs’ position on a controversial legislative proposal aimed at regulating how platforms should respond to child sexual abuse risks.
The European Union’s executive body, the Commission, presented a proposal for a regulation in this area last year but the plan has generated major controversy — with warnings the planned legislation could see platforms served detection orders that mandate the scanning of all users’ private messages. The draft proposal includes a requirement for platforms served with detection orders to scan for known and unknown child sexual abuse material (CSAM) and also try to pick up grooming activity taking place in real time.
Legal experts have warned such blanket, untargeted scanning risks breaching the EU’s prohibition on general online monitoring. Civil society organisations, privacy and security experts and others, meanwhile, have denounced the push for non-targeted scanning of private messages as a dangerous tipping point for fundamental rights in a democratic society, one which could leave everyone — kids included — more, not less, vulnerable, as a seminar organized by the European Data Protection Supervisor heard earlier this week.
The level of alarm over the Commission’s proposal appears to have galvanized MEPs to try to find an alternative way forward. Today key parliamentarians working on the file, the rapporteur and shadow rapporteurs, presented a rare united front at a joint press conference that spanned the different political groups.
The MEPs said they had reached agreement on a substantially revised version of the draft legislation.
Key changes parliamentarians have found agreement over — on the detection side — include putting a number of limits on scanning. Firstly, their proposal would limit scanning to individuals or groups who are suspected of child sexual abuse (making it targeted, not blanket); it would also limit scanning to known and unknown CSAM (removing the requirement to scan for grooming); and — importantly — it would limit scanning to platforms that are not end-to-end-encrypted (E2EE), thereby removing the risk the legislation could force E2EE platforms to backdoor or weaken their security.
Summarizing the approach, rapporteur Javier Zarzalejos told assembled journalists: “We have tried to take a comprehensive view in the fight against child sexual abuse online, which needs to develop a variety of strategies to succeed. There is no massive scanning or general monitoring of the web. There is no indiscriminate scanning of private communications, or backdoors to weaken encryption. There are no legal or technical shortcuts — but there is a positive and compelling duty to put in place legal tools to prevent and combat these heinous crimes.”
“Only if providers do not comply with the obligation set out in the regulation, and as a measure of last resort, the judicial authority — and only a judicial authority in view of the parliament — will be able to issue a detection order, which means that the provider will have to deploy certain technologies to detect known and new child sexual abuse material,” he added.
“Our aim in this regulation is to lay down uniform rules. All providers will have to assess the risk of misuse of the service for the dissemination of child sexual abuse material or for the solicitation of children and put in place measures to mitigate those risks when necessary to detect, report and remove such abuse.”
While the EU’s other co-legislator, the Council, has stuck mostly to the Commission’s original CSAM-scanning proposal — and has so far failed to set its negotiating position on the file — MEP Patrick Breyer, one of several shadow rapporteurs on the file, said parliamentarians took a different tack to steer out of contested waters.
“We decided to go for a new and consensual approach to this file by removing the contested and problematic points — such as bulk scanning of entire services [and] even for end-to-end encrypted services, or mandatory age verification for all communication services, or even excluding all children under 16 from commonplace apps — and instead, we added to the original proposal more effective and court proof and rights respecting measures to keep children safe online.”
One example of a new measure the parliament is proposing is for the EU Centre, a body which the regulation would establish to receive and check CSAM reports, to also be able to carry out searches on hosting service providers’ publicly accessible content — as similar child protection centres in the U.S. and Canada already do.
That kind of “proactive scanning” would, Breyer suggested, help clean up the Internet — without intruding into anyone’s private messages. He also pointed out it could be used on the Darknet — “so it’s more effective”.
On the prevention side, MEPs are pushing for safety by design requirements — meaning in-scope platforms would have to, for example, default profiles to being non-public and ask users before they receive messages or see images.
Another change parliamentarians have proposed that’s geared towards protecting victims from re-victimization (i.e. where CSAM depicting their abuse continues circulating and/or being re-shared), is to put a removal obligation on providers — so hosting services would have to take down CSAM, not just provide information to victims on request. MEPs also want law enforcement to have a role in ensuring CSAM material they’re aware of is removed from the Internet.
The parliamentarians detailed a raft of other changes — and, clearly, a lot of work has gone into rethinking how best to revise a sensitive file so that it centers the rights of victims (and potential victims) without riding roughshod over everyone’s rights to freedom of expression, privacy and security.
“Altogether I think the winners of this agreement are the children precisely because they need protecting so much because this crime is so insidious,” said Breyer. “They deserve an effective response and a rights respecting response that will uphold in court and the winners are all of us because our privacy of correspondence and security of our communications is guaranteed by our proposal.
“And so it’s good that we stand here together and have a message to Council and the Commission, indeed as colleagues have said, that a new consensus approach on this file is the only way forward — to move forward with this proposal.”
The next step will be for committee votes to take place on the amended file. After that, the parliament as a whole will need to vote to confirm its negotiating mandate. But given the show of political unity today, that step looks assured.
What follows after that is more uncertain. Attention — and pressure — will turn to the Council, the body made up of representatives of Member States’ governments, which has yet to achieve a common position on the proposal but must do so in order for ‘trilogue’ talks to open with MEPs.
Those talks are where the EU’s co-legislators collectively hash out a compromise that decides the final shape of the law. So everything is still up for grabs on this file.
During the press conference, Paul Tang, another of the shadow rapporteurs, also urged the Council to take inspiration from the show of unity among the parliament’s political groups — despite, as was pointed out during the press conference, talks having started with MEPs holding some very different positions and views on how best to proceed on this sensitive file.
“I would like to see that the Council, which gets stuck on the difficult discussion on the detection order, takes example from the European Parliament. We are here a united European Parliament and that’s virtually impossible — certainly in this fight — but this is a strong signal in my mind to the Council,” he said. “Hurry up. Look at our proposal. Consider it. This is the way forward.”
Tang also directed some political remarks at the Commission — urging it to get behind the parliament’s compromise, rather than entrenching on its original proposal and continuing to push for non-targeted surveillance measures that experts agree run counter to EU laws and fundamental rights.
“Now it’s time if you want to make the Internet a safer place for children to seriously consider the parliament’s proposal,” he said, warning time is short to clinch agreement. A temporary derogation that currently allows platforms to scan non-E2EE messages for CSAM expires next summer. Plus there are EU elections next year — and if the file isn’t finalized before a turnover of the EU’s college, there’s a risk of further delays and uncertainty.
“We know that there is a temporary derogation that still allows for some scanning and which is a source for police to find material that is helpful in their investigations. I’m afraid that once Facebook will encrypt, end-to-end, Facebook Messenger, the source will dry up and we need to have something in place where we can make the Internet safer for children. So please, Commission and Council. Hurry up — get on board,” Tang added.
Asked whether he’s confident the Council will switch its view and accept scanning which can only be targeted to individuals or groups where a judge agrees there’s objective evidence giving rise to a suspicion of involvement in child sexual abuse or CSAM — rather than, as the Commission proposal allows, untargeted scanning of all messages for any CSAM when a service is subject to a detection order — Tang also told TechCrunch: “We very much hope that the Council will listen very closely. The good thing is there are some Spanish MEPs, including the rapporteur of this file, that are willing to take action — given that we have the Spanish presidency [of the Council]. I still hope — though time is limited — that we can be at an effective negotiation and a result before the next elections.”
During the press conference, Cornelia Ernst, another of the shadow rapporteurs, sounded less hopeful. She pointed to the entrenched position the Commission has adopted in the face of major criticism as a negative sign for how things could go in the three-way talks with Member States, where the EU’s executive also plays an active role in facilitating negotiations.
“When it comes to Council, I’m not as optimistic,” she warned. “I would have preferred for us to be able to decide on this alone but the EU functions in a different way. We’re going to have a very, very hard fight with Council in trilogue. And the more we stand united, the better our chances of success of course.”
This report was updated to fix a typo: ‘quote proof’ has been corrected to ‘court proof’ in Breyer’s remarks. Our report also initially suggested a change the parliamentarians have proposed, putting a CSAM removal obligation on providers, would apply to messaging apps; in fact the intended target is hosting providers.