Giacomo Zandonini. Photo by Sanni Laine

By Lauren Beauchamp

In today’s digital world, billions of messages fly across the internet every second, sent by friends sharing updates, families staying in touch and professionals conducting business.

Yet hidden among these countless conversations are communications of a darker kind, linked to criminal activities, including the online distribution of child sexual abuse material, known in law-enforcement circles as CSAM. 

In May 2022, the European Commission responded to the scourge of CSAM with controversial draft legislation that would introduce mandatory “client-side scanning” of private communications for illegal content.

While few question the aim of protecting children, many saw the proposed legislation, championed by EU Commissioner Ylva Johansson, as taking a sledgehammer to privacy protections.

Three journalists set out to dig deeper, unmasking the webs of interests supporting the legislation. The result was The Data Trap, a cross-border investigation supported by IJ4EU’s Freelancer Support Scheme.

For Giacomo Zandonini, a Rome-based investigative journalist who worked on the investigation alongside Apostolis Fotiadis and Ludek Stavinoha, the story was not just about current events but part of a longer narrative of expanding surveillance powers within the European Union.

“We’ve been monitoring the expanding role of EU security agencies like Europol and Frontex (the EU border agency) for years, looking at how the European Commission was pushing to broaden their mandate, with a model that brings us closer to renowned US agencies,” Zandonini said, citing US Immigration and Customs Enforcement, the Federal Bureau of Investigation and the National Security Agency as examples.

So when the proposed EU regulation emerged, Zandonini and his colleagues began to scrutinise it.

“According to the Commission’s proposal, entire platforms could be obliged to scan their users’ messages and material, in search of CSAM,” he said.

“The available tools are based on artificial intelligence and, we found out, pose particular problems specifically when operating in a fully encrypted environment, meaning in a number of applications we use daily, as journalists to secure our communication with sources, as citizens to preserve privacy.”

‘Refusal to disclose documents’

Under the proposed EU regulation, social media and messaging services could be required to scan users’ personal messages for illegal content using automated detection software. But Zandonini and his team raised concerns about the accuracy of the technology and the potential invasion of privacy.

“Privacy experts kept repeating to us that once an infrastructure to identify CSAM was put in place, it could be easily redirected to search for other material,” Zandonini warned, noting the potential for a “surveillance nightmare” if such powers were misused.
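To make that repurposing concern concrete, the sketch below shows, in deliberately simplified form, how hash-based client-side scanning works in principle. Everything here is an illustrative assumption rather than a description of any specific product: real systems such as Microsoft’s PhotoDNA use perceptual hashes that survive image edits, and the Commission’s proposal also envisages AI classifiers for previously unknown material, whereas this toy version matches exact cryptographic hashes only.

```python
import hashlib

# Hypothetical blocklist, pushed to the device as opaque hashes.
# The device owner cannot tell what the hashes denote, so replacing
# them with hashes of other material (leaked files, political imagery)
# would retarget the same infrastructure with no visible change.
BLOCKLIST = {
    hashlib.sha256(b"example of flagged content").hexdigest(),
}

def scan_before_encryption(plaintext: bytes) -> bool:
    """Runs on the user's device, on the plaintext, BEFORE end-to-end
    encryption is applied -- which is why encryption alone no longer
    guarantees that only sender and recipient can inspect a message."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

if __name__ == "__main__":
    # A matching message would be flagged for reporting ...
    print(scan_before_encryption(b"example of flagged content"))   # True
    # ... while everything else passes through, at least until the
    # blocklist is extended.
    print(scan_before_encryption(b"an ordinary private message"))  # False
```

The design choice the privacy experts object to is visible in the sketch itself: the scanner is generic, and only the contents of the blocklist determine what it hunts for.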

The team’s persistence in seeking transparency was met with resistance.

“We were surprised by the continuous refusal of the Commission to disclose documents,” he said, adding that the team was compelled to lodge a complaint with the European Ombudsman.

That resistance was underscored by a significant development following the publication of the investigation.

“After our investigation, in October 2023, the European Parliament asked the Commission to share all documents related to communication with stakeholders that were mentioned in our articles,” he said. 

“We were able to review the batch of documents provided by the Commission and were very surprised to notice that the list of documents was extremely incomplete. We had obtained documents that were not shared with the parliament, days before a crucial vote on its compromise text of the regulation. Was this a deliberate attempt at reducing scrutiny over some problematic relations?”

The team is still awaiting clarity on the matter.

Drawing on hundreds of documents obtained through leaks and freedom-of-information requests, the team uncovered a web of commercial interests extending all the way to Silicon Valley, always operating under the guise of ethical action.

For example, Thorn, a US non-profit organisation that builds tools to defend children from sexual abuse, sells the very technology that would “scan and cleanse” the internet.

“Formally, there is no profit at the end of the year, no redistribution of gains or bonuses to managers,” Zandonini said. “But the more companies or institutions need to cleanse their platforms from content which could be illegal, the more Thorn attracts new donations.”

He pointed to the organisation’s sizable assets and contracts with law enforcement as evidence of the scale of the business opportunity.

Public concern

The investigation into the EU’s proposed regulation for scanning online content for child sexual abuse material has stirred a significant reaction, not just within the corridors of power but also among the wider public.

“The response from the office of Commissioner for Home Affairs, Ylva Johansson, has been particularly noteworthy, as she made efforts to downplay the investigation’s findings,” Zandonini wrote in a report about the investigation’s impact. Members of the European Parliament sought clarifications, inviting the investigative team to Parliament for a meaningful exchange with Johansson herself.

And it’s not just politicians who’ve been roused. Privacy groups, organisations representing abuse victims and individuals affected by such crimes have all reached out, underscoring deep concerns over the balance between security and privacy.

The reach of the investigation has even extended into the academic and technological spheres, with prominent experts like cryptographer Matthew Green and Meredith Whittaker, president of the Signal Foundation, lending their voices to the conversation.

Their involvement, alongside public seminars by authorities such as the European Data Protection Supervisor, demonstrates the investigation’s influence in shaping discussions on privacy and security.

For more on this IJ4EU-supported investigation, see The Data Trap.
