Web blocks: EU Council wants to expand planned surveillance even further
There are influential voices in the EU Council of Ministers who want to significantly expand the planned regulation on online surveillance, advanced under the banner of fighting child sexual abuse. The member states are less concerned with the particularly controversial chat control than with the hitherto little-noticed clauses on web blocking. The EU countries are about to hand fresh ammunition to critics who warn that this instrument can easily be misused for sweeping censorship measures.
Search also for unknown material
According to a draft by the former Czech Council Presidency, Internet access providers would even be obliged to block hitherto unknown depictions of child sexual abuse. The responsible government representatives simply deleted the word “known” before “Child Sexual Abuse Material” (CSAM) in the Commission’s original proposal. Blocking orders would thus no longer be limited to material that hotlines and authorities have already reviewed and classified as illegal.
This change to Article 16 of the regulation would mean that access providers would have to monitor the content of all users’ internet traffic and rely on error-prone, algorithm-driven artificial intelligence (AI) technologies to detect unknown CSA material. “This type of general surveillance of Internet traffic is prohibited under EU law,” counters the civil rights organization European Digital Rights (EDRi) in a recently published analysis; there is no exception for blocking orders. In practice, the massive analysis of data packets that such blocking requires is technically impossible as soon as the communication is encrypted.
The paper by the Czech presidency, which handed over the baton to Sweden at the beginning of 2023, is primarily based on recommendations from the Council’s working party on law enforcement. This is surprising, since the Commission actually wants to harmonize internal-market rules with the dossier; in the area of internal security, it has only limited powers of legislative initiative. The document, initially classified as confidential, dates from September but was only made public later.
No proportionality test
In addition, the Council’s proposed amendments would remove essentially all the safeguards around blocking contained in the Commission’s draft. Corresponding orders could, for example, also be issued for content hosted within the EU. “Competent” authorities would be responsible for issuing them, not only judicial authorities or comparably independent institutions; police authorities could therefore issue such orders as well, as the no less contested regulation on terrorist content online already provides. The proportionality test originally required would also be dropped.
The barely recognizable safeguards and the lack of independent oversight amount to “a toxic combination that creates a high risk of excessive removal of legal content,” complains EDRi. At the same time, the changes could make the fight against the online dissemination of abusive material less effective, since the incentives to block would be stronger than those to remove the content at its source. In Germany, this would run counter to the principle of “deleting instead of blocking”. The Council still has to adopt its final position for the later negotiations with the EU Parliament. So far, the German federal government has not been able to agree on a common official line.
EDRi sees a major problem with the Commission’s proposal in its practical feasibility: “Blocking orders at URL level are technically impossible if HTTPS is used to access a website.” The full URL is encrypted end to end between the user’s browser and the web server, and HTTPS has become the de facto standard for Internet traffic. That leaves only blocks at the level of the Domain Name System (DNS blocks). Entire websites would then be blocked, exacerbating the overblocking problem and requiring a much more thorough proportionality test.
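The limitation EDRi describes can be illustrated with a minimal sketch (not from the article; the blocklist entries and domain names are hypothetical). Under HTTPS, an on-path observer such as an access provider only ever sees the hostname — via the DNS lookup or the TLS handshake — never the encrypted path, so a URL-level filter has nothing to match against:

```python
# Sketch: why HTTPS pushes blocking down to the domain level.
# All blocklist entries and domains here are hypothetical examples.
from urllib.parse import urlsplit

URL_BLOCKLIST = {"https://example.org/forum/thread-123"}  # needs the full URL
DNS_BLOCKLIST = {"example.org"}                           # needs hostname only

def visible_to_isp(url: str) -> str:
    """Return the only part of an HTTPS URL an on-path observer can see."""
    return urlsplit(url).hostname

def dns_block(url: str) -> bool:
    # A DNS block matches on the hostname alone, so every page on the
    # domain is affected, not just the single offending URL (overblocking).
    return visible_to_isp(url) in DNS_BLOCKLIST

def url_block(url: str) -> bool:
    # A URL-level block would need the full path, which is encrypted in
    # transit -- the provider can never evaluate this condition for HTTPS.
    return url in URL_BLOCKLIST

legal_page = "https://example.org/about"
print(dns_block(legal_page))  # True: the harmless page is blocked too
print(url_block(legal_page))  # False -- but the path is invisible anyway
```

The asymmetry is the point: the only signal available in practice (the hostname) is exactly the one that blocks an entire site at once.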
Allegations against the EU Commission
Researchers at TU Delft, meanwhile, have found that the Commission made some incorrect, or at least contradictory, statements in promoting the draft. Of six publicly made claims, three were incorrect, reports the “Euractiv” portal. For example, Interior Commissioner Ylva Johansson stated in an interview with the Dutch newspaper “Trouw” that, according to estimates by the Council of Europe, one in five children is a victim of sexual abuse on the Internet. According to the researchers, however, that figure covers both online and offline cases.
At the press conference presenting the initiative, the Commission also stated that the number of CSAM reports had increased by 6,000 percent over the past ten years. This contradicts a “fact sheet” published at the same time, which puts the increase at 4,200 percent. The EU executive considers the larger number correct, citing data from the US National Center for Missing and Exploited Children (NCMEC) for 2010 to 2020. Johansson also said that 90 percent of the world’s abuse reports are hosted on servers in the EU; in the information paper, the Commission cites a figure of “over 60 percent”. It had already become known in October that the EU body had adopted, without verification, the allegedly very high accuracy rates claimed for CSAM detection tools by Meta and Hollywood star Ashton Kutcher.
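To put the gap between the two official figures in perspective, percentage increases can be converted into plain multipliers (a small arithmetic sketch, independent of the underlying NCMEC report counts, which the article does not give):

```python
# Convert a stated percentage increase into a growth multiplier:
# an increase of p percent means the final value is (1 + p/100) times
# the baseline.
def increase_factor(percent_increase: float) -> float:
    return 1 + percent_increase / 100

print(increase_factor(6000))  # 61.0 -> reports 61x the 2010 baseline
print(increase_factor(4200))  # 43.0 -> reports 43x the 2010 baseline
```

The two claims thus differ by a factor of roughly 1.4 in implied growth, which is why the researchers treat them as contradictory rather than as rounding variants.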
(my)