Regulation to Prevent and Combat Child Sexual Abuse

The Regulation to Prevent and Combat Child Sexual Abuse (Child Sexual Abuse Regulation, or CSAR) is a European Union regulation proposed by the European Commissioner for Home Affairs, Ylva Johansson, on 11 May 2022. The stated aim of the legislation is to prevent child sexual abuse online through a number of measures, including the establishment of a framework – known by its critics as Chat Control – that would make the detection and reporting of child sexual abuse material (CSAM) by digital platforms a legal requirement within the European Union.[1][2]

Proposed European Union regulation (text with EEA relevance)
Title: Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
Commission proposal: COM/2022/209 final[1]
Proposed: 11 May 2022

Background

The ePrivacy Directive is an EU directive concerning digital privacy. In 2021, the EU passed a temporary derogation from it – called Chat Control 1.0 by critics – which allowed email and communication providers to search messages for the presence of CSAM.[3][4] The scanning was not mandatory and did not affect end-to-end encrypted messages. The purpose of CSAR – called Chat Control 2.0 by critics – is to make such scanning mandatory for service providers and to bypass end-to-end encryption.[3]
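
The detection permitted under the 2021 derogation is commonly described as matching uploaded content against databases of hashes of already-known abuse material. The sketch below is a deliberately simplified, hypothetical illustration of that idea in Python; real deployments rely on perceptual hashing (which tolerates resizing and re-encoding) rather than exact cryptographic hashes, and the database and function names here are invented for illustration.

```python
# Simplified illustration only: exact-hash matching against a hypothetical
# database of known-material hashes. Production systems (e.g. perceptual
# hashing such as PhotoDNA) work differently and can match altered images.
import hashlib

KNOWN_MATERIAL_HASHES: set[str] = set()  # hypothetical hash database

def attachment_is_flagged(data: bytes) -> bool:
    """Return True if this attachment's hash matches a known entry."""
    return hashlib.sha256(data).hexdigest() in KNOWN_MATERIAL_HASHES
```

Because matching of this kind cannot be run on the ciphertext of end-to-end encrypted messages, critics widely read a mandatory detection obligation as requiring scanning on the user's device before encryption (client-side scanning).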

Support for the proposal

Supporters of the regulation include dozens of campaign groups,[5] activists and MEPs, along with departments within the European Commission and the European Parliament. Opponents include civil society organisations and privacy rights activists.[6]

The European Commission's Directorate-General for Migration and Home Affairs argues that voluntary actions by online service providers to detect online child sexual abuse are insufficient. They emphasize that some service providers are less involved in combating such abuse, leading to gaps where abuse can go undetected. Moreover, they highlight that companies can change their policies, making it challenging for authorities to prevent and combat child sexual abuse effectively. The EU currently relies on other countries, primarily the United States, to launch investigations into abuse occurring within the EU, resulting in delays and inefficiencies.[7]

Several bodies within the EU claim the establishment of a centralized organization, the EU Centre on Child Sexual Abuse, would create a single point of contact for receiving reports of child sexual abuse.[7][1] It is claimed this centralization would streamline the process by eliminating the need to send reports to multiple entities and would enable more efficient allocation of resources for investigation and response.[7]

Proponents also argue for the need to improve the transparency of the process of finding, reporting, and removing online child sexual abuse material. They claim that there is currently limited oversight of voluntary efforts in this regard. The EU Centre would collect data for transparency reports, provide clear information about the use of tools, and support audits of data and processes. It aims to prevent the unintended removal of legitimate content and address concerns about potential abuse or misuse of search tools.[7]

Another aspect highlighted by supporters is the necessity for improved cooperation between online service providers, civil society organizations, and public authorities. The EU Centre is envisioned as a facilitator, enhancing communication efficiency between service providers and EU countries. By minimizing the risk of data leaks, the Centre aims to ensure the secure exchange of sensitive information. This cooperation is crucial for sharing best practices, information, and research across different countries, thereby strengthening prevention efforts and victim support.[7]

Criticism of the proposal

Groups opposed to this proposal often highlight that it would impose mandatory chat control for all digital private communications, and as such commonly refer to the proposed legislation as "Chat Control".[8][9][10] Civil society organisations and activists have argued that the proposal is not compatible with fundamental rights, infringing on the right to privacy.[11][12] The proposal has also been criticised as technically infeasible: in Ireland, only 20.3% of the reports received by the Irish police turned out to be actual exploitation material, and of the 4,192 reports received in total, 471 (more than 10%) were false positives.[13]
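
As a worked illustration of the error-rate concern, the Irish figures above imply that roughly one in nine reports was a false positive; applied to the volume of messages exchanged across the EU, even a small per-message error rate would translate into a very large absolute number of wrongly flagged communications. The Python sketch below only restates that arithmetic; the daily message volume and per-message error rate are assumptions chosen for illustration, not figures from the proposal or the cited reports.

```python
# Arithmetic illustration of the false-positive concern; figures marked
# "assumed" are hypothetical, not taken from the proposal or cited reports.
total_reports = 4192        # reports received by Irish police (cited above)
false_positives = 471       # reports found to be false positives
print(f"false-positive share of reports: {false_positives / total_reports:.1%}")  # ~11.2%

assumed_daily_messages = 1_000_000_000   # assumed EU-wide daily message volume
assumed_error_rate = 0.001               # assumed 0.1% per-message error rate
flagged = assumed_daily_messages * assumed_error_rate
print(f"wrongly flagged messages per day under these assumptions: {flagged:,.0f}")  # 1,000,000
```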

The European Parliament commissioned an additional impact assessment on the proposed regulation, which was presented to the Committee on Civil Liberties, Justice and Home Affairs.[14] The Parliament's study heavily critiqued the Commission's proposal. According to the study, there are currently no technological solutions that can detect child sexual abuse material without producing a high error rate that would affect all messages, files and data on a given platform.[15] The study also concluded that the proposal would undermine end-to-end encryption and the security of digital communications. Lastly, it highlighted that the proposed regulation would make teenagers "feel uncomfortable when consensually shared images could be classified as CSAM".[15]

The Council of the European Union's Legal Service also criticised the impact of the Commission's proposal on the right to privacy. The Council's legal opinion emphasized that screening the interpersonal communications of all citizens affects the fundamental right to respect for private life as well as the right to the protection of personal data.[16] The Council's legal experts also referenced the jurisprudence of the EU Court of Justice, which has ruled against generalised data retention.[17]

The European Data Protection Supervisor (EDPS) together with the European Data Protection Board (EDPB) stated, in a joint opinion, that "the Proposal could become the basis for de facto generalized and indiscriminate scanning of the content of virtually all types of electronic communications", which could have chilling effects on sharing legal content.[18]

In March 2023, a revised version of the proposal was introduced, which Germany's Digital Affairs Committee noted drew strong opposition from several groups. The new scheme, referred to as "Chat Control 2.0", proposed to implement scanning of encrypted communications.[19] In April 2023, the European Parliament confirmed that it had received messages calling on MEPs to vote against the European Commission's chat control proposal.[20] Citizens expressed concerns that the new legislation would breach data protection and privacy rights.

EU Commissioner Ylva Johansson has also been heavily criticised over the process by which the proposal was drafted and promoted. A transnational investigation by European media outlets revealed the close involvement of foreign technology and law enforcement lobbyists in the preparation of the proposal.[21] This was also highlighted by digital rights organisations, which Johansson declined to meet on three occasions.[22] Johansson was further criticised for using micro-targeting techniques to promote the controversial draft proposal, a practice said to violate the EU's own data protection and privacy rules.[23]

Legislative process

On 14 November 2023, the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE) voted to remove indiscriminate chat control and to allow only targeted surveillance of specific individuals and groups suspected on reasonable grounds. Members of the European Parliament also voted in favour of protecting encrypted communications.[24]

In February 2024, the European Court of Human Rights ruled, in an unrelated case, that a requirement to weaken end-to-end encryption "cannot be regarded as necessary in a democratic society". This reinforced the European Parliament's decision to protect encrypted communications.[25]

In May 2024, Patrick Breyer reported that moves were again being made to restore indiscriminate message scanning to the legislation, under the name of "upload moderation".[26]

On 21 June 2024, it was reported that the vote on the legislation had been temporarily withdrawn by the Council of the European Union, in a move believed to be the result of pushback from critics of the proposal, including software vendors.[27][28]

References

  1. ^ a b "Press corner". European Commission - European Commission. Archived from the original on 2021-04-27. Retrieved 2023-07-12.
  2. ^ "EU plans new law to combat child abuse – DW – 01/09/2022". dw.com. Archived from the original on 2023-02-04. Retrieved 2023-02-04.
  3. ^ a b Claburn, Thomas (3 Mar 2023). "German Digital Affairs Committee hearing heaps scorn on Chat Control". The Register.
  4. ^ "Regulation - 2021/1232 - EN - EUR-Lex". eur-lex.europa.eu. Retrieved 2024-09-13.
  5. ^ "Letter from ECLAG to MPs" (PDF). Archived (PDF) from the original on 2024-04-11. Retrieved 2023-07-12.
  6. ^ "Defend encryption! Open letter to the EU urging them to protect your privacy". Retrieved 2024-06-17.
  7. ^ a b c d e "EU centre to prevent and combat child sexual abuse". home-affairs.ec.europa.eu. Archived from the original on 2023-07-12. Retrieved 2023-07-12.
  8. ^ Kabelka, Laura (2022-06-27). "Bundestag quarrels over retaining IP data to fight child abuse". www.euractiv.com. Archived from the original on 2023-02-04. Retrieved 2023-02-04; Kabelka, Laura (2022-10-11). "MEPs sceptical on EU proposal to fight online child sexual abuse". www.euractiv.com. Archived from the original on 2023-02-04. Retrieved 2023-02-04; Pollet, Mathieu (2022-05-10). "LEAK: Commission to force scanning of communications to combat child pornography". www.euractiv.com. Archived from the original on 2023-02-04. Retrieved 2023-02-04; "Report slams German opposition to new child sexual abuse rules". EUobserver. 2022-07-05. Archived from the original on 2023-02-04. Retrieved 2023-02-04; "Chat Control: The EU's CSEM scanner proposal". Patrick Breyer. Archived from the original on 2023-03-10. Retrieved 2023-03-10.
  9. ^ "A beginner's guide to EU rules on scanning private communications: Part 2". European Digital Rights (EDRi). Archived from the original on 2023-03-10. Retrieved 2023-03-10.Vincent, James (2022-05-11). "New EU rules would require chat apps to scan private messages for child abuse". The Verge. Archived from the original on 2023-03-10. Retrieved 2023-03-10.
  10. ^ Mullin, Joe (2022-10-19). "EU Lawmakers Must Reject This Proposal To Scan Private Chats". Electronic Frontier Foundation. Archived from the original on 2023-03-15. Retrieved 2023-03-10.
  11. ^ "European Commission must uphold privacy, security and free expression by withdrawing new law, say civil society". European Digital Rights (EDRi). Archived from the original on 2023-04-07. Retrieved 2023-04-07.
  12. ^ "Chat control: incompatible with fundamental rights". GFF – Gesellschaft für Freiheitsrechte e.V. Retrieved 2024-06-18.
  13. ^ Cronin, Olga (2022-10-19). "An Garda Síochána unlawfully retains files on innocent people who it has already cleared of producing or sharing of child sex abuse material". Irish Council for Civil Liberties. Archived from the original on 2023-03-14. Retrieved 2023-04-07.
  14. ^ "EU Parliament study slams online child abuse material proposal". www.euractiv.com. 2023-04-13. Archived from the original on 2023-06-02. Retrieved 2023-06-02.
  15. ^ a b "Proposal for a regulation laying down the rules to prevent and combat child sexual abuse: Complementary impact assessment" (PDF). European Parliamentary Research Service. April 2023. Archived (PDF) from the original on 2023-06-03. Retrieved 2023-06-02.
  16. ^ "EU Council's legal opinion gives slap to anti-child sex abuse law". www.euractiv.com. 2023-05-09. Archived from the original on 2023-06-01. Retrieved 2023-06-02.
  17. ^ "Opinion of the Legal Service: Proposal for a Regulation laying down rules to prevent and combat child sexual abuse" (PDF). Council of the European Union. Archived (PDF) from the original on 2023-06-01. Retrieved 2023-06-02.
  18. ^ "EDPB-EDPS Joint Opinion 4/2022 on the Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse" (PDF). European Data Protection Supervisor. 2022-07-28. p. 20. Archived (PDF) from the original on 2023-09-21. Retrieved 2023-10-16.
  19. ^ Claburn, Thomas (2023-03-03). "German Digital Affairs Committee hearing heaps scorn on Chat Control". The Register. Archived from the original on 2024-03-07. Retrieved 2024-04-10.
  20. ^ "Citizens' enquiries on the EU's proposal to address child sexual abuse online". Epthinktank. 2023-04-04. Archived from the original on 2023-04-06. Retrieved 2023-04-07.
  21. ^ "'Who Benefits?' Inside the EU's Fight over Scanning for Child Sex Content". Archived from the original on 2023-11-17. Retrieved 2023-11-18.
  22. ^ "Commissioner Johansson cannot be trusted with the EU's proposed CSA Regulation". European Digital Rights (EDRi). Archived from the original on 2023-11-18. Retrieved 2023-11-18.
  23. ^ Tar, Julia (2023-10-16). "EU Commission's microtargeting to promote law on child abuse under scrutiny". www.euractiv.com. Archived from the original on 2023-11-18. Retrieved 2023-11-18.
  24. ^ "Detect child abusers without mass scanning". www.eppgroup.eu. Archived from the original on 2023-11-18. Retrieved 2023-11-18.
  25. ^ Claburn, Thomas (15 February 2024). "European Court of Human Rights declares backdoored encryption is illegal". The Register. Archived from the original on 18 February 2024. Retrieved 18 February 2024.
  26. ^ "Majority for chat control possible – Users who refuse scanning to be prevented from sharing photos and links". Patrick Breyer. 2024-05-31. Retrieved 2024-06-02.
  27. ^ Ivanovs, Alex (2024-06-20). "EU Council has withdrawn the vote on Chat Control". Stack Diary. Retrieved 2024-06-21.
  28. ^ "EU 'chat-control' plan goes back to drawing board". Brussels Signal. 2024-06-20. Retrieved 2024-06-21.