Poland opposes mandatory EU ‘chat monitoring’ law to combat child abuse

Maria Kamińska | Edited by: Piotr Kononczuk | TVP World, 17.11.2025, 17:03

Photo: Envato/Kira_Yan, PAP/Radek Pietruszka. Caption: Digital Affairs Minister Krzysztof Gawkowski said Poland would “never agree” to any mandatory scanning of private communications.

Poland will “never agree” to any EU legislation that would require instant messaging services such as Messenger, WhatsApp or Signal to monitor users for evidence of child abuse, the country’s digital affairs minister said.

Under a revised version of the EU Child Sexual Abuse Material (CSAM) regulation – approved for further work by an EU Council working group last week – instant messaging providers could voluntarily agree to scan users’ communications for child sexual abuse content.

This marks a watering down of an initial proposal – first presented in 2022 – which would have made message scanning mandatory for all platforms, including those offering end-to-end encryption designed to prevent unauthorized access to private communications.

The legislation has raised concerns over potential violations of privacy rights and has been repeatedly revised, so far failing to secure majority support among EU member states.

Commenting on the revised proposal, Poland’s Digital Affairs Minister Krzysztof Gawkowski told state news agency PAP on Monday that his country would “never agree to any mandatory scanning”, citing concerns over the privacy of communications.

“We are treating the search for a compromise on child protection as a priority and with great consideration,” he said.

“We want legislation that enables us to effectively combat paedophilia while at the same time ensuring the security of all citizens,” he added.

Gawkowski said the latest proposal – put forward by Denmark, which took over the rotating EU Council presidency from Poland in July – aligns with the approach Warsaw advocated during its own presidency of the Council in the first half of 2025.

He also said his team will “monitor the issue” as talks progress, adding that the Polish government’s position would depend on the final draft regulation.

Continue reading: https://tvpworld.com/90062380/poland-against-compulsory-eu-messaging-scans-to-fight-child-abuse

  • I_Has_A_Hat@lemmy.world · 21 points · 10 hours ago (edited)

    I mean this in the most delicate way possible…

    Do we even have a CSAM problem on the internet? Like, I’m sure it exists, and should (hopefully) go without saying that CSAM is 100% horrific and disgusting; but to me, it’s not exactly something that I see running rampant. It’s one of those things I feel like you would have to intentionally go looking for on the deepest, darkest corners of the Dark Web to find.

    I just keep seeing all this anti-privacy legislation come up in the name of stopping CSAM. And yea, again, 100% in agreement that CSAM is awful and those that create/share it deserve hell. But… Haven’t we already kind of taken care of it? I know the majority of these legislative arguments are BS anyways, and the people arguing don’t actually give a shit about kids online, but the arguments don’t even make sense anymore.

    As a parallel, take bigots who argue against LGBT rights. Their arguments are usually disingenuous and clearly pushing an alternative agenda; but even those lunatics have stopped trying to claim the reason for their hatred is because gays are all diseased and full of AIDS, because most people are well aware that, not only is AIDS not limited to gay people, but AIDS itself is far less of an issue than it was in decades past. Like AIDS is still a terrible disease, but we kind of have it under control at this point and it’s silly to talk about as if it’s still a crisis.

    Do we even have a CSAM crisis that warrants any of this? Even in the heads of those too dumb to realize these things aren’t about CSAM at all?

    • tomi000@lemmy.world · 1 point · 4 hours ago

      > It’s one of those things I feel like you would have to intentionally go looking for on the deepest, darkest corners of the Dark Web to find.

      Yes, but the problem isn’t the internet being flooded by CSAM out of nowhere. It’s people organizing and sharing through encrypted messaging. And yes, I’m sure it’s a real problem, but taking away people’s privacy will not fix it.

    • Hannes@feddit.org · 10 points · 9 hours ago (edited)

      At least in Germany, the police are already overwhelmed by the information they receive.

      Journalists have gotten more CSAM deleted from the internet than the police have – and only after the journalists reported it to the police, and the material was still online almost a year later…

      The judges in that area are also badly overworked, and cases take ages to be decided.

      This whole thing is too much of a crypto-fascist fantasy that would need an unlimited police force to even work.

      There’s never been clearer virtue signalling coupled with massive lobbying interests than this law…

      To add to that: almost all experts in the field will tell you they would rather see the money needed to implement this spent on prevention programs instead – and on educating kids about sexuality as early as possible, so they know what’s wrong and when they need to speak to another adult about it.

    • Cris@lemmy.world · 3 points · 10 hours ago

      My impression is that we absolutely have an active issue with CSAM 😅

      It’s tucked away in the corners, but I don’t think it’s hard to find if you go looking for it. And if it’s in an encrypted chatroom, you really don’t know the scope of the issue without something like a sting to get into the room – and then you have no way of identifying anyone.

      That being said, there are huge privacy and security tradeoffs to scanning everything, and I’m not at all about to argue that there aren’t extremely bad-faith reasons to push this kind of legislation.

      But I don’t think the issue it aims to address is in any way, shape or form fabricated.

    • Vincent@feddit.nl · 2 points · 9 hours ago (edited)

      The main problem with CSAM is not you encountering it, the problem is children being abused when producing it. Whether the material produced reaches a wide audience doesn’t impact their suffering. So what’s relevant is the scale at which it’s being produced, which I think is fairly independent of the likelihood of you encountering it without looking for it.

    • 100@fedia.io · 1 point · 10 hours ago

      Exactly. This wouldn’t change anything about investigating these crimes.

      What you need is law enforcement with enough resources to investigate the cases they already receive.

    • StraponStratos@lemmy.sdf.org · 1 point · 10 hours ago (edited)

      I’ve heard huge amounts of it are stored and traded on the open web – literally posted on Twitter and the like – but the networks are relatively closed, so outsiders don’t see these social groups.