Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they will just post from another instance, since we changed our registration policy.

We keep working on a solution; we have a few things in the works, but those won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it wasn’t his community, it would have been another one. And it is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.

  • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -29 · edited · 1 year ago

    Sorry to hear about your investment in lemmy. How much did you end up investing? It just sounds like you’re very unsatisfied with the value that lemmy has provided.

    Personally, I don’t pay for lemmy. Lemmy is free, as far as I understand it. Since it is free, I can’t really dictate the legal risk that the admins have to go through, as I do not have power over them, and because I treat them as humans.

    But yeah, I guess if you have a good reason, they really should be falling over backwards to moderate all the CSAM away from your favorite community. You are an all-powerful being.

    Edit: Sarcasm on the internet doesn’t work well, so let me be frank: admins aren’t obligated to risk jail for a user’s desire to post CSAM. Admins have a right to shut down a community that posts CSAM, or to remove CSAM or any material they find objectionable from their site. Admins take on the legal risk of the content on the site and OWE USERS NOTHING. Y’all can “the customer is always right” all you want, but if you aren’t the one paying, you aren’t the customer and you aren’t right.

    • wanderingmagus@lemm.ee · +19 / -1 · 1 year ago

      I feel like you didn’t actually read their comment before posting, !dipshit@lemmy.world

      It has nothing to do with Lemmyshitpost being their “favorite community” and they never mentioned “investing” or “value”. That’s all from you. Stop strawmanning their position. They were criticizing the ease with which entire communities can be taken down by single individuals. Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?

      • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +6 / -13 · edited · 1 year ago

        I read it. Let’s read it together again.

        I hope the devs take this seriously as an existential threat to the fediverse.

        The developers who build lemmy aren’t able to put in CSAM blocking code. That’s not how this works. I assume the commenter meant to say “admins” here, as developers write the code; they don’t admin the sites. If a developer has a lemmy instance they admin, then they are both a dev and an admin. Lemmy wasn’t built for CSAM sharing specifically; it is a site that allows sharing of CSAM as much as reddit or facebook do. The devs can’t do much about this. The admins and mods can.

        Lemmyshitpost was one of the largest communities on the network both in AUPH and subscribers.

        Neat. Irrelevant, but cool.

        If taking the community down is the only option here, that’s extremely insufficient and bodes death for the platform at the hands of uncontrolled spam.

        This I take issue with, and is what I mostly responded to.

        “If taking the community down is the only option here” well no, it’s not. We could just get 100’s of mods to specifically address this one user’s posting of CSAM. Hey, anyone want to moderate the site? Oh right, and they’ll need to be vetted, and they’ll need to keep doing this on the side for free as volunteers since lemmy is volunteer run…

        “that’s extremely insufficient” hard disagree. A community is liable for the content on it. If we put a CSAM post up on a site and leave it around for a few minutes, that’s one thing. If it’s left up for days and weeks, that’s quite another problem entirely. The minute that an admin or mod saw CSAM material, they did the right thing by shutting that down. Even if it means downtime for users. Oh no! Users can’t read lemmyshitpost and now the world is ending.

        “and bodes death for the platform at the hands of uncontrolled spam.” Welcome to the internet, where all platforms are at risk of uncontrolled spam. At first it was just email, but then it was bulletin boards, and then message boards, and then forums, and then community-moderated forums like reddit and lemmy. This has been and will continue to be a problem. This isn’t a new concern for lemmy devs or admins or mods; they are all aware that this can happen, which is why they do what they do. Turning off the community is a viable option, and it is what has happened at larger companies too while they cleaned up the mess.

        Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?

        I’ve been very consistent in my arguments. Show me the contradiction and I’ll address it.

        TL;DR: users cannot expect to be allowed to post CSAM material on lemmy instances. Allowing CSAM material to stay up on lemmy instances constitutes a legal risk for the admins, and thus we cannot leave it up. Blocking a community (even if it’s like the bestest and most favorited and most subscribed and everyone loves it and wow just super-duper community) is a viable means of blocking all CSAM on that community while it is cleaned up. To suggest that the community should have stayed online is asinine. To suggest that the admins should not have blocked a community to combat CSAM is asinine. Trust admins to do their jobs.

        • wanderingmagus@lemm.ee · +11 · 1 year ago

          They aren’t asking devs to be admins or for admins to be devs. They specifically called out the developers because code exists to filter child sexual abuse material, disseminated by organizations such as the FBI and law enforcement, which can be implemented for image uploading.

          NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.

          You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.
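
          For illustration only, here is a minimal sketch of the kind of upload-time hash check being described, with hypothetical names; real deployments, such as the PhotoDNA tool mentioned further down this thread, match perceptual hashes supplied through vetted hash-sharing programs rather than a local SHA-256 set:

          ```python
          import hashlib

          # Hypothetical stand-in for a hash list supplied by a vetted hash-sharing
          # program. Real systems use perceptual hashes so near-duplicates also match;
          # a plain SHA-256 set is shown only to illustrate the control flow.
          KNOWN_CSAM_HASHES: set[str] = set()  # populated from a vendor feed in practice

          def vet_upload(image_bytes: bytes) -> bool:
              """Return True if the upload may proceed, False if it must be blocked."""
              digest = hashlib.sha256(image_bytes).hexdigest()
              if digest in KNOWN_CSAM_HASHES:
                  # Block the post, preserve evidence, and report it instead of serving it.
                  return False
              return True

          # An upload whose hash is not on the list passes through.
          assert vet_upload(b"harmless test bytes") is True
          ```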

          Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.

          • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -12 · edited · 1 year ago

            They aren’t asking devs to be admins or for admins to be devs. They specifically called out the developers because code exists to filter child sexual abuse material, disseminated by organizations such as the FBI and law enforcement, which can be implemented for image uploading.

            Yeah? I doubt this is true, but I could be wrong. You make it sound like preventing CSAM is as simple as importing a library, something I find dubious. Companies have been trying to filter out this material in an automated fashion for decades, and yet they still have to employ humans to do it manually because automated means don’t really work. This is why companies like Reddit and Facebook have trust and safety teams to do this work.

            Edit: I googled and could not find this database. I’m thinking it’s a myth.

            NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.

            Ahem, there were users who uploaded CSAM. Those are the users who were advocating for uploading CSAM, because they uploaded CSAM.

            I’m literally arguing with people who are saying that they shouldn’t have shut down the community because it’s big and that shutting down the community (not CSAM) poses a threat to the fediverse. Maybe, but CSAM poses a legal threat, which is much greater than the threat of low engagement.

            You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.

            Yeah, that doesn’t exist, as I’ve mentioned previously. You make it sound like getting CSAM off lemmy is as simple as writing some code - if it were, why don’t facebook and reddit do this?

            Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.

            You’re not understanding how CSAM detection works or is handled.

            The grim reality is this: cameras exist, children exist, adults exist, the internet exists, and the second that a crime is committed, it is not added to an FBI database. If such an FBI database existed, and IF it was useful (and not just a database of hashes for bit-perfect copies of CSAM), and IF it were updated when evidence of the crime surfaces… IF all of those things are true, THEN it means there’s likely still a huge swath of CSAM material out there that could be posted at any time, and that would NOT be detected.

            Again, ask yourself: IF such a database existed, then WHY doesn’t reddit, twitter, facebook, hell, every or any site use it?

            Pedophiles, instead of downvoting me, why not explain yourself?

            • wanderingmagus@lemm.ee · +6 · 1 year ago

              As another commenter posted below:

              But tools do exist. PhotoDNA by Microsoft. There is a much more user-friendly implementation if you use Cloudflare; related links:

              As far as I am aware, every major site does use it in addition to manual vetting for any flagged “borderline” or “uncertain” results caught up in the filter.

              • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +1 / -8 · 1 year ago

                As far as I am aware, every major site does use it in addition to manual vetting for any flagged “borderline” or “uncertain” results caught up in the filter.

                I think this is where you could be wrong here. I appreciate the links, I’ll look into those in more detail. My best understanding is that these tools generate so many false-positives and false-negatives that it’s not worth using them. It may be a first line of defense until real humans get to see them, but my point is that humans are still needed. When humans are included because the system isn’t 100%, it means humans do the labor and as such, with limited time, humans need to determine when they can do the labor - sometimes shutting down a community is the best way to stop the flood while they clean up the mess.
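
                As a sketch of that “first line of defense until real humans get to see them” idea, assuming a hypothetical match score from an automated scanner and made-up thresholds:

                ```python
                from dataclasses import dataclass, field

                @dataclass
                class ReviewQueue:
                    """Posts held back for a human moderator; illustrative only."""
                    pending: list[str] = field(default_factory=list)

                def triage(post_id: str, match_score: float, queue: ReviewQueue) -> str:
                    """Hypothetical thresholds: high scores block outright, borderline
                    scores wait for a human, and everything else publishes."""
                    if match_score >= 0.9:
                        return "blocked"               # automated first line of defense
                    if match_score >= 0.5:
                        queue.pending.append(post_id)  # a human makes the final call
                        return "held_for_review"
                    return "published"

                queue = ReviewQueue()
                print(triage("post-123", 0.97, queue))  # blocked
                print(triage("post-124", 0.60, queue))  # held_for_review
                print(triage("post-125", 0.10, queue))  # published
                ```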

                • Richard@lemmy.world · +3 · 1 year ago

                  This is just a matter of confirmation bias from your side now. You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question. And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, while somehow failing to provide any data or secondary sources to back up your claims.

        • utopianfiat@lemmy.world · +7 · 1 year ago

          The developers who build lemmy aren’t able to put in CSAM blocking code. That’s not how this works.

          They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to dealing with CSAM. You have no clue what you’re talking about.

          Oh no! Users can’t read lemmyshitpost and now the world is ending.

          Replace this with !technology@lemmy.world, or !selfhosted@lemmy.world, or !announcements@lemmy.world. “Oh no, users can’t read the entire site” yes that is the definition of the end of the site.

          You’re not seeing that this isn’t a lemmyshitpost issue, it’s an “any popular community on lemmy” issue. Snarkily taking potshots at lemmyshitpost as a community doesn’t change it.

          Turning off the community is a viable option

          It’s not “not an option”; it’s the last resort. It’s like saying that your only option when you see a roach in your apartment is to burn the whole building down. Because doing it means you don’t have a community anymore, and without communities the site has no purpose.

          • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -9 · 1 year ago

            They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to dealing with CSAM. You have no clue what you’re talking about.

            Cool, name them and give me links then. I could not find such on the internet. There is software that tries to detect this, but even youtube’s algorithm is incorrectly flagging fully clothed 30 year old women as children.

            Links or you’re talking out your ass.

            Replace this with !technology@lemmy.world, or !selfhosted@lemmy.world, or !announcements@lemmy.world. “Oh no, users can’t read the entire site” yes that is the definition of the end of the site.

            Replace a site with CSAM and you’ll find it’s not a site you’ll want to go to in the first place. READ the original post where it was mentioned this is a stop-gap measure.

            You’re not seeing that this isn’t a lemmyshitpost issue, it’s an “any popular community on lemmy” issue. Snarkily taking potshots at lemmyshitpost as a community doesn’t change it.

            You’re… offended that I had snark on the topic of lemmySHITPOST? surely, you are joking.

            My point is not that this community is shit and that’s why this happened.

            My point is that this is a community on a lemmy instance that was flooded with CSAM, and was shut down because of the flood of CSAM.

            It’s not “not an option”; it’s the last resort. It’s like saying that your only option when you see a roach in your apartment is to burn the whole building down. Because doing it means you don’t have a community anymore, and without communities the site has no purpose.

            You do see how turning a community off and then on again isn’t the same thing as burning down a house (and unburning it again)?

            You do realize that we’re talking about a literal crime against children vs your ability to see memes? Fuck off with your self-importance.

            • utopianfiat@lemmy.world · +7 · 1 year ago

              Replace a site with CSAM and you’ll find it’s not a site you’ll want to go to in the first place.

              Are you being intentionally dense or do you not understand that it’s my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.

              • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -9 · 1 year ago

                Are you being intentionally dense or do you not understand that it’s my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.

                Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit” “facebook” “9gag” “imgur” etc… No site has “sufficient mitigation measures in place” as CSAM continues to flood the internet.

                “Flooding” a site with CSAM is a matter of opinion. If one person posted one image of CSAM on my instance, that would be flooding - that’s one image too many. It’s not like there’s some magic threshold of the amount of CSAM allowed on a site. All sites use human moderators to detect CSAM, and all sites who do this have teams that are far too small and far too underpaid for the most part.

                Underpaid being the keyword here, as lemmy admins are volunteers. I would think that the threshold for “flooding” a lemmy instance with CSAM would be far lower than that of a major for-profit site.

                • cubedsteaks@lemmy.today · +2 · 1 year ago

                  Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit” “facebook” “9gag” “imgur” etc… No site has “sufficient mitigation measures in place” as CSAM continues to flood the internet.

                  It’s true. If I remember correctly, tumblr was removed from the App Store because of CSA issues. I could be remembering wrong and maybe it was the Google Play Store.

                  • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -3 · 1 year ago

                    This is likely going to be an issue for any site that hosts NSFW content. It’s easier to have mods just ban all NSFW content than to try to figure out whether the people in the content are consenting adults.

                    If the admins of nsfw instances aren’t already on high alert, they should be now.

            • cubedsteaks@lemmy.today · +2 · 1 year ago

              Cool, name them and give me links then. I could not find such on the internet. There is software that tries to detect this, but even youtube’s algorithm is incorrectly flagging fully clothed 30 year old women as children.

              have you ever talked to janitors and mods on 4chan? Good luck getting any info out of them.

              • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +1 / -4 · 1 year ago

                Do you realize that 4chan isn’t the full internet? That these programs that you already know of can exist outside of 4chan? I’m asking you - the person who knows of these apps - to provide links to back up your claims.

                • cubedsteaks@lemmy.today · +4 · 1 year ago

                  I’m not the other person you responded to, and I never claimed to have any apps or links.

                  I’m just telling you how this works on 4chan. I’m aware that’s not the entire internet, obviously - your sarcasm needs work considering we are both here on Lemmy, i.e., not 4chan.

                  If anyone on there is using these programs/apps/whatever, they’re not just gonna tell other people about them.

                  And as far as I know (I haven’t been on 4chan in like 3 years now), they region ban for CSAM.

                  • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +2 / -3 · 1 year ago

                    I don’t really keep track of who I’m responding to. I just respond to comments. Sorry if I got mixed up here.

                    My comments still stand, though the snark isn’t directed towards you specifically.

                    I just want people to understand that there isn’t some magic solution that exists in other places but not here. The solution is largely people. People who get PTSD from viewing and moderating these images. It’s not a good solution, but it’s the best solution we have so far.

                    The other truth of the matter is that if the Internet itself were to hypothetically shut down, this content would just be distributed via other means. The one nice thing about the internet is that lots of stuff is traceable back to the person who posted the infringing material.

      • cubedsteaks@lemmy.today · +2 · 1 year ago

        I’m a victim of CSAM and my dad exploited me for several websites.

        I get being upset about this. But it’s not the end of the world for a site. Lemmy is still totally fine and I have been using it without seeing any CSAM and the only knowledge I even have of this is from posts like OP’s.

        Like this isn’t a good time to be just down on the site and pessimistic.

        • Cosmic Cleric@lemmy.world · +3 / -1 · 1 year ago

          If you’re a pedophile and disagree with me - instead of downvoting, why not explain yourself?

          People have been, but you’re not truly listening, Internet Warrior.

        • cubedsteaks@lemmy.today · +2 · 1 year ago

          I agree with a lot of what you said and upvoted you but you really need to just stop calling people pedos for disagreeing with you.

          I’m a victim of CSAM myself and you can take a look through my comment history where I talked about it in depth more. I hate pedos just as much as you do but going around calling people pedos isn’t going to do anything but upset people.

          • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -3 · 1 year ago

            I’m taking the radical stance that CSAM isn’t a good thing, should be reported to law enforcement and that the site with CSAM can be shut down as a viable option for handling CSAM material.

            I’m getting downvotes from people who disagree with me on this “radical” stance. People who disagree that CSAM is a problem, that CSAM is a concern. I don’t have a lot of sympathy for people who promote CSAM like the people who downvoted my posts. I don’t care about the loss of internet points, I care that these worthless shits are still on lemmy, so yes, I call them what they are.

            • cubedsteaks@lemmy.today · +2 · 1 year ago

              I mean, I think people are downvoting you for other reasons.

              Obviously I agree with you that CSAM is bad. It happened to me and ruined my fucking life for like all of my teen years and then most of my early 20s.

              But calling people names is pointless. Especially when it comes off like a baseless accusation.

                • cubedsteaks@lemmy.today · +3 · 1 year ago

                  Yeah I get that for sure. I mean, if I knew someone was some kind of MAP idiot who was trying to fight for the rights of pedos, I’d call them names too. Idiot seems fitting for that lol

            • newIdentity@sh.itjust.works · +1 · 1 year ago

              You’re completely misinterpreting everything we said. If we shut down every site with CSAM, the internet wouldn’t exist. We don’t disagree that CSAM is a problem. We disagree with your solution.

              • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -2 · 1 year ago

                You’re completely misinterpreting everything we said.

                Not at all. I am completely understanding you.

                If we shut down every site with CSAM, the internet wouldn’t exist.

                You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

                We don’t disagree that CSAM is a problem. We disagree with your solution.

                My solution which is to remove CSAM? My solution to turn off communities while the CSAM issue is cleaned up? What about those solutions do you disagree with?

                Another question for you: if your house is flooding due to a burst pipe, what do you do first:

                a) get all the water out of the house
                b) turn off the water coming into the house

                My solution would be to do step B followed by step A. Your solution appears to be to just do step A, which means you’ll constantly be flooded and never have enough manpower to dry your house.

                I’d bet money that the following will happen:

                1. community gets turned off
                2. csam gets deleted, posters are identified, information turned over to law enforcement
                3. community gets turned back on.

                In the meantime, folks missing the community are free to go elsewhere on the internet. Why? Because CSAM is a crime which depicts sexual assault, and the evidence is posted online. It’s not a matter of just deleting content; it’s also a matter of turning the people posting that content over to the police so they can be held accountable for their crimes.

                • newIdentity@sh.itjust.works · +1 · 1 year ago

                  You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

                  Sorry let me word this correctly: social media wouldn’t exist.

                  My solution to turn off communities while the CSAM issue is cleaned up? What about those solutions do you disagree with?

                  No, your solution is to permanently shut down Lemmy since there is the possibility of CSAM being on one instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM and the mods can’t do anything about it except shutting down the instance/community. Unless there are better tools to moderate. That’s basically what everyone wants. We want better tools and more automation so the job gets easier. It’s better to have a picture that was wrongly flagged as CSAM removed than to not remove one that is CSAM.

                  The problem is that it won’t stop and that it will happen again.

                  I’d bet money that the following will happen:

                  1. community gets turned off
                  2. csam gets deleted, posters are identified, information turned over to law enforcement
                  3. community gets turned back on.

                  You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit. So the spamming won’t stop (unless someone other than LE does something against it). We can’t only rely on LE to do their job. We need better moderation tools.

                  Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

                  It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

                  • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +2 / -1 · 1 year ago

                    Sorry let me word this correctly: social media wouldn’t exist.

                    And this is hardly the argument you think it is. Again, not true of all social media sites, but let’s strongman your argument for a moment and say that you are referring to only the major social media sites.

                    Well then, we have a problem, don’t we? What’s something the major social media sites have that lemmy doesn’t? Ad revenue, to the tune of millions of dollars. What do they do with that revenue? Well, some of it goes to pay real humans whose entire job is simply seeking out and destroying CSAM content on the site.

                    So then how does lemmy, with only enough money to pay hosting costs, if that… deal with CSAM when a user wants to create a botnet that posts CSAM to lemmy instances all day? My answer is: the admins do whatever they think is necessary, including turning off the community for a bit. They have my full support in this.

                    No, your solution is to permanently shut down Lemmy since there is the possibility of CSAM being on one instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM and the mods can’t do anything about it except shutting down the instance/community. Unless there are better tools to moderate. That’s basically what everyone wants. We want better tools and more automation so the job gets easier. It’s better to have a picture that was wrongly flagged as CSAM removed than to not remove one that is CSAM.

                    You’re strawmanning my argument. I’ve never said forever. I’ve said while the community gets cleaned up. I’ve even described a timeline below.

                    The better tools you want for moderation are your own eyeballs. I’ve said this before, but there have been many attempts at building automated CSAM detection tools, and they just don’t work as well as needed, requiring humans to intervene. These humans are paid by major social media networks, but not by volunteer networks.

                    The problem is that it won’t stop and will happen again.

                    Yes, this is the internet! No one has a solution to stop CSAM from happening. We aren’t discussing that. We are discussing how to handle it WHEN it happens.

                    You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit. So the spamming won’t stop (unless someone other than LE does something against it). We can’t only rely on LE to do their job. We need better moderation tools.

                    No, I’m correct about step 2, which I described as: “csam gets deleted, posters are identified, information turned over to law enforcement”

                    I’ll break it down further:

                    1. CSAM gets deleted from the instance. Admins and mods can do this, and they do this already.
                    2. posters are identified. Admins and mods can do this, and might do this already. TO BE CLEAR, they can identify the users by IP address and user agent, that’s about it. The rest of it… is…
                    3. “information turned over to law enforcement” … left up to law enforcement. “Hello police, I’m the owner of xyz.com and today a user at 23.43.23.22 posted CSAM on my site at this time. The user has been banned and we have given you all the information we have on them.” The cops can get a warrant for the ISP and go from there. (A sketch of such a report is below.)
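
                    A minimal sketch of the kind of evidence bundle described in step 3, with hypothetical field names rather than any standard reporting format:

                    ```python
                    import json
                    from datetime import datetime, timezone

                    def build_report(instance: str, post_id: str, ip: str, user_agent: str) -> str:
                        """Assemble the basic details an admin might hand to law enforcement."""
                        return json.dumps({
                            "instance": instance,
                            "post_id": post_id,
                            "uploader_ip": ip,
                            "uploader_user_agent": user_agent,
                            "observed_at": datetime.now(timezone.utc).isoformat(),
                            "action_taken": "content removed, account banned",
                        }, indent=2)

                    # Example using the IP from the scenario above; the post id is made up.
                    print(build_report("xyz.com", "post-987", "23.43.23.22", "Mozilla/5.0"))
                    ```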

                    Oh yeah, TOR. Well, we’re getting deep off topic here, but go on youtube and see some defcon talks about how TOR users are identified. You may think you’re slick going on TOR, but then you open up facebook or check your gmail and it’s all over.

                    Either way, I’m not speaking to the success of catching CSAM posters; I’m only speaking to what the admins are likely already doing.

                    Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

                    Nothing, which is why social media sites dedicate teams of mods to handle this exact thing. It’s a cat and mouse game. But not playing the game and not trying to remove this content means the admins face legal trouble.

                    It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

                    This makes no sense to me. What was your point? Yes, one image is easier to delete than thousands of images. I don’t see how that plays into any of what we have been discussing though.

    • Cosmic Cleric@lemmy.world · +5 · 1 year ago

      Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

      They have very low to zero legal risk, as long as they’re doing their job.

      IANAL, but I can read laws.

      • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -5 · 1 year ago

        Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

        Correct, emphasis mine. As long as they take action when it happens being the key phrase here.

        IANAL, but from what I understand, doing something to take action (removing content, disabling communities, banning users, all of the above) shows that they are working to remove the content. This is why, in previous conversations with people about the topic of piracy, I mentioned DMCA takedown notices and how the companies I’ve worked at treated those with extreme importance (sometimes the higher-ups would walk over to the devs and make sure the content was deleted).

        I’m annoyed at people in this thread who believe that the admins did the wrong thing, because turning off communities could cause users to go to another instance - who cares, this is bigger than site engagement. I’m annoyed at people who think that the devs had access to code which could prevent this issue but chose not to implement that code - this is a larger and much more difficult problem that can’t just be coded away, it usually involves humans to verify the code is working and correct false-positives and false-negatives.

        • Cosmic Cleric@lemmy.world · +4 · 1 year ago

          You misunderstood what I meant by the part that you highlighted of my comment.

          I’m speaking of Safe Harbor provisions, not having to take active DMCA actions. They’re two very different things.

          • Yes, and believe it or not, I’ve been discussing both with people.

            I use DMCA actions as an example because they are easily understood. People get copyright strikes. People pirate music.

            Safe Harbor provisions are not as easily understood, but basically amount to (IANAL) “if the administrator removes the offending content in a reasonable amount of time when they learn about the offending content, then we’re all good”. It’s not a safe haven for illicit content, it’s more of a “well, you didn’t know so we can’t really fault you for it” sort of deal. But when admins know about the content, they need to take action.

      • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · +3 / -9 · 1 year ago

        You’re right, it is. You may be the sole person donating, and maybe you of all people have the “right” to have your opinion “respected” for donating. My point is that, by and large, the CSAM posters and most people who use this site aren’t directly paying for a service, so there is no contract obligating anyone to let them take part in the site or service, let alone to post CSAM.