• NotSteve_@piefed.ca · 20 hours ago

  Fuck reddit, but going after (moderated) websites for user-uploaded content isn’t a great path. In this case, though, maybe they could go after reddit if they could prove reddit intentionally allowed the content.

  • pelespirit@sh.itjust.works · 20 hours ago

    They’re responsible for what gets posted. They’re now a billion-dollar company and probably have teams of lawyers. They’re also letting the users whose free content they make money off of take the hit when it’s deemed illegal.

    • ranzispa@mander.xyz · 19 hours ago

      Whether they are responsible for what gets posted is exactly the point under discussion. Different countries and different people have different opinions on it.

      Generally, websites are not really in favor of this idea - and it’s not only big tech I’m talking about.

      My personal opinion is that I don’t really like this idea. Making websites responsible for what gets posted means they necessarily have to do some censoring. I’m not necessarily against censoring, but I don’t like the idea of a large private company deciding what to censor. I’d much rather have the government decide and impose the ban on companies.

      Moreover, forcing websites to censor things leads to a very centralised internet. The random guy setting up a forum cannot afford to patrol his website that closely, while big companies can indeed have teams of people doing just that.

      • JackbyDev@programming.dev · 17 hours ago

        My understanding is that this is partly why the DMCA works the way it does. If the site takes the content down, then it’s protected from claims made by the copyright holder. It’s a way of acknowledging “hey, you’re sort of responsible for what gets put up, but we know mistakes happen.” (Though, more often than not, it’s just used as a way to silence people. Not saying the DMCA is perfect or good.)

      • Ashtear@piefed.social · 16 hours ago

        I don’t think it’s necessarily true that moderation is so onerous that it must lead to centralization. I’m part of the staff on a medium-sized Discord server, and we have more than enough coverage to handle objectionable content. As another example, Fediverse instances here have proactively established rules and norms for NSFW content that ensure the communities keep running, and most are still going a few years after the Reddit exodus exploded their populations.

        It absolutely does make scaling up more expensive, but I’ve gone from a fairly libertarian stance on this to asserting that proper community moderation is now part of the social responsibility corps take on when they grow these spaces to massive reach. And yes, I don’t think big corps do enough on this topic, and it’s another inequity, because it’s really starting to look like new organizations are going to be held more responsible for what content they allow. But I’m all for coming down hard on the big corps. Everyone got by just fine in the 90s and 2000s, when platforms had much, much larger customer support and moderation staffs. “It’s too expensive” is the same garbage excuse used for other forms of enshittification today, while these platforms make money hand over fist.

        • Ashtear@piefed.social · 8 hours ago

          In terms of content moderation? Not in the least. Discord is extremely laissez-faire about it; they intervene when compelled to by law enforcement. I can’t speak to the large-server experience, but we’ve run our server for nearly six years and not a single member of staff has ever spoken with anyone at Discord. Every single one of the moderation tools we use is third party.

          As a practical matter, Discord provides the communication infrastructure for us and that’s literally it. Irrelevant to the topic at hand. We could pack up and move it all to Matrix tomorrow and our content moderation would be exactly as centralized as it is now (that is to say, not centralized at all).

          • The Octonaut@mander.xyz · 7 hours ago (edited)

            You don’t understand the concepts of liability or of server-side administration and monitoring, which are two very different areas to speak about so confidently and so incorrectly.

      • pelespirit@sh.itjust.works · 19 hours ago

        So you’re saying that if a mod posted child porn, the website should leave it up and let the chips fall where they may? That the website isn’t responsible if it stays up?

        • ranzispa@mander.xyz · 19 hours ago

          What I am saying is that it should not be up to a website company to decide whether something is legal or not. In every other business, that decision has always fallen to a judge. A newspaper is a related case: the editor bears some responsibility if he lets something clearly illegal slip through, but the responsibility falls on the journalist, not on the newspaper itself.

          Frankly, I do not want social media - currently the main source of information for many, likely most, people - to be entitled to decide what should be allowed and what should not. If someone uses such platforms to do something illegal, there are legal mechanisms to deal with that.