• @makeasnek@lemmy.ml
    63 · 1 year ago

    Not sure why nobody in the comments is distinguishing between blocking a community on an instance (removing /c/piracy) and defederating instances (saying your users can’t subscribe to otherinstance.com/c/piracy). They are very different things. We should be very skeptical of defederation.

    Removing a community because it violates the rules of your instance is A-OK and every instance should do this. Anybody can run an instance, and anybody can set their own rules, that’s the whole idea of federation.

    De-federating other instances because you find their content objectionable is less ok. Lemmy is like e-mail. Everybody registers at gmail or office365 or myfavoriteemail.com. Every email host runs their own servers, but they all talk to each other through an open protocol. You would be pissed to find out that gmail just suddenly decided to stop accepting mail from someothermailprovider.com because a bunch of their users are pirates or tankies. Or blocked your favourite email newsletter from reaching your inbox because it had inflammatory political content.
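    To make that distinction concrete, here is a minimal sketch of the two moderation actions being contrasted. The types and names are hypothetical, invented purely for illustration; they are not Lemmy's actual admin API.

    ```rust
    // Hypothetical sketch only -- illustrative types, not Lemmy's real API.
    #[derive(Debug)]
    enum AdminAction {
        // Remove a community hosted on *this* instance (e.g. /c/piracy).
        // Affects only content the instance itself hosts; federation is untouched.
        RemoveLocalCommunity { name: String },
        // Defederate: local users can no longer subscribe to or receive
        // anything from the named remote instance.
        DefederateInstance { domain: String },
    }

    fn main() {
        let per_community = AdminAction::RemoveLocalCommunity { name: "piracy".into() };
        let per_instance = AdminAction::DefederateInstance { domain: "otherinstance.com".into() };
        println!("{per_community:?} vs {per_instance:?}");
    }
    ```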

    Allowing your users to receive e-mail, or content from subcommunities on other Lemmy instances, is not a legal risk the way hosting the content yourself is (IANAL etc). Same way Gmail is not liable if somebody on some other e-mail server does something illegal by emailing a Gmail user. That’s why you can register at torrentwebsite.com and get a user confirmation email successfully delivered to your inbox. Gmail is federated with all other e-mail services without needing to endorse them or accept legal liability for them.

    Lemmy’s strength, value, and future comes from being the largest federated space for link-sharing and other forms of communication.

    De-federation is bad.

    • silent_water [she/her]
      35 · 1 year ago

      defederation is good for nazi and CSAM instances. no one should touch either with a 10ft pole. there’s absolutely no reason to give them a larger platform.

        • silent_water [she/her]
          11 · 1 year ago

          > “CSAM instances” <– Pretty sure any publicly facing instances with this problem would be tackled by law enforcement pretty quickly.

          as far as I’ve heard, they’re still up and major instances are still federated with them.

          > “Nazi instances” <– These ones will likely de-federate themselves from the wider federated web, they can’t handle a broad range of perspectives well.

          this is a deep misunderstanding of how far-right groups operate. they actively seek connection with the wider community because it presents them with a chance to recruit, and their numbers get decimated when they’re deplatformed. offering them a base of users to proselytize to only benefits them.

          > Social media has enabled these groups to both silo themselves and get promoted to users site-wide

          yes, precisely

          > This method of content promotion is responsible for the explosion of online hate content in the last decade

          this has a deeper material reason underlying it. it’s got more to do with economic decay and the lack of prospects people face than with the algorithms. we saw the same thing early last century. far-right ideology explodes in popularity when the left fails to make the case for a more equitable distribution of resources, and because our oligarchs fund them to an obscene degree – minor fascists with a hundred followers on social media will receive hundreds of thousands of dollars in funding (cf. Ali Alexander). fascist ideology spreads because it offers scapegoats for the problems in society.

          > Nazis had plenty of websites in the ’90s and early 2000s but they didn’t get much traction with them because Facebook wasn’t forcing them into your home feed

          yes, precisely. if normal instances federate with the nazi ones, this won’t be true any longer because their content WILL flood the feeds of many people. this will have disastrous consequences for lemmy as a platform.

          > I really don’t have a problem with these sites existing, people should be free to have their own disgusting racist thoughts and share them with their own little chat rooms and forums and the like.

          I do, as me and mine belong to groups they target. if they’re allowed to rise and accumulate any power, it will spell death for us. there have already been multiple attempts in the US to organize pogroms against trans people, as an example.

          > And they should be ruthlessly mocked and kicked out of every other space they could possibly go to.

          inshallah

          however, I’d like to point out that 4chan originally started making memes to mock the fascists – their use of irony turned over time into unironic fascism and they became a hotbed for neo-fascists.

          > Again, using the e-mail example, I can get an email from whitepowerwebsite as a gmail user. That’s not google giving them a platform, it’s just a neutral protocol for online communication (e-mail) working in a federated state as it’s meant to

          email is a bad example because it only provides point-to-point communication, unless you join a mailing list. social media is different – views get broadcast to the wider public on a given platform. federating with nazis allows them to broadcast their views and create a sense that their vision of the world is actually what everyone else believes. exploding-heads is federated with lemmy.world and the consequence is that many users have left lemmy.world specifically to get away from the fascists dumping their disgusting worldview onto the platform.

          > Gmail isn’t expected to police the entirety of e-mail, the legal liabilities lie with the sender and receiver.

          they actually do have liability under laws like the DMCA, SESTA/FOSTA, and the new slate of laws recently passed to go after sex traffickers (and in reality a whole host of “undesirable” content more generally). but that aside, I’m not talking about legal liability. I’m talking about the responsibility the people running these instances have to not help build fascism. it’s an ethical/political responsibility, not a legal one.

          • @makeasnek@lemmy.ml
            2 · 1 year ago

            You are right about worsening economic conditions leading to the rise of far-right movements. I was speaking more to their digital footprint. If you remember early Facebook, it was nothing like what people use today.

            > yes, precisely. if normal instances federate with the nazi ones, this won’t be true any longer because their content WILL flood the feeds of many people. this will have disastrous consequences for lemmy as a platform.

            If Lemmy A is federated with Lemmy B (the nazi one), it means:

            • Users on Lemmy A can subscribe to communities and users on Lemmy B and vice versa
            • Users on Lemmy A can comment on communities on Lemmy B and vice versa

            It does not mean:

            • Posts from Lemmy B show up on Lemmy A (except in the “global” view on the main page, which is non-default, and even there they likely won’t surface due to massive downvoting). I would imagine that, in time, the global tab gets removed entirely, since a single Lemmy instance can massively inflate its vote counts to push its posts to the top across the whole network. You can’t force instances to follow the rules on this, and you can’t audit their compliance. There are certainly some solutions to this involving blockchain, but that’s an aside and those are at least a few years away afaik. 90% of users never touch the “non-default” option in whatever app they’re in.

            So this flooding-the-feeds scenario, I just don’t see it. In user-moderated platforms, vocal minorities don’t show up anywhere; they get moderated out basically automatically, except in their own little enclaves. There is no scenario in which Lemmy as a federation provides a good platform for them (outside of their own nazi-friendly instance), because Lemmy doesn’t work the way other social media does.
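            A rough sketch of that feed behaviour, with made-up names (this is not Lemmy’s actual code): remote content only reaches a user who subscribes to the community or who opts into the non-default “All”/global view.

            ```rust
            // Hypothetical sketch only -- not Lemmy's implementation.
            struct Post {
                community: String, // e.g. "lemmyB.example/c/whatever"
            }

            enum Feed {
                Subscribed, // default view: only communities the user follows
                All,        // non-default "global" view: everything federated
            }

            fn visible(post: &Post, feed: &Feed, subscriptions: &[String]) -> bool {
                match feed {
                    Feed::Subscribed => subscriptions.contains(&post.community),
                    Feed::All => true, // still subject to downvotes and sorting in practice
                }
            }

            fn main() {
                let post = Post { community: "lemmyB.example/c/whatever".into() };
                let subs = vec!["lemmyA.example/c/linux".to_string()];
                // Never in the default feed unless the user subscribes:
                assert!(!visible(&post, &Feed::Subscribed, &subs));
                // Only reachable through the opt-in "All" view:
                assert!(visible(&post, &Feed::All, &subs));
                println!("sketch holds");
            }
            ```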

            • silent_water [she/her]
              8 · 1 year ago

              > In user-moderated platforms, vocal minorities don’t show up anywhere; they get moderated out basically automatically, except in their own little enclaves.

              I will take this to mean communists make up a soft majority on lemmy, given the number of complaints about commie posting that keep popping up on the major comms lenin-laugh

    • @jellyka@lemmy.ca
      19 · 1 year ago

      While I agree with you, I’d really love the possibility of blocking whole instances just for me. I don’t want my instance to defederate from much, but I’d like, for example, to block all the porn without having to find myself some christian lemmy instance to move to lol

    • @lukini@beehaw.org
      15 · 1 year ago

      Nah I gotta disagree on this one. I specifically joined this instance as a welcoming space. I’m glad we’re defederated from the tankie and far right instances. I want none of that here. You can feel differently for the “main” instances or whatever you want to call them, but for me, defederation is amazing.

    • @Spedwell@lemmy.world
      1 · edited · 1 year ago

      If an instance is merely blocked, does that mean all content produced by that instance, or by a Lemmy.World user using that instance, is strictly not stored on Lemmy.World servers?

      Otherwise there might still be liability. Also, in the US you don’t even have to do anything illegal to be the target of a lawsuit—distancing from piracy is a practical defense against the cost of legal proceedings, even if it’s technically legal.

    • TheSpookiestUser
      1 · 1 year ago

      If someone’s email domain is @ihateminorities.com, I’d say that’s pretty fair grounds for blocking it.

      There are some instances that actively promote hateful or extremist content and exist for the purpose of hosting it. There are others that do not actively support that content but do allow it anywhere on the instance, so blocking a single community isn’t enough. Defederation is an important tool and should be used wisely.