Today I posted a picture of a stamp with an animal in it and they said the picture contained nudity and made me take it down, but I reported a photo of a guy with a fully visible swastika tattoo and they said that’s fine.

I’d like to start a Lemmy community with photos of stuff that they refuse to remove called FacebookSaysItsFine.

    • @dustyData@lemmy.world · 12 points · 10 months ago
      I would suggest the opposite: perhaps a “Facebook doesn’t allow this” community. Too much risk of attracting trolls and monsters.

      That said, the FBI says that Facebook, Instagram, Twitter, et al. all contain a not-insignificant amount of CSAM at any given point in time. That fact just never gets reported by the press because the public has normalized those platforms. Only the fediverse gets that sort of negative attention in the press, because it’s the disruptive outsider platform. By both proportion and volume, almost all the other platforms have a worse problem with awful content, and it regularly flies under the radar because they are big corporations.

    • ThePowerOfGeek · 8 points · 10 months ago

      It would definitely require some very active moderation and clearly-defined community rules. But it sounds like a great idea for a Lemmy community, if you have the time.

        • @Rai@lemmy.dbzer0.com · 1 point · 10 months ago

          When did “CP” become “CSAM”?

          If you want to change the acronym, wouldn’t “CR” make more sense?

          • @Cracks_InTheWalls@sh.itjust.works · 3 points · 10 months ago (edited)

            ’cause porn is made with consenting adults; CSAM isn’t porn. CR is typically what’s depicted in CSAM (assuming that R stands for rape), but there are two (or more) separate, though closely related, crimes here. Also, SA (sexual assault) covers a wider breadth of activities, which is useful if a person wants to quibble over the term “rape” when, regardless, something horrific happened to a kid and videos/images of said horrific thing are now getting shared among pedophiles.

            Will note I’ve only seen CSAM used since I started using Lemmy, so I’m not really sure when people started using the term over CP. I’m personally for it: it more accurately describes what it is. And while I haven’t seen the term in the wild, SAM, to describe videos or images of non-consensual sex acts among adults, is good too.

    • @csgraves@lemmy.world · 6 points · 10 months ago

      I worry about this on the fediverse. I made the mistake of looking at the links from a person who commented on anti-trans legislation, and let me just say: yikes!

      The link was to something trying to legitimize the identity of “map.”

      NOPE.

      I deleted my comments and blocked the sick bastard.

      • @BonesOfTheMoon@lemmy.world (OP) · 7 points · 10 months ago

        I blocked some instance that was all porn and that seemed to improve my experience. I’m not against porn, I just don’t care for it myself.

        • @railsdev@programming.dev · 3 points · 10 months ago

          Is it just me or is Lemmy the wrong platform to just dump a bunch of porn?

          It creeps me out when I go to talk about programming and stuff and random porn just pops up out of nowhere. Like, is social media really the right place for that?

    • @Sylver@lemmy.world · 5 points · 10 months ago (edited)

      Try to stay apolitical and you won’t attract those trolls as early. Which, I now realize, may be difficult, considering many of the posts would be calling out Nazi scum…