New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as “more private” or “less permanent” due to the alleged “core design problem” and “inherent danger” of Snap’s disappearing messages. The state’s complaint noted that the FBI has said that “Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted.”

  • @some_guyOP
    28 days ago

    The article didn’t say the cops generated AI CSAM. It said they created a profile pic, which was shown in the article.

    • So if someone generates a minor’s image and it’s not nude, is that not CSAM?

      I’m genuinely asking, I always thought it was about sexualizing children, not whether they are nude or not.

      • @some_guyOP
        28 days ago

        I don’t think so. People keep throwing that acronym around, but I suspect they didn’t read the article to find out that it was one normal picture of a high school-aged girl.

        • @Grandwolf319@sh.itjust.works
          7 days ago

          I actually read it and then made a comment, because even though it’s a profile picture, the intent is to have a viewer sexualize the picture, thereby sexualizing a minor.

          I do get that it’s a normal picture, but it made me think of this slippery slope and where the line is.