• @jonne@infosec.pub
      10 points · 5 months ago

      I think training your own image generator on existing child porn is probably beyond most high schoolers. I’d be happy if at least commercial options were held responsible for distributing generated CP, which is already illegal BTW.

      • @cygnus@lemmy.ca
        14 points · 5 months ago

        I don’t think the models are trained on CP. They’re likely trained on widely-available porn.

        • @zurohki@aussie.zone
          9 points · 5 months ago

          This. If you ask an image generator for a bed in the shape of a pineapple, it probably has no pineapple-shaped beds in its training data, but it has pineapples and beds and can mash the concepts together.
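
          With something like the Hugging Face diffusers library (just one possible tool, and the checkpoint name below is only an example of a publicly available Stable Diffusion model), that kind of concept-mashing is a short script:

              import torch
              from diffusers import StableDiffusionPipeline

              # Load a public text-to-image checkpoint (the name is just an example).
              pipe = StableDiffusionPipeline.from_pretrained(
                  "runwayml/stable-diffusion-v1-5",
                  torch_dtype=torch.float16,
              )
              pipe = pipe.to("cuda")  # or "cpu", dropping the float16 dtype

              # The prompt names a combination the model almost certainly never saw,
              # but pineapples and beds are both well represented, so it composes them.
              image = pipe("a bed in the shape of a pineapple, product photo").images[0]
              image.save("pineapple_bed.png")

          The novelty lives entirely in the prompt; nothing pineapple-bed-specific needs to exist in the training data.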

        • Onihikage
          1 point · 5 months ago

          Technically, any model trained on LAION-5B before December 2023 was trained on CSAM.

          But yeah, I expect any porn model trained on a sufficient diversity of adult actors could be used to make convincing CP even without having it in the training data. AI image generation is basically the digital equivalent of a chainsaw - a tool for a particular messy job that can really hurt people if used incorrectly. You wouldn’t let a typical kid run around unattended with one, that’s for sure.

          • @cygnus@lemmy.ca
            2 points · 5 months ago

            > I expect any porn model trained on a sufficient diversity of adult actors could be used to make convincing CP even without having it in the training data.

            I know I’m wading into the danger zone here, but let’s also remember we’re talking about teenagers. A 15-year-old’s body type, for example, will be closer to an 18-year-old’s than a 5-year-old’s, so the perfectly legal porn model would work just fine for that, uh, purpose.