Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • JBloodthorn
    16 points · 11 months ago

    The content in question is unfortunately something that has become very common in recent months: CSAM (child sexual abuse material), generally AI-generated.

    AI is now apparently generating entire children, abusing them, and uploading video of it.

    Or, they are counting “CSAM-like” images as CSAM.

    • @docrobot
      14 points · 11 months ago

      Of course they’re counting “CSAM-like” images in the stats; otherwise they wouldn’t have any stats at all. In any case, they don’t really care about child abuse. They care that a platform exists that they haven’t yet been able to wrap their slimy tentacles around.