• MrSulu@lemmy.ml · 57 minutes ago

    Pedo and fascist defender Tim Sweeney. Burn his business down by withdrawing your patronage, money, time, etc. Boards can find CEOs who are not supporters of pedos and fascists.

  • Sunsofold@lemmings.world · 2 hours ago

    I’m no fan of banning this or that particular platform (it’s like trying to get rid of cheeseburgers by banning McDonald’s; the burgers are still available from all the other burger chains, and the people who use that one will just switch to the others), but this is a hilariously wrong way to get to the right answer.

  • Grass@sh.itjust.works · 3 hours ago

    This is almost as sus as the specific preferred-age-range terminology for pedophiles that comes up now and again in the most uncomfortable of scenarios.

  • criss_cross@lemmy.world · 5 hours ago

    The only “charitable” take I can give this is that he’s been fighting Apple and Google over store fees and the like, and he feels that if he concedes Apple/Google can do this, then they should be able to restrict EGS as well.

    I don’t know why AI-generated CSAM is the hill you’d make this point on, though.

  • humanspiral@lemmy.ca · 2 hours ago

    Zionazi oligarchist supremacism controlling media/speech, promoting hate and genocide, is reason to zero out his finances and media control. That the bipartisan establishment loves all of this means this performative whining over image-generation tools that can be used to fake offense is the pathetic permitted discourse the establishment masquerades as democracy.

  • RememberTheApollo_@lemmy.world · 8 hours ago

    Yeah. I’m as tired of the argument that pretty much anything goes as far as free speech is concerned as I am of the “everything is a slippery slope when we make laws to keep people from doing harmful shit” line.

    I mean, what’s the required damage before people put a stop to speech that incites and to objectively harmful lies? Or to making CSAM of kids using a platform like X? Germany had to kill a few million people before deciding that maybe displaying Nazi symbols and speech wasn’t a good idea. So we have a platform being used to make CSAM. What’s it going to take before someone says this is a bad idea and shouldn’t be done? How many kids will commit suicide after being taunted and shamed over what was made from their images? How many is “enough”?

    There should be immediate action to end the means to use these tools to make porn. There’s plenty of porn available on the internet already; making it from user-submitted images on a major public platform is a horrible idea. But too many people make up all kinds of reasons why we can’t do that: economic, censorship, whatever.

  • Lvxferre [he/him]@mander.xyz · 11 hours ago

    IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

    Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

    This is still true if the porn in question is machine-generated and the sexual acts depicted never happened, like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

    And it applies to children and adults alike. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

    Now, someone else mentioned that Bart’s dick appears in The Simpsons Movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There’s no victim.

    • Atomic@sh.itjust.works · 10 hours ago

      That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

      Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

      There ARE victims, lots of them.

      • unexposedhazard@discuss.tchncs.de · 9 hours ago

        That is a lot of text for someone that couldn’t even be bothered to read a comment properly.

        Non-consensual porn victimises the person being depicted

        This is still true if the porn in question is machine-generated

          • unexposedhazard@discuss.tchncs.de · 8 hours ago

            Which they then talk about, pointing out that victims are absolutely present in this case…

            If this is still too hard to understand, I will simplify the sentence. They are saying:

            “The important thing to talk about is whether there is a victim or not.”

            • Atomic@sh.itjust.works · 5 hours ago

              It doesn’t matter if there’s a victim or not. It’s the depiction of CSA that is illegal.

              So no, talking about whether or not there’s a victim is not the most important part.

              It doesn’t matter if you draw it by hand with crayons. If it’s depicting CSA it’s illegal.

                • Atomic@sh.itjust.works · 32 minutes ago

                  Talking about morals and morality is how you end up getting things like abortion banned. Because some people felt morally superior and wanted to enforce their superior morality on everyone else.

                  There’s no point in bringing it up. If you need to bring up morals to argue your point, you’ve already failed.

                  But please do enlighten me, because personally I don’t think there’s a moral difference between depicting “victimless” CSAM and CSAM containing a real person.

                  I think they’re both, morally, equally awful.

                  But you said there’s a major moral difference? For you maybe.

      • Lvxferre [he/him]@mander.xyz · 9 hours ago

        That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

        Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

        There ARE victims, lots of them.

        You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying that I didn’t read the text. (I did.)

        Learn to read, dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

        Is this clear now?

        • Atomic@sh.itjust.works · 9 hours ago

          Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

          The real thing to talk about is the presence or absence of a victim.

          Which has never been an issue. It has never mattered for CSAM whether it’s fictional or not. It’s the depiction that is illegal.

    • brachiosaurus@mander.xyz · 11 hours ago

      It’s called being so effective at marketing, and spending so much money on it, that people believe you do nothing wrong.

  • CerebralHawks@lemmy.dbzer0.com · 11 hours ago

    So he’s saying CSAM is free/protected speech? Got it.

    Dude had absolutely no reason to out himself as a paedophile, but that seems to be exactly what he’s done. And for what? Epic is in legal battles to get onto other companies’ platforms (Google’s Android and Apple’s iOS) without paying the fees outlined in the terms and conditions every other developer had to agree to. I’m not saying he’s 100% wrong in that opinion, but outing himself as a paedophile by stating CSAM is protected speech only hurts his argument in the other case: he’s saying he could put CSAM on his own platform (e.g. Fortnite, a game aimed at children and teenagers) and he’d be against you censoring his “free” and “protected” speech.

    I just see no good outcomes for what Sweeney is fighting for here.

    To address the political opponents angle, no one was calling for a Twitter/X ban before the CSAM problem, even when our political opponents were up there (i.e. before and after Trump was banned and un-banned).

  • Fedizen@lemmy.world · 13 hours ago

    Absolutely insane take. The reason Grok can generate CP is because it was trained on it. Musk should be arrested just for owning that shit.

    • ShaggySnacks@lemmy.myserv.one · 6 hours ago

      We all live in a two-tier justice system.

      One tier is for the capital class: generally, as long as they don’t commit crimes against the government or against others in the capital class, these offenders get the slap-on-the-wrist justice system. The government had enough evidence, between witnesses and documentary evidence from the Epstein files, to at least open investigations and charge some of the people. Yet the only people arrested and charged were Epstein and Maxwell, and it took a long time before either of them faced any serious consequences for their actions.

      Everyone else gets the go fuck yourself justice system.

  • boaratio@lemmy.world · 11 hours ago

    Tim Epic sucks, and has always sucked. Bring back Unreal Tournament, you coward.

    • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 8 hours ago

      I hate that the newest Unreal Tournament just kinda… disappeared. I mean, it’s still playable I think, just not online, and aside from a week or so after it launched, I ain’t ever heard anyone talking about it. It was okay… balance was not quite there, and it only had 2 maps when I last played it. But it had potential.

      • eletes@sh.itjust.works · 6 hours ago

        It had so much potential; it looked beautiful and just needed polish and some maps made for it. They could have sold it for $10-$20, easy.