• Captain Poofter
    -4
    4 months ago

    In actual computer science you talk about AI all the time as well, but it’s not actually intelligent, is it? It’s just SmarterChild 2.0 and literally has no idea what word it said just before its current one. Not intelligent. Words are often used inappropriately. The only thing computers can consume is data and electricity, by definition, and consuming data is not the same as incorporating it into a language (or visual) model that you intend to profit from. This is data theft, unless properly licensed.

    • @areyouevenreal@lemm.ee
      4
      4 months ago

      How intelligent it is or isn’t is irrelevant. We talk about much dumber programs than AI as being consumers of files and data, including things like compilers. Would it not be personal use for you to view a picture in a photo viewer or try to edit it in GIMP?

      It’s not data theft at all unless the courts and the law say it is. Ranting on Lemmy won’t change that fact. Theft is a construct of law.

      You can add clauses against use as AI training data to your licence if you wish.

      • Captain Poofter
        -2
        4 months ago

        You can try to equate humans to computers all day, and you can even pass laws that say they’re the same thing. That does not make it true. A company using software to profit off data they have not licensed (whether it’s public or not does not matter! That is not how copyright law works!) is theft.

        Please try to sell DVDs of Markiplier’s publicly available YouTube content and tell people you’re allowed to because it’s publicly available.

        • @areyouevenreal@lemm.ee
          6
          4 months ago

          I am not equating humans with computers. These businesses are not selling people’s data when doing AI training (unlike actual data brokers). You can’t say something AI-generated is a clone of the original any more than you can say parody is.

          • Captain Poofter
            0
            4 months ago

            I absolutely can. Parody is an art form, which is something that can only be created by human beings. AI is an art laundering service. Not an artist.

            The law should reflect that these companies need to first be granted permission by the rights holders to use datasets, and Creative Commons licensors need to be given an opportunity to opt out of being crawled for these datasets. Anything else is wrong. Machines are not humans. Creative Commons licensing was not written with the concept of machines as “consumers”. These companies took advantage of the sudden emergence of these models and the law’s delay in holding their hunger for data in check. They need to be held accountable for their theft.

            • @areyouevenreal@lemm.ee
              4
              4 months ago

              There are already anti-AI licenses out there. If you didn’t license your stuff with that in mind, that’s on you. Deep learning models have been around for a lot longer than GPT-3 or anything that’s happened in the current news cycle. They have needed training data for that long too. It was predictable that stuff like this would happen eventually, and if you didn’t notice in time, it’s because you haven’t been paying attention.

              You don’t get to dictate what’s right and wrong. As far as I am concerned all copyright is wrong and dumb, but the law is what the law is. Obviously not everyone shares my opinion and not everyone shares yours.

              Whether an artist is involved or not, it’s still a transformative use.