• pjhenry1216
    link
    fedilink
    4
    1 year ago

    It’s not replacing jobs well. It’s being exploited because the subpar work is passable. As long as it’s not in a monopoly industry, real humans will always outdo the cheap knockoff services.

    ChatGPT is not AI. The term “AI” has been bastardized and is being used incorrectly. If someone is selling a service and uses the term “AI”, do not trust them.

    AI doesn’t exist.

    LLMs are not the same thing as AI.

    LLMs cannot create anything new.

    They are also confidently incorrect all the time, because they hold no concept or context of the situation. It’s just predicting the words most likely to follow your prompt, based on all the combinations of words it “knows”. The issue here is that it won’t know whether it’s answering incorrectly or not. It’ll be confident regardless.
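    The “predict the next word” loop described above can be sketched with a toy bigram model. This is a deliberately crude stand-in (real LLMs use transformers over billions of parameters, and the corpus here is made up), but it shows the key point: the model only knows which words tend to follow which, and is equally “confident” whether or not the output makes sense.

    ```python
    # Toy sketch of next-token prediction: a bigram model built from a tiny
    # invented corpus. The loop is the same in spirit as an LLM's: pick a
    # likely next token, append it, repeat.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat the cat ate the fish".split()

    # Count which word follows which.
    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    def generate(start, n=4):
        out = [start]
        for _ in range(n):
            candidates = follows[out[-1]]
            if not candidates:
                break
            # Greedy: take the most frequent follower. The model has no idea
            # whether the sentence is "correct"; it only knows frequencies.
            out.append(candidates.most_common(1)[0][0])
        return " ".join(out)

    print(generate("the"))  # continues with whatever followed "the" most often
    ```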

    AI will be exploited. Just as the cloud was exploited when companies thought it meant they didn’t need IT staff anymore. Admins are still needed.

    • @Hanabie@sh.itjust.works
      link
      fedilink
      6
      edit-2
      1 year ago

      Let’s skip over all the linguistic quirks here really quick and focus on the heart of the matter.

      AI is good enough at certain things to change whole job sectors. Whether that’s good or not is not something I can even discuss without making assumptions based on insufficient data. What it realistically means, though, is that certain jobs are being transformed, while others are becoming superfluous. How we as a society deal with this is one of the challenges of the coming years.

      AI has been making huge strides in recent years, months, even weeks. Heck, you can’t spend a week in the woods without missing some big news. The challenge is to adapt to the new tools without making unequal wealth distribution even worse than it already is.

      That doesn’t change the fact that certain jobs will change, like translation turning more into editing work, or coding into designing, and some older folks, who find it difficult to adapt, will go under.

      It’s happened before multiple times, for example with the industrial revolution.

    • @SirGolan
      link
      2
      1 year ago

      I think you’re conflating AGI (artificial general intelligence) with AI here amongst other misconceptions.

      Yes, transformer LLMs are trained to predict the next word, but larger ones (like GPT3) exhibit emergent abilities that nobody really predicted.

      I’m curious what you think something new might be. I had GPT4 write a whole bunch of code lately to fit into existing systems I created. I guarantee no systems like that were in its training data because it’s a system that deals with GPT4 and LLM functionality that didn’t exist when the training data was collected. One of my first experiments with GPT3 was an app that could make video game pitches. I can guarantee some of the weird things my team made with that were new ideas.

      Does it really understand anything? Who knows. Does it matter if it can act like it does? See also the Chinese room experiment.

      • pjhenry1216
        link
        fedilink
        3
        1 year ago

        Coding is a poor example. It’s a language. It’s simply translating from one language (pseudocode) to another (the programming language you requested). As long as you give it clear instructions, it’s not “solving” anything. It’s like saying Google Translate created something new because you asked it to translate a sentence no one has asked before.

        Honestly, I don’t think there are significant “emergent” capabilities beyond it just performing better than they expected.

        • @SirGolan
          link
          3
          edit-2
          1 year ago

          I suppose that’s my bad for the article I linked, which doesn’t really go into specifics on what the capabilities are. One of the big ones is tool use. You can give it a task and a list of tools, and it can use the tools to complete the task. This capability alone makes a huge number of automations possible that weren’t possible before LLMs.
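          The tool-use loop works roughly like this sketch. Everything here is hypothetical (the `call_llm` function is a hard-coded stand-in for a real model API, and the tool names are made up), but the structure is the standard one: the model either requests a tool call or answers, and the loop feeds each tool result back to the model until it answers.

          ```python
          # Minimal sketch of an LLM tool-use loop. `call_llm` is a stand-in
          # for a real model API; a real LLM decides which tool to call and
          # with what arguments based on the conversation so far.
          import json

          TOOLS = {
              "add": lambda a, b: a + b,
              "upper": lambda s: s.upper(),
          }

          def call_llm(history):
              # Fake "model": once it sees a tool result, it answers with it;
              # otherwise it requests the `add` tool.
              tool_msgs = [m for m in history if m["role"] == "tool"]
              if tool_msgs:
                  return {"type": "answer",
                          "content": f"The result is {tool_msgs[-1]['content']}."}
              return {"type": "tool_call", "name": "add", "args": {"a": 2, "b": 3}}

          def run_agent(task, max_steps=5):
              history = [{"role": "user", "content": task}]
              for _ in range(max_steps):
                  reply = call_llm(history)
                  if reply["type"] == "answer":
                      return reply["content"]
                  # Execute the requested tool and feed the result back.
                  result = TOOLS[reply["name"]](**reply["args"])
                  history.append({"role": "tool", "content": json.dumps(result)})
              return "gave up"

          print(run_agent("What is 2 + 3?"))
          ```

          Real implementations swap the fake `call_llm` for an actual model call; the loop itself stays this simple, which is why tool use unlocked so much automation.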

          I’m getting the impression your definition of “new” is “something only a human could come up with.” Please correct me if I’m wrong here. People who create completely novel things are few and far between. They’re typically the ones remembered for centuries. Though honestly, even then they’re usually standing on the shoulders of those before them. Just like what AI does. Look at AlphaFold, an AI that is rapidly accelerating disease research and solving many other hard problems.

          Anyway, if I can prompt the AI to write code for me and even if you don’t count that as something new, it’s a force multiplier on my job, which is a huge benefit. As Hanabie said, there’s going to be a lot of changes in jobs due to AI and those who don’t adapt are going to be left behind. I’m commenting here in hopes of helping people see that and not get left behind.

          • pjhenry1216
            link
            fedilink
            2
            1 year ago

            You realize you essentially just argued my point though. That’s basically my analogy with the cloud. It’s not replacing anything. I could have been clearer I suppose, but the crux of it is that it’s not replacement.

            • @SirGolan
              link
              2
              1 year ago

              Oh hmm. Are you just saying that it can’t fully replace people at jobs? Because I generally do agree with that at least with current models and methods of using them. It’s getting close though, and I think within a year or two we will be there for at least a bunch of professions. But on the other hand, if it makes workers in some jobs 2x more productive then the company only needs to keep half of those workers to maintain the same output. I think this is where it’s going to start / has already started.