• 667
    45 points · 1 year ago

    I prompted ChatGPT to write and adjust a linear Python script for a repetitive task I needed to automate. It took 30 minutes, versus the 6-12 hours it would have taken me to code it myself.

    It’s a huge force multiplier when used properly.
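
    As an illustration only - the comment doesn’t say what the repetitive task actually was - a short, linear script of the kind being described might look like this, assuming a made-up job of sorting date-stamped CSV reports into per-month folders:

    ```python
    # Hypothetical task: the original post doesn't specify one.
    # Sort files like "report_2023-07-14.csv" into folders named "2023-07".
    import re
    import shutil
    from pathlib import Path

    SRC = Path("reports")                       # made-up input folder
    DATE = re.compile(r"(\d{4})-(\d{2})-\d{2}")

    for f in SRC.glob("*.csv"):
        m = DATE.search(f.name)
        if not m:
            continue                            # skip files without a date in the name
        year, month = m.groups()
        dest = SRC / f"{year}-{month}"
        dest.mkdir(exist_ok=True)
        shutil.move(str(f), str(dest / f.name))
        print(f"moved {f.name} -> {dest}")
    ```

    Nothing clever, just straight-line work - the sort of script the comment describes handing off.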

    • @phoneymouse@lemmy.world
      13 points · 1 year ago

      It can speed up the process, but it’s not like it would replace a programmer. It still requires someone with enough knowledge to check its output, correct its mistakes, or call out its bullshit.

      • SokathHisEyesOpen
        9 points · 1 year ago

        It won’t replace us yet. This is the first technology in my entire career that has me a little concerned about the future.

        • @ourob@discuss.tchncs.de
          7 points · 1 year ago (edited)

          I don’t know. The speed at which these things blew up into The Next Big Thing™️ kind of sets off my bullshit detectors.

          I’m certainly not an expert in machine learning, but I suspect that LLMs will never be able to produce complex code that doesn’t require a lot of modification and verification.

          • @acceptable_pumpkin@lemmy.world
            1 point · 1 year ago

            While it may not eliminate positions entirely, it will greatly reduce the number of positions needed.

            See any advancements in automation from farming to manufacturing.

            • @ourob@discuss.tchncs.de
              3 points · 1 year ago

              See any advancements in automation from farming to manufacturing.

              See, this is the kind of thing that makes my bullshit detectors go off. The comparison elevates this new tech to the same level of importance as past revolutionary shifts in industry. But that only seems justified if you assume the rapid advancements in LLMs will continue at the same rate going forward, which is not a given at all. Fundamentally, these models are trained to produce convincing output, not accurate output. There is no guarantee that high accuracy will be achieved with this approach.

              For programming, I don’t see these LLMs any differently than previous advancements in tooling and in high-level programming languages and frameworks. They will make it easier to rapidly prototype and deploy (shoddy) apps, but they will not be replacing devs who work at a low level, on high-performance code, or in critical areas, nor will they drastically reduce the workforce needed - at least not any more than other tooling advancements.

              All just my opinion, of course. We shall see.

        • @Meowoem@sh.itjust.works
          1 point · 1 year ago

          Exactly. A year or two ago I said that knowledge of obscure and obsolete languages will soon not be as saleable a skill, because of the ability to automatically convert code into a more widely used language. Everyone laughed at me and said that would never happen - and some big companies have already started doing it.

          I was talking to a friend recently about AI coding and realised that beyond a certain point a huge portion of the industry will be made obsolete entirely - and honestly, that point is probably not very far away. Pretty much all the coding either of us has worked on won’t be needed if you can simply ask your computer to do it, without needing a special program.

          I’ve created a lot of GUI tools, for example, and tools for configuration, but being able to just talk to your computer would erase the need for almost all of them - and a lot of stuff you won’t even need to do in the first place. Why would I install an app to monitor network connectivity and locate newly added devices when I can just say ‘computer, how’s the network been working today? Is my printer working?’ and it just tells me?

          How we interact with computers has done nothing but change, and I really think we’re going to see a real game change soon - not just a game-changing move, but literally switching from chess to buckaroo.

          • @Couldbealeotard@lemmy.world
            1 point · 1 year ago

            Those two examples you’ve given (asking the computer about the network and the printer) don’t need AI (LLMs, in this context) to exist. They need to be pre-programmed, absolute functions. Suggesting that these LLMs are a step towards that not only ignores the fact that we already have voice assistants built into computers, it also ignores the fact that LLM outputs are volatile and can’t be trusted.
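
            To make that concrete, a ‘pre-programmed absolute function’ for the printer question could be as small as the deterministic sketch below - no model involved, and the same answer for the same network state. The hostname and the Linux/macOS ping flag are made-up example details:

            ```python
            # Deterministic, pre-programmed check: no LLM anywhere.
            # "office-printer.local" is a hypothetical hostname; "-c 1" assumes Linux/macOS ping.
            import subprocess

            def printer_is_reachable(host: str = "office-printer.local") -> bool:
                """Ping the printer once and report whether it answered."""
                result = subprocess.run(["ping", "-c", "1", host], capture_output=True)
                return result.returncode == 0

            if __name__ == "__main__":
                status = "working" if printer_is_reachable() else "not responding"
                print(f"Printer is {status}.")
            ```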

            • @Meowoem@sh.itjust.works
              0 points · 1 year ago

              What I’m getting at is that you won’t need absolute functions to pre-exist when you can just ask your computer and it’s able to poll the relevant resources and format a reply. Of course current models can’t do this, but if you think history has ended and there will be no more developments in AI, then you’re not being serious.

              LLMs have the spotlight at the moment because natural language has been a Holy Grail of AI research for a long time, but all the other types of models are amazing at other things. It’s only a matter of time before the various pieces are combined into some really useful and powerful tools.
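
              One way to read ‘poll the relevant resources and format a reply’ is a tool-calling arrangement: deterministic checks supply the facts, and a language model only routes the question and words the answer. The sketch below is a rough shape of that idea, with the model stubbed out as keyword routing, since no particular API is being described here:

              ```python
              # Sketch of "combine the pieces": deterministic tools gather facts,
              # a language model (stubbed out here) picks the tools and phrases the reply.

              def check_printer() -> str:
                  return "printer: responding"                       # would wrap a real ping check

              def check_network() -> str:
                  return "network: no outages in the last 24 hours"  # would query real logs

              TOOLS = {"printer": check_printer, "network": check_network}

              def answer(question: str) -> str:
                  # Stand-in for the model: route by keyword, then summarise the tool output.
                  facts = [fn() for name, fn in TOOLS.items() if name in question.lower()]
                  if not facts:
                      return "I don't have a tool for that yet."
                  return "Here's what I found: " + "; ".join(facts)

              print(answer("How's the network been today? Is my printer working?"))
              ```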

    • TornadoRex
      6 points · 1 year ago

      I used to do some coding in high school and early college. I’ve since moved on to other things, but it’s fun to have ChatGPT write me a little Python script or something and then debug my way through it.