• SuperFola
    32 points · 3 months ago

    How come the hallucinating ghost in the machine is generating code so bad the production servers hallucinate even harder and crash?

    • @henfredemars@infosec.pub
      21 points · 3 months ago

      I’m not sure how AI is supposed to understand code. Most of the code out there is garbage. Even most of the working code out there in the world today is garbage.

      • SuperFola
        10 points · 3 months ago

        Heck, I sometimes can’t understand my own code. And this AI thing tries to tell me I should move this code over there and do this and that, and then, poof, it doesn’t compile anymore. The thing is even more clueless than me.

        • Elvith Ma'for
          8 points · 3 months ago

          Randomly rearranging non-working code one doesn’t understand… sometimes it gets working code, sometimes it doesn’t fix the bug, sometimes it won’t even compile anymore? Has no clue what the problem is and only solves it randomly, by accident?

          Sounds like the LLM is as capable as me /s

                • @sugar_in_your_tea@sh.itjust.works
                  7 points · edited · 3 months ago

                  My boss comes to me saying we must finish feature X by date Y or else.

                  Me:

                  We’re literally in this mess right now. Basically, the product team set out some goals for the year, and we pointed out early on that feature X was going to have a ton of issues. Halfway through the year, my boss (the director) tells the product team we need to start feature X immediately or we risk missing the EOY goals. The product team gets all the prerequisites finished about two months before EOY (our “year” ends this month), and surprise surprise, there are tons of issues and we’re likely to miss the deadline. The product team is freaking out about their bonuses, whereas I’m chuckling in the corner, pointing to the multiple times we told them it would have issues.

                  There’s a reason you hire senior engineers, and it’s not to wave a magic wand and fix all the issues at the last minute; it’s to tell you when your expectations are unreasonable. The process should be:

                  1. product team lists requirements
                  2. some software dev gives a reasonable estimate
                  3. senior dev chuckles and doubles it
                  4. director chuckles and adds 25% or so to the estimate
                  5. if the product team doesn’t like the estimate, return to step 1
                  6. we release somewhere between the estimates from steps 3 and 4 (toy math sketched below)

                  If you skip some of those steps, you’re going to have a bad time.
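
                  A toy sketch of that padding math in Python, with a made-up two-week base estimate (only the 2x and ~25% factors come from the list above; every other number is hypothetical):

                  ```python
                  # Toy illustration of the estimate-padding chain above.
                  # All numbers are made up; the factors come from steps 3 and 4.
                  dev_estimate_weeks = 2.0                    # step 2: dev's "reasonable" estimate
                  senior_estimate = 2 * dev_estimate_weeks    # step 3: senior chuckles and doubles it
                  director_estimate = 1.25 * senior_estimate  # step 4: director adds ~25%

                  # Step 6: the release realistically lands between the last two numbers.
                  print(f"dev: {dev_estimate_weeks}w, senior: {senior_estimate}w, director: {director_estimate}w")
                  # dev: 2.0w, senior: 4.0w, director: 5.0w -> ship somewhere in the 4-5 week window
                  ```

                  The exact multipliers obviously vary by team; the point is that each step prices in risk the step below it couldn’t see.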

                  • @henfredemars@infosec.pub
                    3 points · edited · 3 months ago

                    In my experience, the job of a senior revolves around expectations: expectations of yourself, of the customer, of your bosses, and of the juniors and individual contributors working with you or that you’re tasking. It’s managing those expectations, understanding how these things usually go, protecting your guys and gals, and trying to save management from poking out their own eyes.

                    And you may actually have time to do some programming.

      • @sugar_in_your_tea@sh.itjust.works
        7 points · 3 months ago

        Can confirm. At our company, we have a tech debt budget, which is really awesome since we can fix the worst of the problems. However, we generate tech debt faster than we can pay it down. Adding AI to the mix would just generate tech debt even faster, because instead of senior devs reviewing junior dev code, we’d have junior devs reviewing AI code…

      • Karyoplasma
        6 points · 3 months ago

        LLMs are not supposed to understand; they are supposed to pretend to understand.