• some_guy · 5 days ago

    That’s the second time in three days that I’ve seen an article where “AI” (machine learning) was actually useful. It’s a hype machine and it’s overvalued, but it’s nice to see it put to real use. I still can’t wait for OpenAI to fail. I run a Llama model locally because to hell with giving corps more of my data. Anyway…
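
    A minimal sketch of that kind of local setup, assuming the model is served through Ollama (the comment doesn’t name a runner) and that a Llama 3.1 8B tag has been pulled; everything talks to localhost, so no data leaves the machine:

```python
import requests

# Ollama's chat endpoint on its default local port; nothing is sent off-box.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1:8b",  # assumed tag; any locally pulled Llama model works
        "messages": [
            {"role": "user", "content": "Summarize why local inference keeps data private."}
        ],
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```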

      • some_guy · 4 days ago

        I only eff around with it occasionally. I run it on a MacBook Pro with an M1 Max, and performance is solid. I don’t have a job where I can use it regularly, so after some initial testing I barely touch it.

        • BeardedGingerWonder@feddit.uk · 4 days ago

          Fair. I’m kinda wondering about having a general-purpose local household AI. I’ve got no good reason for it other than general tinkering; I’m mostly waiting for the crossover between decent AI and affordable hardware to happen.

          • danzania@infosec.pub · 3 days ago

            I’ve been running Gemma 3 4B locally with Ollama and it’s useful. I’m thinking about applications where a multimodal model could receive video or sensor feeds (like a security cam, say).
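
            A rough sketch of that security-cam idea, assuming Ollama’s REST API on its default port and a hypothetical frame.jpg grabbed from the camera; Gemma 3 4B is multimodal, so a still frame can be passed as a base64 image alongside the prompt:

```python
import base64
import requests

# Hypothetical single frame pulled from a security cam and saved to disk.
with open("frame.jpg", "rb") as f:
    frame_b64 = base64.b64encode(f.read()).decode()

# Ollama's generate endpoint accepts base64-encoded images for multimodal models.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "gemma3:4b",
        "prompt": "Describe any people, vehicles, or unusual activity in this frame.",
        "images": [frame_b64],
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["response"])
```

            Sampling a frame every few seconds rather than streaming full video would keep this within reach of modest hardware, which fits the affordability point above.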