• some_guy
    3 days ago

    I only eff around with it occasionally. I run it on a MacBook Pro M1 Max. It’s solid for performance. I don’t have a job where I can employ it regularly, so after initial testing, I barely use it.

    • BeardedGingerWonder@feddit.uk
      3 days ago

      Fair. I’m kinda wondering about having a general local household AI. I’ve got no good reason for it other than general tinkering; I’m somewhat waiting for the crossover between decent AI and affordable hardware to occur.

      • danzania@infosec.pub
        2 days ago

        I’ve been running Gemma3 4b locally with ollama and it’s useful. I’m thinking about applications where a multimodal model could receive video or sensor feeds (like a security cam, say).
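        For the security-cam idea, ollama’s local HTTP API accepts base64-encoded images alongside the prompt, so a per-frame check is only a few lines. A minimal sketch, assuming ollama is serving gemma3:4b on the default port 11434 (the model tag, prompt text, and endpoint here are my assumptions; check them against your install):

```python
# Sketch: send a single camera frame to a local multimodal model via
# ollama's /api/generate endpoint. Assumes ollama is running locally
# and has the gemma3:4b model pulled; prompt wording is illustrative.
import base64
import json
import urllib.request


def build_describe_request(image_bytes: bytes,
                           model: str = "gemma3:4b",
                           prompt: str = "Describe any people or vehicles in this frame.") -> dict:
    # The /api/generate endpoint takes images as base64 strings
    # in an `images` list next to the text prompt.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON reply instead of a token stream
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    }


def describe_frame(image_bytes: bytes) -> str:
    # POST one frame to the local server and return the model's text reply.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(build_describe_request(image_bytes)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

        Feeding it a JPEG grabbed from a camera would then look like `describe_frame(open("frame.jpg", "rb").read())`; a real setup would sample frames on a timer rather than push raw video, since the model only sees still images.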