• @kitnaht@lemmy.world
    324 days ago

    I’ve found that GPT-4o is substantially worse than the previous model at a ton of things, so I run all of my LLMs locally now through Ollama.