• @QuazarOmega@lemy.lol

    If that enables you to run AI locally, probably powered by open source models, and maybe at a fraction of the usual power consumption, then I don’t see what the issue is.
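
    For a rough idea of what “run AI locally” can look like, here is a minimal sketch using the Hugging Face transformers library; distilgpt2 is just a small example checkpoint, not a recommendation, and nothing here is tied to any particular NPU.

    ```python
    from transformers import pipeline

    # Download a small open source model once, then run inference entirely on this machine
    generator = pipeline("text-generation", model="distilgpt2")

    result = generator("Local NPUs could be used for", max_new_tokens=20)
    print(result[0]["generated_text"])
    ```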

    • @geemili@lemm.ee

      I do find it funny that the NPU gets a mention while the GPU doesn’t get one at all. I guess that is lumped in with the SoC.