• entropicdrift
    2 months ago

    Yeah, if they were just running it locally off a GPU it would be cooler

    • psud@aussie.zone
      2 months ago

      Running an LLM isn’t expensive, whether locally or in the cloud; all the cost is in the training.