• sp3ctr4l@lemmy.dbzer0.com
      1 day ago

      Sorry, I’m not entirely sure what you mean.

      Did you mean to say:

      “And need to have the best consumer GPU on the market, to run an LLM.”

      … likely alluding to an RTX 5090?

      So you would be saying that, basically, the idea that everyone needs extremely expensive hardware to run an LLM is bullshit?

      • Diurnambule@jlai.lu
        1 day ago

        Hello, no, sorry; autocorrect and typing fast do that to my posts. I meant to say that NVIDIA is already the worst option for a consumer graphics card, since AMD made a card with 20 GB of RAM that can run most open-weight models.
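
        The claim that 20 GB is enough for most open-weight models can be sanity-checked with back-of-the-envelope arithmetic: weight memory is roughly parameter count times bits per weight, plus some overhead for the KV cache and runtime. A minimal sketch (the 32B parameter count, 4-bit quantization, and 20% overhead factor are illustrative assumptions, not figures from this thread):

        ```python
        def approx_vram_gb(params_billions, bits_per_weight=4, overhead=1.2):
            """Rough VRAM estimate for a quantized LLM.

            Assumes weights dominate memory; `overhead` pads for the
            KV cache and runtime allocations.
            """
            weight_bytes = params_billions * 1e9 * bits_per_weight / 8
            return weight_bytes * overhead / 1e9

        # A hypothetical 32B-parameter model at 4-bit quantization:
        print(round(approx_vram_gb(32), 1))  # -> 19.2, so it just fits in 20 GB
        ```

        By this estimate, 4-bit quantized models up to roughly the 30B-parameter class fit in 20 GB, while 8-bit or larger models would not.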