• AtHeartEngineer@lemmy.world · 24 hours ago

    If AMD were smart they would release an upper-mid-range card with 40+ GB of VRAM. It doesn’t even have to be their high-end card; people wanting to do local/self-serve AI stuff would swarm on those.

    • foggenbooty@lemmy.world · 35 minutes ago

      Please just give us self-hosting nerds SR-IOV on affordable cards. I really want a Linux VM and a Windows VM that both have access to the GPU simultaneously; there’s a rough sketch of what the host-side setup could look like at the end of this comment.

      I was hoping Intel would let some of these enterprise-locked features trickle down as a value-add, but no dice. Every year AMD just undercuts NVIDIA by a small amount, but it doesn’t compete on some of the tech NVIDIA has, so it’s a wash.

      But they’re too concerned it would eat into their enterprise cards, where they make boatloads, so it’s not going to happen. Imagine if consumer CPUs didn’t support virtualization; it would be insane, and that’s where we are with GPUs today.
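
      For context, SR-IOV lets one physical card present itself as several “virtual functions” that a hypervisor can hand to different VMs. On hardware that actually exposes it, the host-side setup is just a couple of sysfs writes; here is a rough sketch of that, assuming the card and driver supported it (the PCI address and VF count are placeholders):

      ```python
      # Hypothetical sketch: carve SR-IOV virtual functions out of a GPU on Linux.
      # Assumes the card/driver exposes the standard sriov_* sysfs knobs, which
      # consumer Radeon/GeForce cards currently don't (enterprise parts do).
      from pathlib import Path

      GPU_BDF = "0000:03:00.0"  # placeholder PCI address; find yours with lspci
      dev = Path("/sys/bus/pci/devices") / GPU_BDF

      total_vfs = int((dev / "sriov_totalvfs").read_text())
      print(f"Device advertises up to {total_vfs} virtual functions")

      # Ask the driver for two VFs: one for the Linux VM, one for the Windows VM.
      (dev / "sriov_numvfs").write_text("2")

      # Each VF shows up as its own PCI device (virtfn0, virtfn1, ...) that
      # KVM/QEMU can pass to a guest via VFIO, so both VMs get GPU access at once.
      for vf in sorted(dev.glob("virtfn*")):
          print(vf.name, "->", vf.resolve().name)
      ```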

    • Eager Eagle@lemmy.world · 22 hours ago

      Yeah, I’ve been wanting a card like that to run local models since 2020, when I got a 3080. Back then I’d have spent a bit more to get one with the same performance but some 20GB of VRAM.

      Nowadays, if they released an RX 9070 with at least 24GB, priced between the 16GB model and an RTX 5080 (also 16GB), that would be neat.
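
      For anyone wondering why the jump from 16GB to 24GB matters so much, the back-of-the-envelope math is just parameter count times bytes per weight, plus some headroom for the KV cache and runtime. A rough sketch (the 20% overhead factor is a guess, not a measured number):

      ```python
      # Rough VRAM estimate for running an LLM locally: weights plus a fudge
      # factor for KV cache / activations / runtime overhead (assumed, not measured).
      def estimate_vram_gb(params_billion: float, bits_per_weight: float, overhead: float = 1.2) -> float:
          weight_bytes = params_billion * 1e9 * bits_per_weight / 8
          return weight_bytes * overhead / 1024**3

      for name, params, bits in [
          ("13B @ 8-bit", 13, 8),
          ("32B @ 4-bit", 32, 4),
          ("32B @ 5-bit", 32, 5),
          ("70B @ 4-bit", 70, 4),
      ]:
          print(f"{name}: ~{estimate_vram_gb(params, bits):.0f} GB")
      ```

      By that math a 4-bit 32B model needs roughly 18GB, so it spills out of a 16GB card but fits comfortably in 24GB.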

      • AtHeartEngineer@lemmy.world · 18 hours ago

        Same, I’ve got a modded 2080 Ti with 22GB of VRAM running DeepSeek 32B and it’s great… but it’s an old card, and with it being modded I don’t know what the life expectancy is.
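
        For anyone else running that close to the limit, a quick way to sanity-check free VRAM before loading a model is a minimal script using the nvidia-ml-py bindings (device index 0 assumed):

        ```python
        # Check free VRAM on the first GPU before loading a model.
        # Uses the nvidia-ml-py package (imports as pynvml); device 0 assumed.
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

        print(f"{name}: {mem.free / 1024**3:.1f} GiB free of {mem.total / 1024**3:.1f} GiB")
        pynvml.nvmlShutdown()
        ```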