• Merlin@lemm.ee · 40 points · 1 day ago

    The consumer GPU market is becoming a dystopia at the top end. AMD has publicly retreated from it and Intel is likely a decade away from competing there. I guess I’ll stay in the midrange moving forward. Fuck Nvidia.

    😞

    • AtHeartEngineer@lemmy.world · 16 points · 1 day ago

      If AMD were smart, they'd release an upper-midrange card with 40+ GB of VRAM. It doesn't even have to be their high-end card; people wanting to run local/self-hosted AI stuff would swarm on it.

      • foggenbooty@lemmy.world · 1 point · 2 hours ago

        Please just give us self-hosting nerds SR-IOV on affordable cards. I really want a Linux VM and a Windows VM that both have access to the same GPU simultaneously.

        I was hoping Intel would let some of these enterprise-locked features trickle down as a value-add, but no dice. Every year AMD just undercuts NVIDIA by a small amount, but it doesn't compete on some of the tech NVIDIA has, so it's a wash.

        But they're too concerned it would eat into their enterprise cards, where they make boatloads, so it's not going to happen. Imagine if consumer CPUs didn't support virtualization; it would be insane, and that's where we are with GPUs today. (A quick way to check whether a card actually exposes SR-IOV is sketched below.)
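
        A minimal sketch (not from the thread) of how one could check on Linux whether a GPU actually exposes SR-IOV virtual functions, using only the standard sysfs attributes `sriov_totalvfs` / `sriov_numvfs`; the printed labels are just illustrative:

        ```python
        #!/usr/bin/env python3
        """Sketch: list PCI display controllers and report whether the kernel
        exposes SR-IOV virtual functions (VFs) for them."""

        from pathlib import Path

        PCI_DEVICES = Path("/sys/bus/pci/devices")
        DISPLAY_CLASS_PREFIX = "0x03"  # PCI class 0x03xxxx = display controllers

        def sriov_status(dev: Path) -> str:
            total = dev / "sriov_totalvfs"  # only present on SR-IOV-capable devices
            numvfs = dev / "sriov_numvfs"   # number of VFs currently enabled
            if not total.exists():
                return "no SR-IOV capability exposed"
            return f"SR-IOV capable, {numvfs.read_text().strip()}/{total.read_text().strip()} VFs enabled"

        for dev in sorted(PCI_DEVICES.iterdir()):
            if (dev / "class").read_text().strip().startswith(DISPLAY_CLASS_PREFIX):
                print(f"{dev.name}: {sriov_status(dev)}")
        ```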

      • Eager Eagle@lemmy.world · 5 points · 23 hours ago (edited)

        Yeah, I've been wanting a card like that to run local models ever since 2020, when I got a 3080. Back then I'd have spent a bit more to get one with the same performance but 20GB or so of VRAM.

        Nowadays, if they released an RX 9070 with at least 24GB of VRAM, priced between the 16GB model and an RTX 5080 (also 16GB), that would be neat.

        • AtHeartEngineer@lemmy.world · 3 points · 20 hours ago

          Same, I've got a modded 2080 Ti with 22GB of VRAM running DeepSeek 32B and it's great… But it's an old card, and with it being modded I don't know what its life expectancy is. (Rough math for why a 32B model fits in 22GB is sketched below.)
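
          For anyone curious how a 32B model squeezes into 22GB, here is a rough back-of-the-envelope sketch; the 20% overhead allowance for KV cache and runtime buffers is an assumption, and real usage depends on context length, quantization format, and backend:

          ```python
          def vram_estimate_gib(params_billion: float, bits_per_weight: float,
                                overhead_fraction: float = 0.2) -> float:
              """Weights plus a flat overhead allowance for KV cache, activations,
              and runtime buffers. Ballpark only."""
              weight_bytes = params_billion * 1e9 * bits_per_weight / 8
              return weight_bytes * (1 + overhead_fraction) / 2**30

          print(f"32B @ 4-bit: ~{vram_estimate_gib(32, 4):.1f} GiB")   # ~17.9 GiB -> fits a 22GB card
          print(f"32B @ fp16:  ~{vram_estimate_gib(32, 16):.1f} GiB")  # ~71.5 GiB -> far out of reach
          print(f"70B @ 4-bit: ~{vram_estimate_gib(70, 4):.1f} GiB")   # ~39.1 GiB -> why 40+GB cards are tempting
          ```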

    • Diplomjodler@lemmy.world · 24 points · 1 day ago

      I don't get why people are so keen on handing over such a huge amount of money just for bragging rights. The midrange is perfectly fine for playing any game these days. Those top-end GPUs get an absolutely inordinate amount of attention relative to their relevance for most people.

      • warm@kbin.earth · 4 points · 19 hours ago

        Another problem is how big games are made now: they're made (relatively) quickly and perform terribly. GPUs from two or three generations ago should be running beautiful games at beautiful framerates; instead those games run like ass. Nvidia wants developers to rely on their DLSS shit so people have a reason to keep buying new GPUs every cycle. So people feel like they need to upgrade when they really don't; instead they should stop buying these poorly made games.

      • filister@lemmy.world · 10 up / 1 down · 1 day ago

        The problem is that NVIDIA is consistently gimping the midrange, making it a very unattractive proposition.

      • poleslav@lemmy.world · 5 points · 1 day ago

        As someone who does VR in flight sims in one of the least optimized games out there (DCS), I can see the allure. Aside from that one niche, though, I can't think of many uses for a 90-series card.

      • sugar_in_your_tea@sh.itjust.works · 3 points · 1 day ago

        Yup, my 6650 XT is perfectly fine, and my SO has a 6700 XT. Both are way more than we need, and we paid $200-300 for them on sale. Why get the top end? Mine is roughly equivalent to current consoles, so I doubt I’m missing out on much except RTX, but I also don’t care enough about RTX to 10x my GPU cost.

    • latenightnoir@lemmy.world · 4 points · 1 day ago (edited)

      This was my exact thinking the moment I realised I, yet again, needed a GPU upgrade (thanks, Unreal 5…), which is why I seared my soul and dished out for a 4080 Super, in the hope that I'll be covered for at least a decade. The 40 series at least still seems to be built mainly for pretty pictures.

      Genuinely not worth paying attention to this nonsense. Maybe - MAYBE - AMD will pull a Comrade and shift its full focus to creating genuinely good and progressively better GPUs, meant for friggin' graphics processing and not this "AI" tumor. But that's a big-ass "maybe."

    • halcyoncmdr@lemmy.world · 39 points · 1 day ago

      Nvidia doesn't give a shit about gamers anymore. The incremental improvements are a side effect. This is why they're now so focused on software enhancements like DLSS instead: it gives them the marketing numbers without having to make hardware improvements for gaming.

      Their bread and butter now is AI and large-scale machine learning, where businesses are buying thousands of cards at a time. It's also why they're so stingy with VRAM on their cards: large amounts of VRAM aren't as necessary for most workloads outside gaming now, and it saves them millions of dollars every generation.

      • TheGrandNagus@lemmy.world · 6 points · 1 day ago (edited)

        You're right; however, I'd say that Nvidia has always been stingy with VRAM. The 1060 had 6GB while the RX 480 had 8GB, for example; the 970 effectively had 3.5GB of usable VRAM while the R9 390 had 8GB; and there are similar examples going back a long way.

        It has gotten pretty bad recently, worse than normal. AI is also very VRAM-intensive (even more so than gaming), so I imagine they've been diverting those memory chips to their AI/enterprise cards.

        • ReallyActuallyFrankenstein@lemmynsfw.com · 1 point · 4 hours ago

          Well, Nvidia seemingly forgot to price-gouge on VRAM with the 3060, and it had a 12GB standard version for a while. That should have been the low-end standard, with 24GB for mid-range and 32GB for high-end, but they've since adjusted.

          • TheGrandNagus@lemmy.world · 3 points · 21 minutes ago (edited)

            They only did that because they were forced into it by AMD's VRAM choices and its unexpectedly great RDNA2 architecture.

            Because of the 3060's 192-bit memory bus, it essentially had to ship with either 6GB or 12GB of VRAM, and it would have looked stupid next to AMD with only 6GB, so they changed it to 12GB fairly late in development.

            It led to the bizarre situation of the 3060 Ti (based on the 3070 die) having less VRAM at 8GB.

            So yeah, it's less that they didn't want to price-gouge and more that AMD was offering 12GB on similarly priced cards that were also much faster, and Nvidia knew that 6GB would look like a joke in comparison. (The bus-width arithmetic is sketched below.)
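
            To make the bus-width arithmetic concrete: each GDDR6 chip sits on a 32-bit channel, and chips of that era came in 1GB or 2GB densities, so the bus width alone fixes which total capacities are possible. A small sketch of that calculation (the bus widths used are the publicly known figures for those cards):

            ```python
            def vram_options_gb(bus_width_bits: int, chip_capacities_gb=(1, 2)) -> list[int]:
                """Each GDDR6 chip occupies a 32-bit channel, so chip count =
                bus width / 32 and total VRAM = chip count * per-chip capacity."""
                chips = bus_width_bits // 32
                return [chips * cap for cap in chip_capacities_gb]

            print("192-bit bus (RTX 3060):   ", vram_options_gb(192))  # [6, 12] GB
            print("256-bit bus (RTX 3060 Ti):", vram_options_gb(256))  # [8, 16] GB
            ```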

      • TheDemonBuer@lemmy.world · 10 up / 2 down · 1 day ago

        Nvidia doesn't give a shit about gamers anymore… Their bread and butter now is AI and large-scale machine learning, where businesses are buying thousands of cards at a time.

        I’m just quoting this for emphasis.