• NuXCOM_90Percent@lemmy.zip · 2 days ago

    The issue is that nVidia are increasingly marketing their consumer-grade GPUs to “prosumer” users. Whether that is small research groups working with “AI” or people farming the latest memecoin or… the other things that need REALLY REALLY high-bandwidth linear algebra, and let’s move on.

    Whereas AMD are actually still targeting that consumer market. I think it was the nVidia 40xx generation where their consumer cards had like no memory at all and AMD were pumping out 16 GB on their cheap(-ish) models? My brain can only remember card generations while I am actively shopping and… yeah.

    And yeah. I would LOVE an AMD card with 32 or even 64 GB of even slower memory. But games are still going to target nVidia because people keep buying it, and that means you just won’t have much use for anything beyond the 8 (or apparently now 16) GB that nVidia are going to let you buy. At which point… why waste money?

    As for the prosumer and enterprise space? nVidia… have a long history of being assholes, and previous GN videos have talked about the behind-the-scenes pressure they allegedly apply to system integrators and the like. And I will leave that there for Reasons.

    But yes, many mid-tier and even high-tier companies could benefit from just buying AMD cards, and there is very much a market for “high end” AMD cards… it is just that they would have too few customers to make it worthwhile.

    • brucethemoose@lemmy.world · 2 days ago

      I dunno what you’re talking about, but all AMD has to do is this:

      • Pick up the phone.

      • Tell their OEMs VRAM restrictions are lifted.

      • Put it down.

      …That’s it.

      They’d make separate SKUs with double the VRAM. AMD doesn’t have to waste a cent.

      • NuXCOM_90Percent@lemmy.zip · 2 days ago

        That… really isn’t how things work at all.

        But also? That extra VRAM costs money (especially if you want it to be high performance). And you more or less need to produce things in bulk for it to be viable. So if AMD makes a bunch of “AI Accelerators” and nobody buys them because they would rather buy nVidia (which the video talked about)? It is just a massive flop AND it means that AMD is no longer “the best bang for your buck” option and is directly competing with nVidia in the mindspace of consumers.

        That said? I could actually see them cannibalizing what little market share Intel got. The Intel GPUs are… moving on. But they have support for codecs that video editors and transcoders REALLY benefit from, and a not insignificant part of the Influencer and Editor space actually have those in their editing or capture PCs. Tweaking the silicon to better support those use cases and selling higher-memory versions of the Radeons would potentially be a “productivity” space that can justify the added cost and have knock-ons from people who just want to have even more chrome tabs open while they play fortnite. And… it might lead to the more CS side of the ML world actually realizing it isn’t that hard to run pytorch with an AMD card.
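
        (To be concrete about that last point: a minimal sketch of what “running pytorch with an AMD card” looks like, assuming a ROCm build of PyTorch is installed and a supported Radeon; the card name in the comments is just an example.)

        ```python
        # Minimal sketch: PyTorch on an AMD GPU, assuming a ROCm build of torch
        # is installed (pick the ROCm wheel from the pytorch.org install selector).
        # ROCm builds reuse the "cuda" device string via HIP, so most
        # CUDA-targeted code runs unchanged.
        import torch

        print(torch.cuda.is_available())       # True on a supported Radeon with ROCm
        print(torch.cuda.get_device_name(0))   # e.g. "AMD Radeon RX 7900 XTX"
        print(torch.version.hip)                # a version string on ROCm builds, None on CUDA builds

        x = torch.randn(4096, 4096, device="cuda")
        y = x @ x                               # the matmul runs on the Radeon via HIP/rocBLAS
        print(y.device)
        ```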

        • brucethemoose@lemmy.world · 2 days ago (edited)

          Yes, it is:

          https://www.amd.com/en/products/graphics/workstations/radeon-pro/w7900.html

          https://dramexchange.com/

          16Gb GDDR6 ICs are averaging $10 each. The clamshell PCB is already made. So the cost of doubling up the VRAM in a clamshell-configuration 7900 XTX (like the W7900) is like $100 at most, on top of this being a separate memory supply from the HBM the datacenter accelerators use. But AMD literally tells its OEMs they are not allowed to sell such clamshell configs of their cards, like they have in the past.
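
          (Rough arithmetic behind that figure, assuming the stock 7900 XTX layout of one 16Gb GDDR6 IC per 32-bit channel and the DRAMeXchange spot price above; a sketch, not a real BOM.)

          ```python
          # Back-of-the-envelope BOM delta for a clamshell 48GB 7900 XTX (rough figures).
          bus_width_bits = 384    # 7900 XTX / W7900 memory bus
          ic_width_bits = 32      # one GDDR6 IC per 32-bit channel
          ic_capacity_gb = 2      # a 16Gb GDDR6 IC holds 2GB
          ic_price_usd = 10       # spot average per IC, per DRAMeXchange above

          channels = bus_width_bits // ic_width_bits    # 12 channels
          stock_vram_gb = channels * ic_capacity_gb     # 24GB, one IC per channel
          clamshell_vram_gb = 2 * stock_vram_gb         # 48GB, ICs on both sides of the PCB
          extra_cost_usd = channels * ic_price_usd      # ~$120 of extra GDDR6 at spot price

          print(f"{stock_vram_gb}GB -> {clamshell_vram_gb}GB for about ${extra_cost_usd} in memory")
          ```

          Which lands right around that ~$100 figure, before any volume pricing.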

          The ostensible business reason is to justify the actual ‘workstation’ cards, which are laughing stocks in that space at those prices.

          Hence, AMD is left scratching their heads wondering why no one is developing for the MI325X when devs have literally zero incentive to buy AMD cards to test on.

          > So if AMD makes a bunch of “AI Accelerators” and nobody buys them because they would rather buy nVidia (which the video talked about)?

          Well, seeing how backordered the Strix Halo Framework Desktop is (even with its relatively mediocre performance), I think this isn’t a big concern.

          There is a huge market dying to get out from under Nvidia here. AMD is barely starting to address this with a 32GB model of the 9000 series, but it’s too little, too late. That’s not really worth the trouble over a 4090 or 5090, but that calculus changes if the config could be 48GB on a single card like the 7900.

          • NuXCOM_90Percent@lemmy.zip · 2 days ago

            Yes… for an individual, those are the prices (if only there were some 3-hour YouTube video about adding more memory to cards…).

            The issue is that even a downstream OEM isn’t buying 100 dollars’ worth of VRAM. They need to buy it in bulk. And then they need to retool their factories to support that configuration. And if they can’t sell enough of those units to justify the retooling and the purchases?

            I mean… look at EVGA

            And then you have the marketing/brand implications which I already spoke to.

            • brucethemoose@lemmy.world · 1 day ago (edited)

              That’s what I’m saying: there is no retooling. Some of AMD’s existing OEMs are already making W7900s.

              Here’s the bulk of the process on the OEM side, other than maybe leaving an ECC chip off:

              • Take finished W7900.

              • Change ID in firmware (so the CAD drivers don’t recognize it)

              • Apply a different sticker, put it in a different box

              • Do the paperwork of making a new SKU, like they make for overclocked cards

              That’s not that expensive. If it doesn’t sell a lot, well, not much skin off their back. And it would make AMD boatloads by seeding development for their server cards (which the workstation cards do not do, because they are utterly pointless at those prices).

              This is all kind of a moot point though, as the 7900 series is basically sunsetted, and AMD doesn’t have a 384-bit consumer card anymore (nor a GDDR7 one to use the new, huge GDDR7 ICs).