• just_another_person@lemmy.world · 2 days ago

    My mind is still blown that people are so interested in spending 2x the cost of the entire machine they're playing on, AND a hefty power bill, to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lessons a decade ago.

    • Eager Eagle@lemmy.world · 2 days ago

      they pay because AMD (or anyone else, for that matter) has no product that competes with a 5080 or 5090

      • Naz@sh.itjust.works · 1 day ago

        I have overclocked my AMD 7900XTX as far as it will go on air alone.

        Undervolted every step on the frequency curve, cranked up the power, 100% fan duty cycles.

        At its absolute best, it's competitive with, or trades blows with, the 4090D, and is 6% slower than the RTX 4090 Founders Edition (the slowest of the stock 4090 lineup).

        The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn’t shown anything new.

        AMD needs a 5090-killer. Dual socket or whatever monstrosity that pulls 800W, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, including raytraced ones. Then we'll see some serious price cuts and competition.

      • just_another_person@lemmy.world · 2 days ago

        Because they choose not to go full idiot, though. They could make their top-line cards compete if they crammed enough into the pipeline and required a dedicated PSU, but that's not where their product line is meant to go. That's why it's smart.

        For reference: AMD has the most deployed GPUs on the planet right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be making a product that churns out results at the cost of everything else, but making one that's cost-effective and efficient. Nvidia fails at that on every level.

        • Eager Eagle@lemmy.world · 2 days ago

          this openai partnership really stands out, because the server world is dominated by nvidia, even more than in consumer cards.

          • SheeEttin@lemmy.zip · 2 days ago

            Yup. You want a server? Dell just plain doesn't offer anything but Nvidia cards. You want to build your own? The GPGPU stuff like ZLUDA is brand new and not really supported by anyone. If you want to participate in the development community, you buy Nvidia and use CUDA.

            • qupada@fedia.io · 2 days ago

              Fortunately, even that tide is shifting.

              I've been talking to Dell about it recently; they've just announced new servers (releasing later this year) which can take either Nvidia's B300 or AMD's MI355X GPUs. Available in a hilarious 19" 10RU air-cooled form factor (XE9685), or ORv3 3OU water-cooled (XE9685L).

              It’s the first time they’ve offered a system using both CPU and GPU from AMD - previously they had some Intel CPU / AMD GPU options, and AMD CPU / Nvidia GPU, but never before AMD / AMD.

              With AMD promising release day support for PyTorch and other popular programming libraries, we’re also part-way there on software. I’m not going to pretend like needing CUDA isn’t still a massive hump in the road, but “everyone uses CUDA” <-> “everyone needs CUDA” is one hell of a chicken-and-egg problem which isn’t getting solved overnight.

              Realistically, facing that kind of uphill battle, AMD is just going to have to compete on price - they're quoting a 40% performance-per-dollar improvement over Nvidia for these upcoming GPUs, so perhaps they are - and win hearts and minds with rock-solid driver/software support, so that people who do have the option (i.e. in-house code, not 3rd-party software) look at writing it with not-CUDA.
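
              To illustrate what "write it with not-CUDA" can look like in practice - just a sketch, assuming a ROCm build of PyTorch (where the torch.cuda API is backed by HIP) - the same high-level code runs unchanged on either vendor's hardware:

                # minimal device-agnostic PyTorch sketch (assumes a ROCm or CUDA build of PyTorch)
                import torch

                # on ROCm builds, torch.cuda maps to HIP, so this also picks up AMD GPUs
                device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
                print("running on:", torch.cuda.get_device_name(0) if device.type == "cuda" else "cpu")

                x = torch.randn(4096, 4096, device=device)
                w = torch.randn(4096, 4096, device=device)
                y = (x @ w).relu()   # matmul + activation, dispatched to cuBLAS or rocBLAS underneath
                print(y.mean().item())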

              To note, this is the 3rd generation of the MI3xx series (MI300, MI325, now MI350/355). I think it might be the first one to make the market splash that AMD has been hoping for.

              • felsiq@lemmy.zip · 1 day ago

                AMD's also apparently unifying their server and consumer GPU architectures for RDNA5/UDNA, iirc, which I'm really hoping helps with this too.

              • SheeEttin@lemmy.zip · 1 day ago

                I know Dell has been doing a lot of AMD CPUs recently, and those have definitely been beating Intel, so hopefully this continues. But I'll believe it when I see it; these things rarely pan out in terms of price/performance and support.

            • Eager Eagle@lemmy.world · 2 days ago

              yeah, I helped raise hw requirements for two servers recently; an alternative to Nvidia wasn't even on the table

          • just_another_person@lemmy.world · 2 days ago

            Actually… not true. Nvidia recently became bigger in the DC because of their terrible inference cards being bought up, but AMD overtook Intel on chips with all major cloud platforms last year, and their Xilinx chips are slowly overtaking sales of regular CPUs for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGA is the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing. Nvidia doesn't have a competing product.

        • ctrl_alt_esc@lemmy.ml · 1 day ago

          Unfortunately, this partnership with OpenAI means they’ve sided with evil and I won’t spend a cent on their products anymore.

            • ctrl_alt_esc@lemmy.ml · 14 hours ago

              Oh, so you support grifting off the public domain? Maybe grow some balls instead of taking the status quo for granted.

      • Cyberwolf@feddit.org · 23 hours ago

        What do you even need those graphics cards for?

        Even the best games don't require those, and if they did, I wouldn't be interested in them, especially if it's an online game.

        Probably only a couple people would be playing said game with me.

    • RazgrizOne@piefed.zip · 2 days ago

      Once the 9070 dropped, all arguments for Nvidia stopped being worthy of consideration outside of very niche/fringe needs.

      • CheeseNoodle@lemmy.world · 1 day ago

        Got my 9070XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC costs less than a 5090.

        • RazgrizOne@piefed.zip · 1 day ago

          Yeah, I got a 9070 + 9800X3D for around $1100 all-in. Couldn't be happier with the performance. Expedition 33 runs max settings at 3440x1440 at 80-90 fps.

          • FreedomAdvocate@lemmy.net.au · 15 hours ago

            But your performance isn't even close to that of a 5090…

            80-90 fps @ 1440 isn't great. That's like last-gen mid-tier Nvidia GPU performance.

            • RazgrizOne@piefed.zip · 13 hours ago

              Not 1440 like you're thinking. 3440x1440 is about 34% more pixels to render than standard 2560x1440. It's a WS. And yes, at max settings 80-90 fps is pretty damn good. It regularly goes over 100 in less busy environments.
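
              Quick back-of-the-envelope (plain Python, nothing vendor-specific):

                uw = 3440 * 1440    # 4,953,600 pixels (ultrawide 1440)
                qhd = 2560 * 1440   # 3,686,400 pixels (standard 16:9 1440p)
                print(uw / qhd)     # ~1.34 -> roughly 34% more pixels per frame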

              And yeah it’s not matching a 5090, a graphics card that costs more than 3x mine and sure as hell isn’t giving 3x the performance.

              You're moving the goalposts. My point is that for 1/4 the cost you're getting 60-80% of the performance of overpriced, massive, power-hungry Nvidia cards (depending on which model you compare to). Bang for buck, AMD smokes Nvidia. It's not even close.

              Unless cost isn't a barrier for you, or you have very specific needs, they make no sense to buy. If you've got disposable income for days, then fuck it, buy away.

              • FreedomAdvocate@lemmy.net.au · 1 hour ago

                I assume people mean 3440x1440 when they say 1440 as it’s way more common than 2560x1440.

                Your card is comparable to a 5070, which is basically the same price as yours. There's no doubt the 5080 and 5090 are disappointing in their performance compared to these mid-high cards, but your card can't compete with them, and Nvidia offers a comparable card at the same price point as AMD's best card.

                Also, the AMD card uses more power than the Nvidia equivalent (9070 XT vs 5070).

                • RazgrizOne@piefed.zip · 51 minutes ago

                  > I assume people mean 3440x1440 when they say 1440 as it's way more common than 2560x1440.

                  Most people do not use WS, as evidenced by the mixed-bag support it gets. 1440 monitors are by default understood to be 2560x1440, since 16:9 is still considered the "default" aspect ratio by the vast majority of businesses and people alike. You may operate as if most people using 1440+ are on WS, but that's a very atypical assumption.

                  Raytracing, sure, but otherwise the 9070 is actually better than the 5070 in many respects. So you're paying a comparable price for raytracing and a Windows dependency; if that's important to you, then go right ahead. Ultimately, though, my point is that there's no point in buying the insanely overpriced Nvidia offerings when you have excellent AMD offerings for a fraction of the price that don't come with all sorts of little pitfalls/compromises. The Nvidia headaches are only worth it for performance, and unless you 3-4x your investment you're not getting more of it. So the 5070 is moot.

                  I'm not sure what you're comparing at the end, unless you meant a 9070XT, which I don't use/have and wasn't comparing.

                • RazgrizOne@piefed.zip · 13 hours ago

                  Not all of us can afford to spend $3000 for a noticeable but still not massive performance bump over a $700 option. I don't really understand how this is so difficult to understand lol. You also have to increase the rest of your machine cost for things like your PSU, because the draw on the 5xxx series is cracked out. Motherboard, CPU, all of that has to be cranked up unless you want bottlenecks. Don't forget your high-end 165 Hz monitor unless you want to waste frames/colors. And are we really going to pretend that after 100 fps the difference is that big of a deal?

                  Going Nvidia also means that, unless you want to be fighting your machine all the time, you need to keep a Windows partition on your computer. Have fun with that.

                  At the end of the day, buy what you want, dude, but I'm pulling down what I said above on a machine that cost about $1700. Do with that what you will.

                    • Glog78@digitalcourage.social · 13 hours ago

                      @RazgrizOne @FreedomAdvocate The reason I decided on AMD after being team green nearly all my life (aka >20 years): I feel like AI frame generation and upscaling are anti-consumer, because they hide the real performance behind non-reproducible image generation. And if you look closely… that is how Nvidia gets its performance lead over AMD.

    • Static_Rocket@lemmy.world · 2 days ago

      Well, to be fair, the 10 series was actually an impressive improvement over what was available at the time. I've since switched to AMD for better SW support, and I know the improvements have dwindled since then.

      • just_another_person@lemmy.world · 2 days ago

        AMD is at least playing the smart game with their hardware releases, with generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does. Hell, even Nvidia's latest Jetson lines are just recooked versions from years ago.

      • yeehaw@lemmy.ca · 1 day ago

        The best part is, for me, ray tracing looks great. When I’m standing there and slowly looking around.

        When I'm running and gunning and shit's exploding, I don't think the human eye is even capable of comprehending the difference between raster and ray tracing at that point.

        • FreedomAdvocate@lemmy.net.au · 15 hours ago

          It absolutely is, because ray tracing isn't just about how precise or good the reflections/shadows look; it's also about reflecting, and getting shadows from, things that are outside of your field of view. That's the biggest difference.

          One of the first "holy shit!" moments for me was playing Doom, I think it was, and walking down a corridor and being able to tell there were enemies around the corner by seeing their reflection on the opposite wall. That's never been possible before, and it's only possible thanks to raytracing. Same with being able to see shadows from enemies that are behind you, off-screen to the side.

        • Chozo@fedia.io · 1 day ago

          Yeah, that’s what’s always bothered me about the drive for the highest-fidelity graphics possible. In motion, those details are only visible for a frame or two in most cases.

          For instance, some of the PC mods I’ve seen for Cyberpunk 2077 look absolutely gorgeous… in screenshots. But once you get into a car and start driving or get into combat, it looks nearly indistinguishable from what I see playing the vanilla game on my PS5.

    • tormeh@discuss.tchncs.de · 2 days ago

      If you're on Windows, it's hard to recommend anything else. Nvidia has DLSS supported in basically every game. For recent games there's the new transformer DLSS. Add to that ray reconstruction, superior ray tracing, and a steady stream of new features. That's the state of the art, and if you want it you've gotta pay Nvidia. AMD is about 4 years behind Nvidia in terms of features. Intel is not much better. The people who really care about advancements in graphics and derive joy from that are all going to buy Nvidia, because there's no competition.

      • just_another_person@lemmy.world · 2 days ago

        First, DLSS is supported on Linux.

        Second, DLSS is kinda bullshit. The article goes into details that are fairly accurate.

        Lastly, AMD is at parity with Nvidia on features. You can see my other comments, but AMD's goal isn't selling cards to gamers, especially not cards that require an entire dedicated PSU to power them.

        • FreedomAdvocate@lemmy.net.au · 15 hours ago

          Nvidia cards don’t require their own dedicated PSU, what on earth are you talking about?

          Also DLSS is not “kinda bullshit”. It’s one of the single biggest innovations in the gaming industry in the last 20 years.

          • just_another_person@lemmy.world · 2 days ago

            No. AMD. See my other comments in this thread. Though they are in every major gaming console, the bulk of AMD sales are aimed at the datacenter.