• Logical@lemmy.world · 1 day ago

    Glad that I recently bought a bunch of storage so that I’ll be covered for a good amount of time.

  • Suavevillain@lemmy.world · 2 days ago

    AI has taken more things since its big push to be adopted in the public sector.

    Clean Air

    Water

    Fair electricity bills

    Ram

    GPUs

    SSDs

    Jobs

    Other people’s art and writing.

    There is no benefit to this stuff. It is just grifting.

  • mlg@lemmy.world · 2 days ago

    AFAIK this has already been a problem: you can find Samsung M.2 SSDs for cheaper than Samsung SATA SSDs at the same capacity, because their cloud customers have all flown past classic SATA/SAS to NVMe U.2 and U.3, which is much more similar to M.2 since both speak NVMe.

    I was planning on adding a big SSD array to my server, which has a bunch of external 2.5" SAS slots, but it ended up being cheaper and faster to buy a 4-slot M.2 PCIe card and 4 M.2 drives instead.

    Putting it in an x16 PCIe slot gives me 4 lanes per drive with bifurcation, which gets me the advertised maximum possible speed on PCIe 4.
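    For reference, the lane math works out roughly like this (a sketch; the 16 GT/s rate and 128b/130b encoding are the published PCIe 4.0 figures, and real-world throughput lands a bit lower due to protocol overhead):

```python
# Rough sketch of the bifurcation math above. PCIe 4.0 signals at
# 16 GT/s per lane with 128b/130b line coding.
GT_PER_LANE = 16e9        # transfers per second, per PCIe 4.0 lane
ENCODING = 128 / 130      # 128b/130b line-coding efficiency

per_lane_gbs = GT_PER_LANE * ENCODING / 8 / 1e9   # ~1.97 GB/s per lane
lanes_per_drive = 4                               # x16 slot bifurcated 4x4
per_drive_gbs = per_lane_gbs * lanes_per_drive    # ~7.9 GB/s per drive

print(f"{per_drive_gbs:.1f} GB/s per drive")
```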

    Whether the RAM surge will affect chip production capacity is the real issue. It seems all 3 OEMs could effectively reduce capacity for all other components after sinking billions of dollars into HBM. It wouldn’t just be SSDs; anything that relies on the same supply chain could be heavily affected.

    • iglou@programming.dev · 23 hours ago

      Exactly this. Micron ended their consumer RAM. Samsung here is just stopping production of something that is arguably outdated and has a perfectly fine, already more available, and usually cheaper or equivalent modern replacement.

    • HappySkullsplitter@lemmy.world · 23 hours ago

      I wonder what changed; prices were being driven down on SSDs for a while there.

      Put a 1TB 850 Evo in our PS4 years ago for a pretty reasonable price. Kind of expected prices to keep falling back then.

  • Bluewing@lemmy.world · 24 hours ago

    I got an old Nitro 5 with a rickety old 500 GB hard drive. Would a Crucial BX500 1TB 3D NAND SATA 2.5-inch internal SSD be a good Christmas present for it?

    Probably should get something while prices are somewhat more reasonable.

  • Randelung@lemmy.world · 2 days ago

    This bubble is going to become the entire market, isn’t it. Until it becomes too big to fail because 80% of the workforce is tied up in it. Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.

      • Khrux@ttrpg.network · 2 days ago

        Compared to crypto and NFTs, there is at least something in this mix, not that I could identify it.

        I’ve become increasingly comfortable with LLM usage, to the point that myself from last year would hate me. Compared to projects I used to do where I’d be deep into Google, Reddit and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.

        I’m getting into home labs, and currently everything I have runs on ass-old laptops and phones, but I do daydream of the day where I can run an ethically and sustainably trained LLM myself that compares to current GPT-5, because as much as I hate to say it, it’s really useful to my life to have a sometimes incorrect but overall knowledgeable voice that’s perpetually ready to support me.

        The irony is that I’ll never build a server that can run a local LLM due to the price hikes caused by the technology in the first place.

        • raspberriesareyummy@lemmy.world · 2 days ago

          I’ve become increasingly comfortable with LLM usage, to the point that myself from last year would hate me. Compared to projects I used to do where I’d be deep into Google, Reddit and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.

          Please hate yourself, reflect on that and walk back from contributing to destroying the environment by furthering widespread adoption of this shitty technology. The only reason you seem to get “useful answers” is because of search engine and website enshittification. What you are getting is still tons worse than a good web research 10 years ago.

          Basically you were taught to enjoy rancid butter because all restaurants around you had started tasting like shit first, then someone opened a rancid butter shop.

          • Khrux@ttrpg.network · 2 days ago

            I do agree entirely. If I could use the internet of 2015 I would, but I can’t do so in a practical way that isn’t much more tedious than asking an LLM.

            My options are the least rancid butter of the rancid butter restaurants or I churn my own. I’d love to churn my own and daydream of it, but I am busy, and can barely manage to die on every other hill I’ve chosen.

            • dil@lemmy.zip · 2 days ago

              Web search isn’t magically going back to how it was, and it’s not just search engines, it’s every mf trying to take advantage of SEO and push their content to the top. Search is going to get worse every year; AI did speed it up by making a bunch of AI images pop up whenever you search for an image.

            • raspberriesareyummy@lemmy.world · 2 days ago

              Problem is that the widespread use of (and thereby provision of your data to) LLMs contributes to the rise of totalitarian regimes, wage slavery and the destruction of our planet’s ecosystem. Not a single problem in any of our lives is important enough to justify this. And convenience, because we are too lazy to think for ourselves or to do some longer (more effort) web research, is definitely not a good excuse to be complicit in murder, torture and ecoterrorism.

              • Koarnine@pawb.social · 1 day ago

                I agree, except for the fact that it’s unavoidable.

                It’s horrific, but it’s inescapable. The problem is not going away, and while you’re refusing to use LLMs to accelerate your progress, the opposition isn’t.

                Don’t get me wrong, anyone who blindly believes sycophantic LLM garbage is a fool.

                It’s taken 4 years to overcome my LLM moral OCD, and it’s only because I need to start working. In a world where every company forces AI down your throat, there are many who simply have no choice if they want to compete.

                Also, I’m kinda glad I can spend more of my useful energy working towards my goals rather than battling the exact minutiae without any sort of guide.

                • raspberriesareyummy@lemmy.world · 1 day ago

                  The thing is: LLMs do not accelerate the progress of proper software development. Your processes have to be truly broken to be able to experience a net gain from using LLMs. It enables shitty coders to output pull requests that look like they were written by someone competent, and thereby effectively waste the time of skilled developers who review such pull requests out of respect for the contributor, only to find out it is utter garbage.

    • Khrux@ttrpg.network · 2 days ago

      I heard a theory (that I don’t believe, but still) that Deepseek is only competitive to lock the USA into a false AI race.

    • Ensign_Crab@lemmy.world · 2 days ago

      Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.

      After the bailouts at the expense of the poor, of course.

    • humanspiral@lemmy.ca · 2 days ago

      it becomes too big to fail because 80% of the workforce is tied up in it

      In 2008, the banking sector and auto industry needed bailouts for the investor/financial class. Certainly there was no need to lay off core banking employees if government support was the last resort to keep the doors open AND gain a controlling stake over future banking profitability in a hopefully sustainable (low risk, in addition to low climate/global destruction) fashion. The auto bailout did have harsher terms than the banking bailout, and recessions definitely harm the sector, but the bailouts were definitely focused on the executives/shareholders who have access to political friendships that result in gifts, rather than truly needed lifelines or a wider redistribution of benefits from sustainable business.

      The point is that the workforce is a “talking point” with no actual relevance to bailouts/too big to fail. That entire stock market wealth is concentrated in the sector, and that we all have to give them the rest of our money (and militarism-backed surveillance freedom) or “China will win” at the only sector we pretend to have a competitive chance in, is why our establishment needs another “too big to fail” moment. We’ve started QE ahead of the crash this time.

      The workforce in the AI sector is relatively small. Big construction, but relatively low operations employment. It displaces other hiring too.

  • brucethemoose@lemmy.world · 2 days ago

    Aside: WTF are they using SSDs for?

    LLM inference in the cloud is basically only done in VRAM. Rarely, stale K/V cache is cached in RAM, but new attention architectures should minimize that. Large-scale training, contrary to popular belief, is a pretty rare event that most data centers and businesses are incapable of.

    …So what do they do with so much flash storage!? Is it literally just FOMO server buying?

    • T156@lemmy.world · 2 days ago

      Storage. There aren’t enough hard drives, so datacentres are also buying up SSDs, since it’s needed to store training data.

      • brucethemoose@lemmy.world · 2 days ago

        since it’s needed to store training data.

        Again, I don’t buy this. The training data isn’t actually that big, nor is training done on such a huge scale so frequently.

        • finitebanjo@lemmy.world · 2 days ago (edited)

          As we approach the theoretical error rate limit for LLMs, as argued in the 2020 research paper by OpenAI and corrected by the 2022 paper by DeepMind, the required training and power costs rise to infinity.

          In addition to that, the companies might have many different nearly identical datasets to try to achieve different outcomes.

          Things like books and Wikipedia pages aren’t that bad; Wikipedia itself compressed is only 25GB, and maybe a few hundred petabytes could store most of these items. But images and videos are also valid training data, and that’s much larger, and then there is readable code. On top of that, all user inputs have to be stored to reference them again later if the chatbot offers that service.
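          The scale argument can be put into back-of-envelope numbers. Everything below except the 25GB Wikipedia figure is an assumed, illustrative size, not a measurement:

```python
# Back-of-envelope dataset sizing. Only the Wikipedia figure comes from
# the comment above; the other sizes are illustrative assumptions.
GB, TB, PB = 1e9, 1e12, 1e15

wikipedia_compressed = 25 * GB    # cited above
books_corpus = 1 * TB             # assumed: a large digitized book corpus
images = 5 * PB                   # assumed: billions of images at ~1 MB each
video = 100 * PB                  # assumed: video dwarfs everything else

text_total = wikipedia_compressed + books_corpus
media_total = images + video
print(f"media is roughly {media_total / text_total:,.0f}x the text data")
```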

    • Urga@lemmynsfw.com · 2 days ago

      The lines used to produce VRAM also make SSD NAND flash, so they make fewer SSDs to make more VRAM.

  • Kyden Fumofly@lemmy.world · 2 days ago

    The leak comes after another report detailed that Samsung has raised DDR5 memory prices by up to 60%.

    MF… And why are they winding down SSD production this time? Last time was 2 years ago, because SSD prices were low and they wanted to raise them (which happened).

        • GreenKnight23@lemmy.world · 1 day ago

          we all know as soon as big bad chip daddy comes back with a big discount everyone not in this thread (and even some that are) will spread their cheeks and beg for more.

          humans are dumb greedy little assholes that have zero willpower. that’s why it’s so easy to manipulate us.

  • lechekaflan@lemmy.world · 2 days ago (edited)

    Yet another chapter in the fucking AI craze started up by them fucking techbros.

    Also, someone forgot that in some places in the world, people have to use older PCs with SATA drives. And until their discontinuation announcements, Crucial and Samsung SATA drives were several tiers better than, say, those cheapo Ramsta drives.

    • Psythik@lemmy.world · 2 days ago

      Discontinuing outdated tech has nothing to do with AI. SATA SSDs need to be retired. NVMe is superior and widely available.

      • The_Decryptor@aussie.zone · 2 days ago

        Especially since you can get M.2 to SATA adapters, so people stuck with SATA only motherboards can still upgrade their storage.

        Literally the same deal when companies stopped making IDE drives, people just used SATA to IDE adapters instead.

        • FrederikNJS@lemmy.zip · 2 days ago

          Do you know of any M.2 to SATA adapters that support NVMe? Or are these only for SATA M.2s?

          • The_Decryptor@aussie.zone · 2 days ago

            Man, it sure would be helpful for my argument if I could.

            I went back and checked the ones I was looking at; very helpful fine print stating “not for NVMe SSDs”, so they all only work with SATA M.2 SSDs. Hell of a letdown.

            • Amju Wolf@pawb.social · 2 days ago

              Despite how similar the interface is, the protocol is completely different. NVMe is basically just PCIe, so adapting it so that it runs “under” SATA would be difficult, if not almost impossible. And most definitely not worth the extra price.

              • The_Decryptor@aussie.zone · 1 day ago

                Well, no, you can translate, but it seems nobody has actually made a product that does so.

                e.g. those M.2 SSD to USB adapters aren’t speaking NVMe to the host device. They either talk the traditional USB “bulk transfer” protocol or potentially SCSI, translating that to NVMe for the SSD itself.

  • EndlessNightmare@reddthat.com · 2 days ago

    Cries in PC gamer

    I’m glad I already have a good setup and shouldn’t be buying anything for a good while, but damn it. First the GPU, then RAM, now SSDs.

    • DFX4509B@lemmy.wtf · 2 days ago

      Next step, modular desktops as a concept will die, probably.

      I hope people like locked-down black boxes they can’t upgrade and can’t run their own OS on in the future, so byebye Linux and BSD in that scenario outside of niche devices.

      • Aceticon@lemmy.dbzer0.com · 2 days ago

        At the same time, just expanding a device with new parts is a far cheaper way to get more performance than buying a new device; after all, whatever price problem there is with some kinds of parts, it will be the same whether they’re sold as loose parts or as part of a device.

        Poor working-class young me, in a poorer European country, quickly found out after getting his first PC that to get a more powerful machine he had to start upgrading that machine, because there wasn’t money to buy a whole new one every couple of years.

        My point is that this might very well have the opposite effect of what you describe: buying whole devices to replace older models becomes too expensive, so people favor more expandable devices, because those can have their performance improved with just some new parts, which are cheaper than a whole new device, and the market just responds to that.

        I think most people in countries which until recently were wealthier, such as the US, are far too used to the “throw the old one out and buy a new one” mindset, which is not at all the mindset in places where resources are constrained or require a much bigger fraction of people’s income. Certainly my experience living in the UK, after having grown up in a country that was much poorer, left me with that impression: the Brits just felt incredibly wasteful to somebody who grew up where “getting a new iPhone every 2 years” was the kind of thing only a rich person or a stupid person would do.

        • humanspiral@lemmy.ca · 2 days ago

          whatever price problem there is with some kinds of parts, it will be the same whether they’re sold as loose parts or as part of a device.

          I didn’t actually read the article, but the Micron/Crucial announcement was about leaving the DIY retail market, as opposed to dropping supply deals with OEMs. Though new contracts with them will be priced higher too.

      • BilSabab@lemmy.world · 2 days ago

        What’s baffling is that modular desktops are probably a better long-term money-making strategy for hardware makers. When you can cycle gear with ease, the temptation to try something new is bigger.

      • lordbritishbusiness@lemmy.world · 2 days ago

        The AI builders must be buying all the fab time and components to go into the build-outs.

        Desktops will go first and fade as the entire production chain stops.

        Notebooks will be next; at least PC parts carry a premium price, while notebooks are too cheap to avoid it for long. Game consoles will face the same pressure.

        The supply shock is going to be as bad as COVID.

        • DFX4509B@lemmy.wtf · 2 days ago

          The supply shock is going to be as bad as COVID.

          No, it’ll be worse; it’ll be straight-up apocalyptic. GenAI grifters are trying to cause an apocalypse.
        • DFX4509B@lemmy.wtf · 2 days ago

          Eventual discontinuation of more PC parts to appease the AI grifters, until all that’s left for consumers is mini PCs or ARM black boxes.

          • ManOMorphos@lemmy.world · 2 days ago

            Chinese companies are most likely going to fill at least some of the void that the other companies will be leaving to chase AI hype. It won’t necessarily be cheap though.

          • Pycorax@sh.itjust.works · 2 days ago

            But that hasn’t happened at all and there’s no evidence of that happening? To my knowledge at least. If you have some I’d love to see it.

      • Korhaka@sopuli.xyz · 2 days ago

        You could just run Linux on a 2009 thinkpad. Oh no, I will have to buy even cheaper machines to run Linux.

    • WorldsDumbestMan@lemmy.today · 2 days ago

      I ordered an S10 tab, paid my first rate, they finally try to order it, inform me it’s gone from the page, and try to get me to pay MORE for a weaker device.

      I refused and asked for a refund, and that is how I got screwed at the last moment out of owning something I need, just before the crash.

      • Kazumara@discuss.tchncs.de · 2 days ago

        I ordered an S10 tab, paid my first rate, they finally try to order it

        Who is “they” in this? Some sort of intermediary you were using?

  • etchinghillside@reddthat.com · 3 days ago

    I’d forgotten that M.2 is probably more prevalent these days and that they’re not just shutting down production for no reason.

    • Hubi@feddit.org · 3 days ago

      Is it though? Pretty much every single current-gen mainboard still comes with a number of SATA ports.

      • RamRabbit@lemmy.world · 3 days ago

        Everyone is going to buy M.2 SSDs first, and only buy SATA if they don’t have enough M.2 slots. I really doubt SATA SSDs are selling well.

        With that said, I don’t see SATA going anywhere. Its (comparatively low) bandwidth means you can throw a few ports on your board and not sacrifice much. For some quick math: an M.2 port back-hauled by PCIe 4.0 x4 has 7.8 GB/s of bandwidth going to it, while SATA at 6 Gb/s has only 0.75 GB/s.
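        That quick math checks out, give or take protocol overhead (a sketch using raw link rates only; the 16 GT/s and 128b/130b figures for PCIe 4.0 are assumed, and SATA’s 6 Gb/s line rate is taken at face value):

```python
# Compare raw link bandwidth of an M.2 slot (PCIe 4.0 x4) with SATA III.
pcie4_lane = 16e9 * (128 / 130) / 8 / 1e9  # ~1.97 GB/s per lane
m2_x4 = 4 * pcie4_lane                     # ~7.9 GB/s for an x4 M.2 slot
sata3 = 6e9 / 8 / 1e9                      # 6 Gb/s raw -> 0.75 GB/s

print(f"M.2 x4: {m2_x4:.1f} GB/s, SATA: {sata3:.2f} GB/s, "
      f"ratio: {m2_x4 / sata3:.0f}x")
```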

        • tburkhol@lemmy.world · 3 days ago

          SATA is really convenient for larger storage, though. I keep my OS on NVMes, but I’ve got a couple of SATA drives and a hot-swap bay for games, media, etc.

          • clif@lemmy.world · 3 days ago

            I’m still running SATA spinny disks for my big-ish data. I can’t afford a 16TB SSD…

            I know that’s off topic, but HDDs are still a thing too.

            • RamRabbit@lemmy.world · 3 days ago

              I’m very excited for the day I can replace my spinners with SSDs. That day is coming, but it is not today.

            • Valmond@lemmy.world · 2 days ago

              They have become expensive too, IMO. A 3-4 TB drive costs more today than a couple of years ago, and the used market here in Europe is insane.

        • AlfredoJohn@sh.itjust.works · 2 days ago

          And how many motherboards have the same number of M.2 slots as they do SATA ports? And what generation? So now I need new RAM, which is inflated to high hell, plus a new motherboard and CPU, just to increase storage on my gaming rig? It’s not like games are small these days. I like to keep most games I have installed, and that takes multiple terabytes of storage, which is cheaper to do via SATA SSDs. This is clearly anti-consumer and done purely to push people to newer systems, in the hope people stay with Windows instead of swapping to Linux. It’s being done to keep the AI bubble going…

        • DFX4509B@lemmy.wtf · 2 days ago

          Even then, NVMe riser cards are a thing to just stick an NVMe drive in a spare PCIe slot.

          • Trainguyrom@reddthat.com · 2 days ago

            It does require you to have the PCIe lanes for it and BIOS support for booting from PCIe (Intel 6th-gen Core CPUs were the first to support that; 4th gen never did, though some boards had M.2 slots and NVMe support for secondary drives, and some 5th-gen X99 boards received BIOS updates adding it, but that’s its own can of worms). Both Intel and AMD have also historically been pretty stingy about PCIe lane availability.

            Plus, to run more than a single NVMe drive on a single slot, your motherboard either needs to support PCIe bifurcation, which is almost exclusively an enterprise feature, or needs the right lane configuration available to support that x16 slot handing out 4x4 lanes (or 2x8/2x4 for dual NVMe).

            • DFX4509B@lemmy.wtf · 2 days ago

              both Intel and AMD have also historically been pretty stingy about PCIe lane availability

              Hold up, I thought some of the nicer AM3+ boards using the 990FX chipset had a fair number of lanes available, both for their time and even now. Like, the best 990FX boards on AM3+ had more expansion than the X370/470/570 boards on AM4 or the best X670/X870 boards on AM5, last I checked.
              • Trainguyrom@reddthat.com · 2 days ago

                Y’know what, I honestly haven’t looked at the PCIe lane layout on newer chipsets. Maybe it’s gotten better since I last really paid attention, 5+ years ago. I remember in early-to-mid AM4 there was a lot of grumbling about how there were only 20 PCIe 3 lanes, followed by early PCIe 4 platforms that would give only 16-20 lanes with another 8 or so PCIe 3 lanes. I also didn’t really pay much attention to AMD before AM4, given how far behind Intel they were. But I could be entirely out of date, now that I think about it.

                • DFX4509B@lemmy.wtf · 2 days ago (edited)

                  Phoenix2 APUs like the R3 8300G and R5 8500G are the worst offenders in the ‘cutting PCIe lanes’ department.

                  The R5 8500G only has 14 lanes, for example. The FX-8350 and 8370 from a decade earlier would have had 32 lanes available on the 990FX chipset, and half that on the 990X and 970 chipsets, per contemporary reviews from when those CPUs were new, but they were all PCIe 2, as AM3+ was a PCIe 2 platform.

                  This is the specific review I’m going off of: FX-8350 review

                  Per that review, the 990FX would’ve supported two x16 or four x8 slots, the 990X two x8 slots, and the 970 only a single x16 slot. Of course, configs varied by board maker; nothing would’ve stopped someone from making a 990FX board with a single x16 slot, three x4 slots, and two x2 slots, for example, or a 990X board with a single x16 slot, or a 970 board with a single x8 slot and two x4 slots.

        • Lfrith@lemmy.ca · 3 days ago

          I have one M.2 and multiple SATA SSDs, since on my motherboard occupying the second M.2 slot would drop the PCIe lanes for my GPU due to shared bandwidth.

          Do newer boards not have that problem?

          • Spaz@lemmy.world · 2 days ago

            Higher-spec boards don’t have this issue; it’s typically a problem with low- and mid-range boards due to cost savings.

            • AlfredoJohn@sh.itjust.works · 2 days ago

              Which just also shows why this is a very anti-consumer move. It’s trying to artificially push people to buy new hardware because there haven’t been significant enough changes to really warrant it. This means more people who might have swapped off Windows to keep their existing hardware might end up having to upgrade and then stick with their familiar Windows platform, so that the AI bubble can continue. It’s completely fucked up.

      • A_Random_Idiot@lemmy.world · 3 days ago (edited)

        Yeah, but I think SATA is quickly being relegated to large mechanical storage drives, for things that don’t require performance, like bulk storage and whatnot… because SATA is not getting any faster. I doubt anyone’s gonna come out with a SATA IV standard at this point, when PCIe over M.2 is easier, simpler, faster, and, outside of silicon-shortage stupidities, getting cheaper and more affordable.

      • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 3 days ago

        Comes with them, but only for legacy media. Outside of my NAS, I haven’t bought a new SATA drive in probably 10 years, and I haven’t touched my onboard SATA ports in 5.

        The fact that they’re still there impresses me at this point. But their numbers are slowly dwindling; SATA is usually the first thing that gets dropped when you need more PCIe lanes, and even then most boards only have 4 at this point. They’re switching back to those god-awful vertical ports, which tells you all you need to know about their priorities.

      • SinningStromgald@lemmy.world · 3 days ago

        Most people at least put their OS on M.2. I guess if you haven’t upgraded since M.2 became common on motherboards, you might not.

        Edit: The internet says M.2 was common on motherboards around 2016-2017.

    • [deleted]@piefed.world · 3 days ago

      I would be surprised if M.2 has overtaken regular SATA connections for the majority of computers produced for businesses and individuals, but maybe they don’t make enough in that area.

      • frongt@lemmy.zip · 2 days ago

        They definitely have. The smaller form factor is better for laptops, and if you can share parts between laptop and desktop, it’s cheaper.