• 0 Posts
  • 76 Comments
Joined 1 month ago
Cake day: April 10th, 2025


  • So, I initially wanted to just kneejerk respond: yes, it is absurd to suggest this ruling against Apple… would have any kind of generally noticeable effect on inflation.

    But I wanted to check the actual numbers.

    Ok, so, total US consumer spending in 2024 is about $64 Trillion.

    … The Apple App Store generated $105 Billion in revenue in 2024.

    Ok, napkin math: 30% off of, let's just say, literally all App Store payments… ok, we've cut costs by about $32 Billion… shave that off the $64 Trillion…

    And voila!

    A rough general price reduction of… 0.05%

    Call a median US yearly income $60K, and they've saved about $30. Maybe the cost of one or two DoorDash meals, depending on where you live… probably much closer to just one.
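
    Here's that napkin math as a quick, runnable sketch, using the same rough figures quoted above:

    ```python
    # Napkin math with the rough figures quoted above.
    total_spending = 64e12   # ~$64 Trillion, total US consumer spending, 2024
    app_store_rev = 105e9    # ~$105 Billion, Apple App Store revenue, 2024
    commission = 0.30        # the 30% cut, generously applied to everything

    savings = app_store_rev * commission           # ~$31.5 Billion
    price_drop = savings / total_spending          # ~0.0005
    median_income = 60_000

    print(f"{price_drop:.2%}")                     # 0.05%
    print(f"${median_income * price_drop:.0f}")    # $30
    ```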

    We’re saved from inflation rofl!


  • This is where the magic of near meaningless corpo-babble comes in.

    The layoffs are part of a plan to aspirationally achieve the goal of $10b revenue by EoY 2025.

    What they are actually doing is a significant restructuring of the company: refocusing by hiring some number of new people from outside to lead or join departments or positions that haven't existed before, or that are being refocused toward other priorities…

    … But this process also involves laying off 500 of the ‘least productive’ or ‘least mission critical’ employees.

    So, technically, they can, and are, arguing that their new organizational paradigm will be so successful that it will actually result in increased revenue, not just lower expenses.

    Generally corpos call this something like ‘right-sizing’ or ‘refocusing’ or something like that.

    But of course… anyone with any actual experience with working at a place that does this… will tell you roughly this is what happens:

    Turns out all those 'grunts' you let go of actually did a lot more work, in a bunch of weird, esoteric, bandaid solutions that kept everything going, than upper management was aware of… because middle management doesn't acknowledge, or often even understand, that that work was being done, because they are generally self-aggrandizing, narcissistic petty tyrants who spend more time in meetings fluffing themselves up than actually doing any useful management.

    Then, also, you are now bringing on new, outside people who look great on paper, to lead new or modified departments… but they of course also do not have any institutional knowledge, as they are new.

    So now you have a whole bunch of undocumented work that was being done, processes that were being followed… which are no longer being done and were never written down… and the new guys, even if they have the best intentions, now have to spend a quarter or two or three figuring out just exactly how much the pre-existing middle management has been bullshitting, figuring out just how much things do not actually function as they said they did…

    So now your efficiency improving restructuring is actually a chaotic mess.

    … Now, this 'right sizing' is not always apocalyptically bad, but it is also essentially never totally free from hiccups… and it increases stress, workload, and tensions between basically everyone at the company, to some extent… and decreased morale and increased stress basically always reduce efficiency, to some extent.

    Here's Forbes' explanation of this phenomenon, if you prefer a description of right-sizing in corpospeak:

    https://www.forbes.com/advisor/business/rightsizing/


  • TFLOPs generally correlate to actual, general game performance quite well.

    A very demanding game will do poorly on a GPU with a given TFLOPs score, but a less demanding one may do fairly well.

    So, if you can roughly dial things into a TFLOPs value, you can get a result that is then generally applicable to all games.

    The most graphically demanding games require the most TFLOPs out of a GPU to render well and fast, but less demanding games can be fine with a lower TFLOPs GPU.

    Like… a GPU with a medium-high TFLOPs score may only be able to push about 50 fps in CP77 at ultra w/o RT, but it could likely easily hit something like 200ish fps in, say, Counter-Strike 2, both games at 4k.
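
    To illustrate what I mean by 'dialing things in' with TFLOPs, here's the kind of first-pass estimate I'm describing. All the numbers are hypothetical, and real scaling is never this linear, so treat it as napkin math only:

    ```python
    def estimate_fps(fps_ref: float, tflops_ref: float, tflops_target: float) -> float:
        """Guess a game's fps on a target GPU by scaling a known result by TFLOPs.

        Real scaling is sublinear (memory bandwidth, CPU bottlenecks, driver
        overhead all get in the way), so this is a starting point, not a benchmark.
        """
        return fps_ref * (tflops_target / tflops_ref)

    # Hypothetical: a 12 TFLOPs card hits 50 fps in a demanding game;
    # first-pass guess for an 18 TFLOPs card in the same game:
    print(estimate_fps(50, 12, 18))  # 75.0, before real-world losses
    ```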

    https://www.resetera.com/threads/all-games-with-ps5-pro-enhancements.1026072/

    A PS5Pro can render CP77 in 'quality mode'… which is roughly the 'high' graphical quality preset of CP77 on PC, with, I believe, all raytracing other than raytraced shadows turned off…

    … at 2k, 30 fps, or it can render at a dynamically scaling resolution between 2k and 1080p at a locked 60 fps.

    https://www.gpu-monkey.com/en/benchmark-amd_radeon_rx_6700-cyberpunk_2077

    An RX 6700 can render CP77 at 2k, 43 fps avg.

    This site isn't 100% clear as to whether these GPU scores are done at the 'high' preset or the 'ultra' (all settings maxed out) preset… but with all raytracing off.

    https://www.gpu-monkey.com/en/benchmark-amd_radeon_rx_6600-cyberpunk_2077

    A 6600 can do the same, high or ultra preset, with no raytracing, at 2k, at 31 fps.

    1080p at 51 fps.

    So uh yeah, even with the top of the line PS5 variant, the PS5Pro’s graphical capabilities are a bit worse than a 6700, and a bit better than a 6600.

    I have used many AMD cards, and yeah basically, if you only use raytraced local shadows?

    At 1080p, or 2k?

    Very roughly, you'll get about the same fps with a 'high' preset and only raytraced local shadows as you would with no RT local shadows but the game's graphical preset bumped up to 'ultra'.

    There are now like 6 different raytracing options in the PC version of CP77.

    They only work well with Nvidia cards… and the latest iteration of AMD cards, that now have basically their equivalent of RT cores, though they’re not as powerful, nor as expensive, as Nvidia cards.

    Needless to say, PS5… normals, or whatever, are going to be even weaker than the ‘somewhere between an rx 6600 and rx 6700’ rough equivalence.

    Playstation users only even see about 10% of all the graphical settings options that PC players see, really just the 4 post-processing effects and motion blur. PC players get many, many more settings they can tweak around.

    Also, PS5 users generally seem to think CP77 is being rendered at 4k, in quality mode.

    Uh, no, it isn't, it's rendered at 2k max, and that is likely using FSR 2.1, with a bit of dynamic resolution scaling within CP77, meaning the actual real render resolution is roughly 5% to 15% lower than 2k, then FSR upscales to 2k… then the PS5 does its own upscaling outside of the game, likely via an emulator-style simple integer upscale algo, to display at 4k on a 4k TV.
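
    Here's that chain as rough numbers (the percentages are my ballpark, and I'm assuming the scale applies per axis):

    ```python
    # Rough sketch of the render/upscale chain described above.
    fsr_out_w, fsr_out_h = 2560, 1440  # FSR's '2k' output target

    for scale in (0.95, 0.85):  # actual render: roughly 5% to 15% below 2k
        w, h = int(fsr_out_w * scale), int(fsr_out_h * scale)
        print(f"internal render {w}x{h} -> FSR upscales to {fsr_out_w}x{fsr_out_h}")

    # After FSR, the console's own scaler (outside the game) takes the
    # 2560x1440 image up to 3840x2160 for the 4k TV.
    ```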

    … I am saying all of this with confidence because I have spent a considerable amount of time fucking with customizing dynamic resolution scaling, and other graphical settings, in CP77, on different AMD cards, on PCs.

    The PS5Pro is using the same dynamic resolution scaling that exists in CP77 on PCs. PCs generally do best with an ‘Auto’ mode for various frame upscalers, but the dynamic setting on PC basically lets you throw out the auto settings and fuck about with your own custom tolerances for max and min resolutions and frame rates.


  • Sorry, I … well, I was recently diagnosed with PTSD.

    And a significant part of that… is I am so, so, very used to people just misinterpreting what I actually said: in their heads, they heard /something else/, and then they respond to /something else/, they continue to believe I said /something else/, even after I explain to them that isn't what I said, and then they tell everyone else that I said /something else/.

    (there are many other things that go into the PTSD, but they are waaaay outside of the scope of this discussion)

    I, again, realize and appreciate that you responded to both interpretations…

    But I am just a bit triggered.

    I am so, so, very used to being gaslit by… most of my family, and many, many other people in my life, who just seemingly willfully misinterpret me consistently, or are literally incapable of hearing/reading without just inventing and inserting their own interpretation.

    … Whole lot of my family has very serious mental health disorders, and I’ve also happened to have a very bad run of many bosses and former friends and ex partners who just do the same thing, all the time.

    Took me a long time to just… get away from all these toxic situations, and finally be able to pursue mental health evaluation/treatment on my own accord.

    I’m not saying you ‘intentionally triggered me’ or anything like that, that would be a ridiculous judgement from me, and you have been very polite, and informative… I’m just trying to explain myself, lol.

    As to the actual technical info: yes, everything you are saying lines up with my understanding. It's nice to know I know what these words and terms mean in this context, and my understanding is… in line with reality.


  • addie said:

    Integrated memory on a desktop computer is more “partitioned” than shared

    Then I wrote my own reply to them, as you did.

    And then I also wrote this, under your reply to them:

    Can you explain to me what the person you are replying to meant by ‘integrated memory on a desktop pc’?

    And now you are saying:

    So, for starters I never mentioned “integrated memory”, I wrote “integrated graphics”, i.e. the CPU chip comes together with a GPU, either as two dies in the same chip package or even both on the same die.

    I mean, I do genuinely appreciate your detailed, technical explanations of these systems and hardware and their inner functions…

    But also, I didn’t say you said integrated memory.

    I said the person you are replying to, addie, said integrated memory.

    I was asking you to perhaps be able to explain what they meant… because they don’t seem to know what they’re trying to say.

    But now you have misunderstood what I said, what I asked, lol.

    You replied to addie … I think, as if they had written ‘integrated graphics’. But they didn’t say that. They said ‘integrated memory’.

    And… unless I am… really, really missing something… standard desktop PCs… do not have any kind of integrated memory, beyond like… very, very small areas where the mobo BIOS is stored, but that is almost 100% irrelevant to a discussion about video game rendering capabilities.

    As you say, you have to go back 20+ years to find desktop PCs with mobos that have their own SRAM… everything else is part of the GPU or CPU die, and thus… isn't integrated, as GPUs and CPUs are removable, swappable, on standard desktop PCs.

    Either way, again, I do appreciate your in-depth technical info!


  • Can you explain to me what the person you are replying to meant by ‘integrated memory on a desktop pc’?

    I tried to explain why this phrase makes no sense, but apparently they didn’t like it.

    …Standard GPUs and CPUs do not share a common kind of RAM that gets balanced between space reserved for CPU-ish tasks and GPU-ish tasks… that only happens with an APU that uses LPDDR RAM… which isn’t at all a standard desktop PC.

    It is as you say, a hierarchy of assets being called into the DDR RAM by the CPU, then streamed or shared into the GPU and its GDDR RAM…

    But the GPU and CPU are not literally, directly using the actual same physical RAM hardware as a common shared pool.

    Yes, certain data is… shared… in the sense that it is, or can be, to some extent, mirrored, parallelized, between two distinct kinds of RAM… but… not in the way they seem to think it works, with one RAM pool just being directly accessed by both the CPU and GPU at the same time.

    … Did they mean ‘integrated graphics’ when they … said ‘integrated memory?’

    L1 or L2 or L3 caches?

    ???

    I still do not understand how any standard desktop PC has ‘integrated memory’.

    What kind of ‘memory’ on a PC… is integrated into the MoBo, unremovable?

    ???


  • Because without full access to your PC, anti-cheat is essentially useless and easily bypassed by cheaters.

    This is false.

    Many functional AntiCheats work well without Kernel Level access… and many Kernel Level AntiCheats… are routinely bypassed by common, easily purchasable hacks… which, again, only work on Windows.

    I used to play GTA V Online (and RDR2, and FiveM, and RedM…) on linux all the time, literally for years… until they just decided to ban all linux players.

    Because of cheaters.

    That’s not an idea that anyone is saying though, other than you right now.

    Uh… you are also basically saying this, with that combination of statements.

    So… please refrain from obviously contradictory, gaslighting arguments, thanks!

    Anyway: GTAV uses BattlEye.

    BattlEye works on Linux.

    Rockstar just … chose not to use that Linux support.

    It’s also one of the only ways to try to stop cheaters.

    There are many other ways to stop cheaters that are quite effective, namely: actually designing your game more competently and more cleverly, with less client-side authority and more server-side authority; less intrusive client-side AC that relies more on randomized, realtime logging and verification of game files; server-side heuristics that pick up 'impossible' player input patterns (see the toy sketch below); etc.

    You know, all the other methods that have been used for decades, and still work.
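
    To make 'server side heuristics' concrete, here's a toy sketch of the kind of check I mean. Every name and threshold in it is hypothetical, made up for illustration, not any real AntiCheat's code:

    ```python
    # Toy server-side heuristic: flag view rotations faster than any human flick
    # that also land a headshot in the same tick. Threshold is invented.
    MAX_DEGREES_PER_TICK = 40.0  # hypothetical cap on per-tick view rotation

    def is_suspicious_aim(prev_yaw: float, new_yaw: float, headshot: bool) -> bool:
        # shortest rotation between the two view angles, in degrees
        delta = abs((new_yaw - prev_yaw + 180) % 360 - 180)
        # a huge instant snap that immediately lands a headshot gets logged
        return headshot and delta > MAX_DEGREES_PER_TICK

    print(is_suspicious_aim(10.0, 175.0, True))  # True: 165 degrees in one tick
    print(is_suspicious_aim(10.0, 25.0, True))   # False: a plausible human flick
    ```

    The real versions of these run entirely server side, on inputs the server already receives, which is exactly why they need zero access to your PC.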

    No AntiCheat method will ever be 100% effective.

    As I already mentioned, Kernel Level AntiCheats are defeated all the time, and you can easily find and purchase such cheats/hacks… which only work on Windows… after maybe 30 minutes of web searching or jumping around discord communities.

    Beyond that, it's not that hard or expensive to set up your own, or just purchase, a tiny microcomputer that plugs into your PC; you then plug your mouse/keyboard into that, and the micro-PC middleman performs aim hacks and otherwise-impossible movement macros like stuttersteps and such.

    Kernel ACs are routinely defeated by competent executions of this concept.

    You can never stop all hackers.

    It is always a trade off of exactly how much you inconvenience and degrade the system integrity/stability/security of the user, versus how many hackers you believe you are likely to stop.

    Kernel Level AntiCheat is basically going from previous methods being 99.9% effective to 99.99% effective… and the cost is that you are now literally installing a rootkit on your own system that could very well be reading all your web history and saved logins and passwords.
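
    Just to put those made-up-but-illustrative percentages in perspective:

    ```python
    # Hypothetical numbers: what that extra '9' of effectiveness actually buys.
    players = 100_000
    cheaters = 1_000  # pretend 1% of the playerbase cheats

    slipping_through_old = cheaters * (1 - 0.999)   # 1.0 cheater per 100k players
    slipping_through_new = cheaters * (1 - 0.9999)  # 0.1
    print(slipping_through_old, slipping_through_new)
    ```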

    The code is black box, and tech companies lie all the time about how much data they gather from you… and then sell to every data broker they can.

    The only actual numbers and statistics anyone has to work with, when justifying or arguing against effectiveness levels of different kinds of AC… are the claims put out by AC companies.

    And even then, most people, such as yourself, aren’t even aware of or refuse to acknowledge that AntiCheats have worked on linux for years.

    It is a “something is wrong with linux” issue if Linux doesn’t allow/provide for something that game developers - and game players - want, which is anti-cheat that does the absolute best it can to stop cheaters.

    I see how you just completely did not address how I stated that EAC and BattlEye both support linux; other ACs have, and still do, as well… certain game publishers just don't use these features that have existed for years.

    Valve Anti Cheat, for example?

    You can find more info if you look, but I’m guessing you won’t.

    You just have an idea, of ‘the idea’.

    Have you ever written a hack?

    Written game netcode, and other client/server game code?

    … I have! … back when I still used Windows, ironically.

    Best way to test your own design is to try to defeat it.

    Installing a rootkit onto your personal computer… to protect you from hackers in a game… is like trying to fight a stomach flu you got from Taco Bell by intentionally infecting yourself with Covid.

    Oh, and uh, after the whole… CrowdStrike fiasco, where Windows just allowed CrowdStrike to write and update kernel level code without doing its own testing or validation… and then they pushed a bad update… and that took out basically 1/4 of the world's enterprise Windows machines for a week or two?

    Yeah… Microsoft is now removing direct kernel level access from third party software.

    They’re making everything move up a level or two, kind of inventing a new interface layer/paradigm…

    Because it is, in fact, empirically, objectively, way, way, waaaay too dangerous to just let every 'verified' third party partner fuck with the kernel.

    So… your idea of 'the idea' of Kernel Level AC is no longer valid, as it is no longer able to run at such a low layer, and will thus be more vulnerable to… the kinds of hacks Kernel Level AC is supposedly necessary for dealing with.


  • Most Call of Duty games work on linux, you’re gonna have to be more specific as to which particular one of like 25 you mean by ‘COD’.

    The ones that don’t, they don’t work because the devs are too lazy or incompetent (or specifically told not to by their bosses) to make an AntiCheat that isn’t a rootkit with full access to your entire PC.

    I used to play GTA V Online (and RDR2, and FiveM, and RedM…) on linux all the time, literally for years… until they just decided to ban all linux players.

    IMO they owe me money for that, but oh well I guess.

    Again, there are many AntiCheats that work on linux, and have worked on linux for years and years now.

    Easy Anti Cheat and BattlEye even offer linux support to game devs. There are some games with these ACs that actually do support linux.

    But many game devs/studios/publishers just don’t use this support… because then there wouldn’t be any reason to actually use Windows, and MSFT pays these studios a lot of money… or they just literally own them (Activision/Blizzard = MSFT).

    Kernel Anti Cheat that only works on Windows?

    Yep, that’s just a complicated way to enforce Windows exclusivity in PC games.

    Go look up how many hacks and trainers you can find for one of these games you mention.

    You may notice that they are all designed for, and only work on… Windows.

    The idea that all linux gamers are malicious hackers is a laughable, obviously false idea… but game company execs understand the power of rabid irrational fandoms.

    You are right that you can’t run games with rootkit anticheats on linux though, so if those heavily monetized and manipulative games with toxic playerbases are your addiction of choice, yep, sorry, linux ain’t your hookup for those.

    Again, this is another game platform freedom advocacy issue, and also a personal information security advocacy issue, not a ‘something is wrong with linux’ issue.

    Game companies have gotten many working anticheat systems to work with linux. The most popular third party anticheat systems also support linux.

    But the industry is clever at keeping people locked into their for profit, insecure OSs that spy on their entire system.


  • I… uh… what?

    Integrated memory, on a desktop PC?

    Genuinely: What are you talking about?

    Typical PCs (and still many laptops)… have a CPU that uses the DDR RAM that is… plugged into the Mobo, and can be removed. Even many laptops allow the DDR RAM to be removed and replaced, though working on a laptop can often be much, much more finicky.

    GPUs have their own GDDR RAM, either built into the whole AIB in a desktop, or inside of or otherwise a part of a laptop GPU chip itself.

    These are totally different kinds of RAM; they are accessed via distinct buses; they are not shared, they are not partitioned, not on desktop PCs and most laptops.

    They are physically and by design distinct, set aside, and specialized to perform with their respective processor.

    The kind of RAM you are talking about, that is shared, partitioned, is LPDDR RAM… and is incompatible with 99% of desktop PCs.

    Also… anything, on a desktop PC, that gets loaded and processed by the GPU… does at some point, have to go through the CPU and its DDR RAM first.

    The CPU governs the actual instructions to, and output from, the GPU.

    A GPU on its own cannot like, ask an SSD or HDD for a texture or 3d model or shader.

    Normally, compressed game assets are loaded from the SSD to RAM via the Win32 API. Once in RAM, the CPU then decompresses those assets. The decompressed game assets are then moved from RAM to the graphics card’s VRAM (ie, GDDR RAM), priming the assets for use in games proper.

    (addition to the quote is mine)
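
    In rough Python, the staging that quote describes looks something like this. The function names are mine, stand-ins for real engine/graphics-API calls:

    ```python
    import zlib

    def load_game_asset(path: str) -> bytes:
        # 1. SSD -> system (DDR) RAM: ordinary file I/O through the OS
        with open(path, "rb") as f:
            compressed = f.read()
        # 2. The CPU decompresses the asset, still sitting in system RAM
        asset = zlib.decompress(compressed)
        # 3. System RAM -> the GPU's own GDDR VRAM, over PCIe
        upload_to_vram(asset)
        return asset

    def upload_to_vram(asset: bytes) -> None:
        pass  # placeholder: a real engine issues a graphics-API buffer upload here
    ```

    The point being: the GPU never talks to the SSD itself; every asset bounces through the CPU and DDR RAM first.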

    Like… there is DirectStorage… but basically nothing actually uses it.

    https://www.pcworld.com/article/2609584/what-happened-to-directstorage-why-dont-more-pc-games-use-it.html

    Maybe it’ll take off someday, maybe not.

    Nobody does dual GPU SLI anymore, but I also remember back when people thought multithreading and multicore CPUs would never take off, because coding for multiple threads is too haaaaarrrrd, lol.

    Anyway, the reason that emulators have problems doing the things you describe consoles as good at… is because consoles have finetuned drivers that work with only a specific set of hardware, and emulators have to reverse engineer ways of doing the same that will work on all possible PC hardware configurations.

    People who make emulators generally do not have direct access to the actual proprietary driver code used by console hardware.

    If they did, they would much, much more easily be able to… emulate… similar calls and instruction sets on other PC hardware.

    But they usually just have to make this shit up on the fly, with no actual knowledge of how the actual console drivers do it.

    Reverse engineering is astonishingly more difficult when you don’t have the source code, the proverbial instruction manual.

    It's not that desktop PC architecture… just literally cannot do it.

    If that were the case, all the same issues you bring up that are specific to emulators… would also be present with console games that have proper ports to PC.

    While occasionally yes, this is sometimes the case, for some specific games with poor quality ports… generally no, this is not true.

    Try running, say, an emulated Xbox version of Deus Ex: Invisible War, a game notoriously handicapped by its console-centric design… try comparing the PC version of that, on a PC… to that same game, but emulating the Xbox version, on the same exact PC.

    You will almost certainly, for almost every console game with a PC port… find that the proper PC version runs better, often much, much better.

    The problem isn’t the PC’s hardware capabilities.

    The problem is that emulation is inefficient guesswork.

    Like, no shade at emulator developers whatsoever, it's a miracle any of that shit works at all, reverse engineering is astonishingly difficult, but yeah, reverse engineering driver or lower level code, without any documentation or source code, is gonna be a bunch of bullshit hacks that happen to not make your PC instantly explode, lol.


  • Hrm uh… Framework laptops… seem to be configurable as having a mobile grade CPU with integrated graphics… and also an optional, additional mobile grade, dedicated GPU.

    So, not really an APU… unless you really want to haggle over definitions and say ‘technically, a CPU with pathetic integrated graphics still counts as a GPU and is thus an APU’.

    Framework laptop boards don't have the PCIe x16 slot for a traditional desktop GPU. As far as I am aware, Minisforum are the only people that do that, along with a high powered mobile CPU.

    Note that with the Minisforum Mobo model I am talking about, the AMD chip is not really an APU, it's also a CPU with integrated graphics. Its iGPU is a Radeon 610M, basically the bare minimum to be able to render and output very basic 2d graphics.

    True APUs are… things like what more modern consoles use, what a Steam Deck uses. They are still usually custom specs, proprietary to their vendor.

    The Switch 2 will have a custom Nvidia APU, which is the first Nvidia APU of note to my knowledge, and it will be very interesting to learn more about it from teardowns and benchmarks.

    Currently, the most powerful, non custom, generally publicly available chip that is compatible with standard PC mobos… arguably an APU, arguably not… is the AMD 8700G.

    It's about $315, is a pretty decent CPU, but as a GPU… it's less powerful than a standard desktop RX 6500 from AMD… which is the absolute lowest tier AMD GPU from now two generations back from current.

    You… might be able to run … basically games older than 5ish years, at 1080p, medium graphics, at 60fps. I guess it would maybe be a decent option if you… wanted to build a console emulator machine, roughly for consoles … N64/PS1/Dreamcast, and older, as well as being able to play older PC games, or PC games at lower settings/no more than 1080p.

    I am totally just spitballing with that though, trying to figure out all that exactly would be quite complicated.

    But now, back to Framework.

    Framework is soon to be releasing the Framework Desktop.

    This is a small form factor PC… which uses an actual proper APU, either the AMD AI Max 385 or 395.

    It's listed at an MSRP of $1100; they say it can run Cyberpunk at 1440p on high settings at about 75 fps… that's with no ray tracing, no framegen… and I think also no frame upscaling being used.

    So, presumably, if you turned on upscaling and framegen, you’d be able to get similar fps at ultra and psycho settings, and/or some amount of raytracing.

    There are also other companies that offer this kind of true APU, MiniPC style architecture, such as EvoTek, though it seems like most of them are considerably more expensive.

    https://wccftech.com/amd-ryzen-ai-max-395-strix-halo-mini-pc-tested-powerful-apu-up-to-140w-power-128-gb-variable-memory-igpu/

    … And finally, looks like Minisforum is sticking with the laptop CPU + desktop GPU design, and is soon going to be offering even more powerful CPU+Mobo models.

    https://wccftech.com/minisforum-ryzen-9-9955hx-x870m-motd-motherboard-9955hx-ms-a2-mini-pc-strix-nas/

    So yeah, this is actually quite an interesting time of diversification away from … what have basically been standard desktop mobo architectures… for … 2, 3? decades…

    …shame it all also coincides with Trump throwing a literally historically unprecedented senile temper tantrum, and fucking up prices and logistics for… basically the whole world, though of course much, much more seriously for the US.


  • Yep.

    And after enough people can no longer actually critically think, well, now this shitty AI tech does actually win the Turing Test more broadly.

    Why try to clear the bar when you can just lower it instead?

    … Is it fair, at this point, to legitimately refer to humans that are massively dependent on AI for basic things… can we just call them NPCs?

    I am still amazed that no one knows how to get anywhere around… you know, the town or city they grew up in? Nobody can navigate without some kind of map app anymore.


  • Basically this is true, yes, without going into an exhaustive level of detail as to very, very specific subtypes and specs of different RAM and mobo layouts.

    Shared memory setups generally are less powerful, but, they also usually end up being overall cheaper, as well as having a lower power draw… and being cooler, temperature wise.

    Which are all legitimate reasons those kinds of setups are used in smaller form factor 'computing devices', because heat management, airflow requirements… basically rule out using a traditional architecture.

    Though, recently, MiniPCs are starting to take off… and I am actually considering doing a build based on the Minisforum BD795i SE… which could be quite a powerful workstation/gaming rig.

    Aside about interesting non standard 'desktop' potential build

    This is a Mobo with a high end integrated AMD mobile CPU (the 7945HX)… that, all together, costs about $430.

    And the CPU in this thing… has a PassMark score… of about the same as an AMD 9900X… which itself, the CPU alone, MSRPs for about $400.

    So that is kind of bonkers, get a high end Mobo and CPU… for the price of a high end CPU.

    Oh, I forgot to mention: This BD795iSE board?

    Yeah, it just has a standard PCIe x16 slot. So… you can plug any 2-slot-width standard desktop GPU into it… and all of this either literally is, or basically is, the ITX form factor.

    So, you could make a whole build out of this that would be ITX form factor, and also absurdly powerful, or a budget version with a dinky GPU.

    I was talking in another thread a few days ago, and someone said PC architecture may be headed toward… basically, you have the entire PC, and the GPU, and that's the new paradigm, instead of the old school view of: you have a mobo, and you pick it based on its capability to support future CPUs in the same socket type, future RAM upgrades, etc…

    And this intrigued me, I looked into it, and yeah, this concept does have cost per performance merit at this point.

    So this uses a split between the GPU having its GDDR RAM and the… CPU using DDR5 SODIMM (laptop form factor) RAM.

    But it's also designed such that you can actually fit huge standard PC style cooling fans… into quite a compact form factor.

    From what I can vaguely tell as a non Chinese speaker… it seems like many people over in China have been making high end, custom, desktop gaming rigs out of this laptop/mobile style architecture for a decent while now, and only recently has this concept, that you can actually build your own rig this way, really entered the English speaking world/market.


  • https://www.metacritic.com/pictures/best-playstation-games-of-2024/

    Works on Linux:

    Prince of Persia, the Lost Crown

    Silent Hill 2 (Remake)

    Marvel vs Capcom: Arcade Classics

    Shin Megami Tensei (V)engeance

    Persona 3 Reload

    HiFi Rush

    Animal Well

    Castlevania Dominus Collection

    Like A Dragon: Infinite Wealth

    Tekken 8

    The Last of Us Part II (Remaster)

    Balatro

    Dave the Diver

    Slay the Princess: Pristine Cut

    Metaphor: ReFantazio

    Elden Ring: Shadow of the Erdtree (and base game)

    Does not work on Linux:

    Unicorn Overlord (Console Exclusive, No PC Port Allowed by Publisher Vanillaware)

    Destiny 2 (Kernel Level Anti Cheat)

    FF VII Rebirth (PS Exclusive)

    Astro Bot (PS Exclusive)

    Damn, yeah, still consoles gotta hold on via exclusives, I guess?

    And then there’s the mismanaged shitshow that is Destiny 2…

    …who can’t figure out how to do AntiCheat without installing a rootkit on your PC, despite functional, working AntiCheats having worked on linux games for at least half a decade at this point, if not longer…

    …nor can they figure out how to write a storyline that rises above 'everyone is always lore dumping instead of talking, and also they talk to you like you're a 10 year old while doing so.'

    Last I heard, a whole bunch of hardcore D2 youtubers and streamers were basically all quitting out of frustration and feeling let down or betrayed by Bungie.

    Maybe we should advocate for some freedom of platform porting/publishing for all games, eh FreedomAdvocate?


  • It’s shared memory, so you would need to guarantee access to 16gb on both ends.

    So… standard Desktop CPUs can only talk to DDR.

    ‘CPUs’ can only utilize GDDR when they are actually a part of an APU.

    Standard desktop GPUs can only talk to GDDR, which is part of their whole separate board.

    GPU and CPU can talk to each other, via the mainboard.

    Standard desktop PC architecture does not have a way for the CPU to directly utilize the GDDR RAM on the standalone GPU.

    In many laptops and phones, a different architecture is used, which uses LPDDR RAM, and all the LPDDR RAM is used by the APU, the APU being a CPU+GPU combo in a single chip.

    Some laptops use DDR RAM, but… in those laptops, the DDR RAM is only used by the CPU, and those laptops have a separate GPU chip, which has its own built in GDDR RAM… the CPU and GPU cannot and do not share these distinct kinds of RAM.

    (Laptop DDR RAM is also usually a different pin count and form factor than desktop PC DDR RAM, you usually can’t swap RAM sticks between them.)

    The PS5Pro appears to have yet another unique architecture:

    Functionally, the 2GB of DDR RAM can only be accessed by the CPU parts of the APU, which act as a kind of reserve, a minimum baseline of CPU-only RAM set aside for certain CPU specific tasks.

    The PS5Pro’s 16 GB of GDDR RAM is sharable and usable by both the CPU and GPU components of the APU.

    So… saying that you want to have a standard desktop PC build… that shares all of its GDDR and DDR RAM… this is impossible, and nonsensical.

    Standard desktop PC motherboards and compatible GPUs and CPUs… do not allow for shareable RAM, instead going with a design paradigm where the GPU has its own onboard GDDR RAM that only it can use, and the CPU has its own DDR RAM that only it can use.

    You would basically have to tear a high end/more modern laptop board with an APU soldered into it out of the laptop… and then install that into a 'desktop pc' case… to have a 'desktop pc' that shares memory between its CPU and GPU components… which would both be encapsulated in a single APU chip.

    Roughly this concept being done is generally called a MiniPC, and is a fairly niche thing, and is not the kind of thing an average prosumer can assemble themselves like a normal desktop PC.

    All you can really do is swap out the RAM (if it isn't soldered) and the SSD… maybe, I guess, transplant it and the power supply into another case?

    I don’t know how you could arrive at such a conclusion, considering that the base PS5 has been measured to be comparable to the 6700.

    I can arrive at that conclusion because I can compare actual benchmark scores from a nearest-TFLOP-equivalent, more publicly documented, architecturally similar AMD APU… the 7600M. I specifically mentioned this in my post.

    This guy in the article here … well he notes that the 6700 is a bit more powerful than the PS5Pro’s GPU component.

    The 6600 is one step down in terms of mainline desktop PC hardware, and arguably the PS5Pro’s performance is… a bit better than a 6600, a bit worse than a 6700, but at that level, all of the other differences in the PS5Pro’s architecture give basically a margin of error when trying to precisely dial in whether a 6700 or 6600 is a closer match.

    You can’t do apples to apples spec sheet comparisons… because, as I have now exhaustively explained:

    Standard desktop PCs do not share RAM between the GPU and CPU. They also do not share memory interface buses and bandwidth lanes… in standard PCs, these are distinct and separate, because they use different architectures.

    I got my results by starting with the (correct*) TFLOPs output from a PS5Pro, finding a nearest equivalent APU with PassMark benchmark scores, reported by hundreds or thousands or tens of thousands of users, then comparing those PassMark APU scores to PassMark conventional GPU scores, and ending up with 'fairly close' to an RX 6600.

    * The early, erroneous reporting of the TFLOPs score as roughly 33, when it was actually closer to 16 or 17… stemmed from quoting the FP16 (half precision) figure, when the more standard convention is to quote the FP32 (single precision) figure.
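
    The giveaway is that the two figures differ by exactly a factor of two, since these AMD parts run FP16 at double the FP32 rate:

    ```python
    fp16_tflops = 33.5             # the early, misreported headline number (~33)
    fp32_tflops = fp16_tflops / 2  # FP16 is 'double pumped' on this hardware
    print(fp32_tflops)             # 16.75 -> matches the ~16.7 figure above
    ```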

    You, on the other hand, just linked to a Tom's Hardware review of currently in production desktop PC GPUs… which did not make any mention of the PS5Pro… and then you also acted as if a 6600 was half as powerful as a PS5Pro's GPU component… which is wildly off.

    A 6700 is nowhere near 2x as powerful as a 6600.

    2x as powerful as an AMD RX 6600… would be roughly an AMD RX 7900 XTX, the literal top end card of AMD's previous GPU generation… which is currently selling for something like $1250 +/- $200, depending on which retailer you look at, their current stock levels, and which variant from which partner mfg you're going for.


  • GPU prices are ridiculous, but those GPUs are also ridiculously more powerful than anything in any console.

    The rough equivalent to a PS5Pro’s GPU component is a … not current gen, not last gen, but the gen before that… find AMD’s weakest GPU model in the 6 series, the RX 6600, and that is roughly the same performance as the GPU performance of a PS5Pro.

    The Switch 2 may have an interesting, custom, mobile grade Nvidia APU, but at this point, it's not out yet, no benchmarks, etc.

    Oh right, also: if GPU prices for PCs remain elevated… well, any future consoles will also have elevated prices. Perhaps not to the same degree, but again, that will be because a console is basically fairly low tier compared to the range of PC hardware… and console mfgs can subsidize console costs with game sales… and they get discounts on the components that go into their consoles by ordering in huge bulk.


  • Ok so, for starters, your ‘reported equivalent’ source is wrong.

    https://www.eurogamer.net/digitalfoundry-2024-playstation-5-pro-weve-removed-it-from-its-box-and-theres-new-information-to-share

    The custom AMD Zen2 APU (combined CPU + GPU, as is done in laptops) of a PS5Pro is 16.7 TFLOPs, not 33.

    So your PS5 Pro is actually roughly equivalent to that posted build… by your 'methodology'… though it is utterly unclear to me what your actual methodology for doing a performance comparison even is.

    The PS5 Pro uses 2 GB of DDR5 RAM, and 16 GB of GDDR6 RAM.

    This is… wildly outside the realm of being directly comparable to a normal desktop PC, which… bare minimum these days, has 16 GB of DDR4/5 RAM; the GDDR6 RAM would be part of the detachable GPU board itself, and would be… between 8 GB… and all the way up to 32 GB if you get an Nvidia 5090, but consensus seems to be that 16 GB of GDDR6/7 is probably what you want as a minimum, unless you want to be very reliant on AI upscaling/framegen, and the input lag and whatnot that comes with using that on an underpowered GPU.

    Short version: the PS5Pro would be a wildly lopsided, nonsensical architecture to try to replicate one-to-one in a desktop PC… 2 GB of system RAM will run lightweight linux OSes, but there's not a chance in hell you could run Windows 10 or 11 on that.

    Fuck, even getting 7 to work with 2GB RAM would be quite a challenge… if not impossible, I think 7 required 4GB RAM minimum?

    The closest AMD chip to the PS5 Pro that I see, in terms of TFLOP output… is the Radeon 7600 Mobile.

    ((… This is probably why Cyberpunk 2077 did not (and will never) get a ‘performance patch’ for the PS5Pro: CP77 can only pull both high (by console standards) framerates at high resolutions… and raytracing/path tracing… on Nvidia mobile class hardware, which the PS5Pro doesn’t use.))

    But, let's use the PS5Pro's ability to run CP77 at 2k, 60 fps, on… what PC players recognize as a mix of medium and high settings… as our benchmark for a comparable standard PC build. Let's be nice and just say it's the high preset.

    (a bunch of web searching and performance comparisons later…)

    Well… actually, the problem is that basically, nobody makes or sells desktop GPUs that are so underpowered anymore, you’d have to go to the used market or find some old unpurchased stock someone has had lying around for years.

    The RX 6600 in the partpicker list is fairly close in terms of GPU performance.

    Maybe pair it with an AMD 5600X processor if you… can find one? Or a 4800S, which supposedly actually were just rejects/run off from the PS5 and Xbox X and S chips, rofl?

    Yeah, legitimately, the problem with trying to build a PC in 2025 to the performance specs of a PS5 Pro… is that the bare minimum models for current and last gen standard PC architecture… yeah, they just don't even make hardware that weak anymore.

    EDIT:

    Oh, final addendum: if your TV has an HDMI port, kablamo, that's your monitor, you don't strictly need a new one.

    And there are also many ways to get a wireless or wired console style controller to work in a couch pc setup.



    I can… and I have… and this has resulted in destabilization.

    … This is why I am asking for help, if anyone has figured this out… and why I am not asking for permission to continue to flail about ineffectively.

    As far as I can tell, as ludicrous as it seems… setting up a distrobox with an actual mainline Fedora build, then configuring it as a dev environment, then building an RPM package for I2P from source inside this container… and then installing that static RPM into actual Bazzite OS…

    That would probably at least be more stable for Bazzite as a whole, just feeding it a single, extra, static package, as compared to source dependency hell…

    But I have no idea if I2P would… actually compile correctly… and… work.

    Although, I have managed to build Godot, a few versions ago, doing this, just as an experiment… and it … seemed to… mostly work?

    ???

    There were lots of fun unique error messages in the console that just did not exist anywhere else online.


  • Nobara’s handheld … ‘edition’, is a very, very rough ride in basic usability compared to Bazzite.

    I tried it a month or so back, and it is just constantly asking for admin passwords to re-enable the basic gamescope overlay whenever you do anything to the system.

    It asks you… to type in a password… after gamescope has been disabled… which means you have no keyboard.

    A workaround to this is to hold the Steam button on a Deck and go into Big Picture mode, then use the sticks and buttons to kill Steam, then you restart Steam, and then you can type in the password.

    … But you will have to repeat this process over and over and over again while in desktop mode.

    This is what I would call unusable as a handheld PC OS. GE has to… actually figure out how to make it work when it is just a handheld, otherwise, you do not have a handheld OS.