

Yeah, that’d be great. Peltiers would be awesome and everywhere if they were dirt cheap.


So what malware got shipped?


Awesome, thanks for the info and source.
Yeah, most of my frustration came from JXL/AVIF/HEIF and how Linux/Windows browsers, KDE, and Windows 11 don’t seem to support them well. Not a fan of packing HDR into 8 bits with WebP/JPG, especially with their artifacts, though I haven’t messed with PNG yet.


Also, we haven’t even got HDR figured out.
I’m still struggling to export some of my older RAWs to HDR. Heck, Lemmy doesn’t support JPEG XL, AVIF, TIFF, HEIF, nothing, so I couldn’t even post them here anyway. And even then, they’d probably only render right in Safari.


8K is theoretically good as “spare resolution,” for instance running variable resolution in games and scaling everything to it, displaying photos with less scaling for better sharpness, clearer text rendering, less flickering, stuff like that.
It’s not worth paying for. Mostly. But maybe some day it will be cheap enough to just “include” with little extra cost, kinda like how 4K TVs or 1440p monitors are cheap now.
I hear this all the time.
Yet when I bring up features that don’t work at all on X because it’s ancient, it’s “no, that’s superfluous. No one needs that.”
I used to do this, but literally just switched to discrete Nvidia yesterday.
Zero issues so far. TBH it actually fixed issues I had with HDR and video decoding on my AMD IGP.


Yeah, I’m not against the idea philosophically. Especially for security. I love the idea of containerized isolation.
But in reality, I can see exactly how much disk space and RAM and CPU and bandwidth they take, heh. Maintainers just can’t help themselves.


It’s why subscribing to YouTube Premium within the iOS app costs more: Google just makes up the difference that way.
I wish more businesses would do this. If a middleman is taking a cut, make it crystal clear to your customers.
This is my biggest gripe with Steam, for example. Valve can do 30% if they want, but dictating minimum prices on other stores that have nothing to do with Steam is just monopolistic.
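Just to put made-up numbers on it (not actual YouTube pricing, just a sketch of the math of passing a cut straight through to the customer):

```python
# Sketch of pricing through a platform cut, using hypothetical numbers.
# If the platform keeps 30%, the seller has to list at base / (1 - 0.30)
# to net the same amount as a direct sale.

def list_price(net_target: float, platform_cut: float) -> float:
    """Listed price needed so the seller still nets net_target after the cut."""
    return net_target / (1 - platform_cut)

direct = 13.99                               # hypothetical direct price
in_app = list_price(direct, platform_cut=0.30)
print(f"Direct: ${direct:.2f}  In-app via a 30% store: ${in_app:.2f}")
# Direct: $13.99  In-app via a 30% store: $19.99
```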


And there goes your engagement.
Apple’s got them, and Apple knows it.


I find the overhead of docker crazy, especially for simpler apps. Like, do I really need 150GB of hard drive space, an extensive, poorly documented config, and a whole nested computer running just because some project refuses to fix their dependency hell?
Yet it’s so common. It does feel like usability has gone on the back burner, at least in some sectors of software. And it’s such a relief when I read that some project consolidated dependencies down to C++ or Rust, and it will just run and give me feedback without shipping a whole subcomputer.


It’s propaganda, period.
Every top news post in .world is some tabloid outlet, reposting another source, and mods do nothing about it.
Normally, I’m fine with that “old internet” feel of craziness flying around everywhere, but mimicking Reddit’s structure so closely makes things feel less diverse/discoverable, and more like echo chambers blotting out the sun… which is exactly how previous Reddit alternatives died.


“It’s just a meme” has the same energy as a confronted schoolyard bully, and apparently that’s just accepted etiquette now :/
I’ve seen the same sentiment on Lemmy. Manipulative/misleading articles, or even straight-up misinformation, get posted, and when I bring it up, OP’s response is “I don’t care.” As long as it’s the right ideology, it’s alright; and the mods didn’t disagree.
…We’re so screwed, aren’t we? And by “we” I mean the internet. It’s nice to think of the Fediverse as an oasis from all this, but it has the same structural issues engineered in that commercial social media does, I think.


That’s very distant! Yeah, you’re right about size then:
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship
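If you’d rather compute it than scroll their chart, the geometry is roughly this (30° is just a common “mixed usage” viewing-angle target, not gospel, and the numbers are my approximation of what such calculators do):

```python
# Rough screen-size-for-distance geometry for a 16:9 TV.
# Assumption: aiming for roughly a 30-degree horizontal viewing angle.
import math

def diagonal_inches(distance_m: float, view_angle_deg: float = 30.0) -> float:
    # Screen width that fills the target horizontal angle at this distance.
    width_m = 2 * distance_m * math.tan(math.radians(view_angle_deg) / 2)
    # Convert width to diagonal for a 16:9 panel, then to inches.
    diagonal_m = width_m * math.hypot(16, 9) / 16
    return diagonal_m / 0.0254

for d in (2.0, 3.0, 4.0):
    print(f"{d:.0f} m away -> ~{diagonal_inches(d):.0f}\" for a 30 deg angle")
# 2 m -> ~48", 3 m -> ~73", 4 m -> ~97" (give or take)
```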


Oh man, you’re missing out on OLED in a basement though. It’s so fantastic for dimly lit environments that I’d take the smaller size any day, and just sit a little closer. LCDs, on the other hand, look pretty terrible in a really dark room.
The only thing that would make me pause is if you’re trying to squeeze a big family around the TV. In that case, it would make sense to get a bigger one, so everyone can sit farther back without compromised viewing angles.


TVs are fantastic monitors.
It sounds reasonable to not want to pay for basically a small computer inside the TV.
But in practice, it’s not that expensive of a component. And TV volumes are so high that they’re bigger, cheaper, and higher quality than an equivalently priced monitor anyway.
Hence, while I’m fine with the monitors I have, I’m never buying a “monitor” again. It just makes no financial sense when I can get a 40" 4K TV with 120Hz VRR instead, one that happens to work fantastically as a streaming box too.


As a real-life example, the Canon 600mm F11 telephoto lens should be awful on, say, the 32MP crop-sensor R7. That’s insane pixel density, somewhere in the ballpark of this Fuji.
…But look at real-life shots of that exact combo, and they’re sharp as hell. Sharper than a Sigma at F6.3.
The diffraction limit is something to watch out for, but in reality, stuff like the lens imperfections, motion blur, atmospheric distortion and such are going to get you first. You don’t need to shoot at F4 on this thing to make use of the resolution, even if that is the ideal scenario.
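For the curious, the napkin math I’m waving at looks roughly like this (sensor width, pixel count, and 550nm green light are all approximations on my part):

```python
# Rough diffraction-vs-pixel-pitch comparison for an R7-class sensor.
# Approximate numbers: ~22.3 mm sensor width, ~6960 horizontal pixels (~32.5 MP),
# green light at 550 nm. Airy disk diameter = 2.44 * wavelength * f-number.

SENSOR_WIDTH_MM = 22.3      # approximate APS-C width
H_PIXELS = 6960             # approximate horizontal resolution
WAVELENGTH_UM = 0.55        # green light, in micrometers

pixel_pitch_um = SENSOR_WIDTH_MM * 1000 / H_PIXELS
airy_diameter_um = 2.44 * WAVELENGTH_UM * 11   # f/11

print(f"Pixel pitch:      {pixel_pitch_um:.1f} um")
print(f"Airy disk @ f/11: {airy_diameter_um:.1f} um "
      f"(~{airy_diameter_um / pixel_pitch_um:.1f} pixels wide)")
# On paper the diffraction blur spans several pixels, yet the combo still
# looks sharp in practice -- which is the point: other blur sources and
# sharpening usually matter more than the theoretical limit.
```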


See: https://www.omnicalculator.com/other/hyperfocal-distance
https://en.wikipedia.org/wiki/Hyperfocal_distance
But TL;DR: for distant landscapes on a wide field-of-view lens, you can shoot at F5.6 and everything is in focus. I even do this at F1.4 on my lowly APS-C camera.
Put more concretely, the hyperfocal distance for a 35mm F5.6 lens on a medium format camera is about 5 meters. Everything past that can be in focus.
For portraits, you want background blur anyway.
And if you’re doing anything else on a medium format camera, you’re kind of insane, heh.
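Back-of-the-napkin check on that 5 meter figure, assuming the standard hyperfocal formula and a ~0.05mm circle of confusion for 44×33 medium format (pick your own CoC if you disagree; it nudges the number around):

```python
# Hyperfocal distance sketch: H = f^2 / (N * c) + f
# Assumption: 0.05 mm circle of confusion for a 44x33mm medium-format sensor.

def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float) -> float:
    """Distance beyond which everything renders 'acceptably sharp'
    when focused at the hyperfocal point."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

h = hyperfocal_mm(focal_mm=35, f_number=5.6, coc_mm=0.05)
print(f"35mm @ f/5.6, medium format: hyperfocal ~{h / 1000:.1f} m")
# ~4.4 m, i.e. roughly the 5 m ballpark above
```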


Sony’s modern OLEDs are sick. There are a few in my family, and they have the best processing I’ve seen, they decode massive Blu-ray rips no problem, and they have native options for a clean, ad-free UI.
Why TF aren’t people buying them?
That’s a huge generalization, and it depends on what you use your system for. Some people might be on an old Threadripper workstation that works fine, for instance, and can slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.
I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the body and use them all at once.
I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.
…That being said, there are a lot of trends going against people, especially for gaming:
There’s “initial build FOMO” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.
We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest lived.
Time gaps between generations are growing as silicon gets more expensive to design.
…Buyers are collectively stupid and bandwagon. See: the crazy low-end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they are rewarding bad behavior from companies.
Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.
You can still keep your PSU, case, CPU cooler, storage and such. It’s a drop in the bucket cost-wise, but it’s not nothing.
IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.