

Yup, there are few efficient ways to handle that, so anything that does it looks something like everything else that handles it.
Sadly, not many things handle it :)
WYGIWYG
Plex does this on its own; it's one of the features they provide. The client knows when the server is local even though you go outside to make the initial connection, and they go through a lot of trouble to pull it off. You connect externally, Plex brokers the initial connection and proxies data back and forth to see if you can talk to each other directly, and once your client figures out the server is local, it switches over.
I don't know of any other video hosting package that does this; Jellyfin certainly doesn't. I *think* if you threw Tailscale in the middle, it would be able to do it without hairpinning, as long as you were using a local exit node instead of the tailnet. Traffic would still probably go through that local exit node.
As people have said, the Intel CPU with quick sync will be much better on power.
You could also use your M.2 to cache your regular hard drive with BTRFS and LVM, or something like https://bcache.evilpiepirate.org/
Maybe spin down your HDD when it's not being used. Most of your power savings will come from not transcoding unless you need to, transcoding efficiently when you do, and powering things down when you don't need them.
In Linux you can mess with your clock regulation, probably even put the box to sleep when you don't need it, and use wake-on-LAN to bring it back.
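A rough sketch of the power-saving knobs mentioned above. The device names (`/dev/sda`, `eth0`) and the MAC address are placeholders, so adjust for your box:

```shell
# Spin the HDD down after 10 minutes idle (-S 120 = 120 * 5 seconds).
sudo hdparm -S 120 /dev/sda

# Use the powersave CPU frequency governor.
echo powersave | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor

# Enable wake-on-LAN (magic packet) so the box can sleep between uses.
sudo ethtool -s eth0 wol g

# Wake it later from another machine (needs the server's MAC address).
wakeonlan aa:bb:cc:dd:ee:ff
```

Note that `hdparm -S` support varies by drive, and some NICs reset the WoL flag on reboot, so you may need to reapply it via a systemd unit or udev rule.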
Notes: Facer is a little bit of a PITA to install; follow the instructions very carefully.
GW4: It eats about twice as much battery as my normal faces, but I’m still lasting the day.
The text is WAY too small, but it’s pretty fun.
I’m not going to be bullied by liberal arts partisans into reconfiguring how my brain works
I hate to tell you this, but that's a fascist argument: using tradition to block out change and the acceptance of others. It's a "screw those other races'/religions' history and feelings because it makes me feel less powerful" kind of statement. I doubt that's your intent, but there it is.
https://www.etymonline.com/word/master
contrastive adjective (“he who is greater”)
Which is to say the term predates slavery; that doesn't make it neutral today.
Music used it because the original is of greater quality. The term is technically and syntactically correct here.
Slavery used the term correctly, extensively and horribly. Honestly, it tainted it.
Most of the people who say it's no big deal or that they don't care have ancestors who were on the unimpacted or positively impacted side of slavery. Very "let them eat cake" tones (even though that quote itself is apocryphal).
To be perfectly honest, the term in its etymological roots doesn’t fit well in the digital age for common use cases. It’s fallen into common parlance from the analog era, when it had a more direct meaning. Even though it’s not regularly being used as a pejorative, there’s (not zero, but) little harm in slowly phasing it out for better, more accurate terms like main or trunk or origin.
never a free lunch :(
thanks!
ok damnit, i’ll install facer :)
That was a little bit of a fight to install, but not miserable.
Wish the font was a touch larger and it would be cool to have some complications.
still, pretty fire.
deleted by creator
You don't really want to live-transcode 4K; that takes a tremendous amount of horsepower to do in real time. When you rip your movies, make sure they're in a format whatever player you're using can handle. If that means using a streaming stick instead of the app on your TV, that's what you do. I think you could technically do it with a 10th-gen-or-newer Intel with embedded video; I know that an Nvidia 2070 Super on a 7th-gen Intel will not get the job done for an upper-end Roku. So all of my 4K video is either H.264 or HEVC, and it all direct-plays on my flavor of Roku.
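The "rip to a format your player handles" idea can be sketched as a little rip-time check. The compatibility table here is illustrative (only the Roku H.264/HEVC entry comes from my setup), not an authoritative device matrix:

```python
# Decide at rip time whether a file will direct-play or force a server transcode.
# Table values are examples -- check your actual player's supported codecs.
DIRECT_PLAY = {
    "roku": {"h264", "hevc"},       # what direct-plays on my flavor of Roku
    "chromecast": {"h264", "vp9"},  # hypothetical example entry
}

def can_direct_play(player: str, video_codec: str) -> bool:
    """True if the player can play this codec without a server-side transcode."""
    return video_codec.lower() in DIRECT_PLAY.get(player.lower(), set())

print(can_direct_play("roku", "HEVC"))  # True -> rip as-is
print(can_direct_play("roku", "av1"))   # False -> remux/re-encode to HEVC first
```

You can get the real codec of a file with `ffprobe` and feed it into a check like this before deciding whether to re-encode.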
Fork their project :) CLI clients are easy enough to use on PC, Mac, and Linux; all we need is for someone to build Android and iOS clients.
They have enough open source code out there to make the CLI clients and server.
It could be forked right now and turned into a separate project: BSD-3 license, re-release with modifications. A couple of multi-platform devs could manage it.
DAS is 1:1; it's more or less like just connecting an external hard drive to your computer.
A SAN can do some crazier stuff. You can take arrays, carve them into LUNs, and assign LUNs to separate computers. You have Fibre Channel routing and virtual networks, sometimes iSCSI. But that stuff is extremely expensive and power-hungry, and did I mention extremely expensive?
A NAS is basically just a computer with disks attached to it, sharing the data through whatever protocol you need.
For home use, even sharing with extended family, TrueNAS, unraid, or just a computer with ZFS is ideal.
ZFS is the elite but slightly harder way to do it. Your volumes all need to be the same size even if your disks are different sizes, and there's regular maintenance (scrubs) that needs to run. But it's very fast, very flexible, and very easy to expand.
Unraid is very slow but very flexible. The disks aren't in a RAID, they're in a JBOD, so it's really, really slow. But if you lose one disk, all you've lost is the data on that disk, and you can run up to two parity disks, as long as your parity drives are at least as large as your largest data drive.
TrueNAS is more of an unraid-type situation, but with ZFS. Both unraid and TrueNAS support virtualization and/or containers for running applications, and give you nice metrics and meters and stuff.
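The single-parity idea (one parity drive protecting any one failed data disk) is just XOR across the disks. A toy demo, not how unraid literally implements it (real schemes, especially dual parity, are more involved):

```python
# Toy single-parity demo: parity = XOR of every data disk, byte by byte,
# so any ONE lost disk can be rebuilt from the survivors plus parity.
from functools import reduce

disks = [b"\x01\x02\x03", b"\x0a\x0b\x0c", b"\xf0\xf1\xf2"]  # 3 tiny "disks"

def xor_blocks(blocks):
    """XOR same-sized byte blocks together, column by column."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

parity = xor_blocks(disks)

# Disk 1 dies; rebuild it from the remaining disks plus parity.
survivors = [disks[0], disks[2], parity]
rebuilt = xor_blocks(survivors)

print(rebuilt == disks[1])  # True
```

This is also why the parity drive has to be at least as large as the biggest data drive: it needs a parity byte for every byte position on every disk.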
Or you can hand-roll it with Debian, ZFS, Docker, and Proxmox.
I think DAS is pretty much dead. If you have a ton of ephemeral data and need to do high-speed work on it, it's a reasonable solution, but for the most part 8 TB NVMe has made it pretty niche.
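The hand-rolled route boils down to a few commands. Pool name (`tank`), dataset layout, and disk IDs below are placeholders; use your own `/dev/disk/by-id` paths:

```shell
# Debian: install ZFS tooling and Docker.
sudo apt install zfsutils-linux docker.io

# Mirrored pool from two disks (use raidz2 with 4+ disks instead).
sudo zpool create tank mirror \
  /dev/disk/by-id/ata-DISK1 /dev/disk/by-id/ata-DISK2

# A media dataset with sensible defaults.
sudo zfs create -o compression=lz4 -o atime=off tank/media

# Monthly scrub -- the "regular maintenance" part.
echo '0 3 1 * * root /usr/sbin/zpool scrub tank' | sudo tee /etc/cron.d/zfs-scrub
```

From there, Docker containers just bind-mount datasets under `/tank`, and Proxmox (if you go that route) can sit on top for VMs.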
Porque no los dos?
There is no functional difference between them scraping you systematically and them coming to you on behalf of a user. They're coming to scrape you either way; being asked by someone just makes them do it in a smarter fashion.
Also, even if you're not using Gemini, damned if Google.com doesn't search you with it anyway. They want these AIs trained badly. Sooner or later almost all searching will be done through AI; eventually there will be no option.
You are correct that blocking all AI calls will eventually make your search results not work.
So if you want organic traffic, you have to allow AI scraping eventually; you'll just get diminishing returns up to a point.
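If you want the middle ground today, the knob is `robots.txt`: keep the search crawler, opt out of the training-only tokens where a separate one exists. The user-agent tokens below are the published ones; whether a crawler honors them is up to the crawler:

```
# Keep classic search indexing.
User-agent: Googlebot
Allow: /

# Opt out of Google's AI training crawler.
User-agent: Google-Extended
Disallow: /

# Opt out of OpenAI's training crawler.
User-agent: GPTBot
Disallow: /
```

Note this doesn't block AI-powered features inside Google Search itself, which is the "no option" problem above.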
Oh, Plex has the risk. A vulnerability in Plex is how LastPass lost all their source code. A vulnerability in Tautulli, which the engineer had exposed externally, surfaced his auth token; the attacker used that token to get into Plex, hit an RCE vulnerability, and pulled the entire git repo the guy had locally.
The key difference is Plex at least has a security team and their name on the line with their investors.
but, think of it… RACING STRIPES!!! or FLAMES!!!
You use bamboo skewers to mount the things off the bottom and dampen vibration. Maybe use an internal flap and vent the disks out the front and the PSU out the back. If you have enough cardboard, you could even bend it a bit and do like a jet engine with the fan sticking out the front.
Cardboard papercraft homelab… I almost want to get rid of my 42U rack and make a Voltron now.
Just needs a 10" cardboard box with proper holes
A lot of neophyte self-hosters will try running the binary in Windows instead. Experienced self-hosters will indeed use Docker.
Then, of the ones who are using Docker, some will set it up as privileged.
And then how many of those people actually mount media read-only, versus just adding the path and not thinking about it?
Don’t confuse your good practices with what the average person will do.
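The read-only point is one flag. A sketch with example paths (adjust for your layout), using Jellyfin's official image:

```shell
# :ro makes the media bind mount read-only inside the container, so even a
# compromised container can stream the library but not modify or delete it.
docker run -d --name jellyfin \
  -v /srv/media:/media:ro \
  -v /srv/jellyfin-config:/config \
  -p 8096:8096 \
  jellyfin/jellyfin
```

The same goes for compose files: `- /srv/media:/media:ro` under `volumes:`. And notably, no `--privileged` anywhere.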
I’ve heard jellyfin has a lot of security issues
The biggest known stuff I saw on their GitHub is that a number of the exposed service URLs under the hood don't require auth. So it's open source with known endpoints, you can easily tell from the outside that it's running, and you can cause it to activate a LOT of packages without logging in. That's one zero-day in any package that can be passed a payload away from disaster.
As far as tvOS, I'm kind of surprised Swiftfin doesn't work for you.
I did Linux on the desktop for 15 years. I was primarily Windows at home, Linux at work. With a job change, I took a detour through Mac for a couple of years, then WSL hit, and I ran Windows for quite a while.
I dropped back in, but only at home, when Bookworm landed. I was playing Steam games with video acceleration right out of the gate. For a lot of people it's just going to work, and updates are just going to work. Now that a lot of shit's going Electron, a lot of apps that had an edge on Windows are now identical through their web interfaces.
If you're not playing games with a lot of anti-cheat, not using proprietary hardware, and don't need access to some Windows-only apps (or you can put up with Wine), all the distros are at the point where they operate just as you'd expect them to.