That’s entirely speculative. There are diminishing returns. Unless you’re going to host your own YouTube, the use case for 50Gbps connections to the home is quite small. 4K video streaming at Ultra HD Blu-ray bitrates doesn’t even come close to saturating 1Gbps, and all streaming services compress 4K video significantly more than what Ultra HD Blu-ray offers. The server side is the limit, not home connections.
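For scale: UHD Blu-ray video tops out around 100 Mbit/s, a tenth of a 1 Gbps link, and typical 4K streams run more like 15–25 Mbit/s, a few percent of it.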
Now, if you want to talk about self-hosting stuff and returning the Internet to a more peer-to-peer architecture, then you need IPv6. Having any kind of NAT in the way is not going to work. Connection speed still isn’t that important.
Take a look at devContainers as an idea that might be generalized. They’re just Docker containers, so big but not huge, but look at the use case.

devContainers are a complete portable development environment, with support from major IDEs. Let’s say I want to work on a Java service. I open my IDE, it pulls the latest Java devContainer with my environment and all my tools, fetches the latest from git, and I’m ready to go. The problem with this use case is that I’m waiting the whole time. I don’t want to sit around for a minute or two every time I want to edit a program. The latest copy needs to be here, now, as I open my IDE.

But you could generalize this idea. Maybe it’s the next ChromeOS-like thing. All you need is something that can run containers, and everything you do starts with downloading a container with everything you need. If something like this happens, there’s a great example of needing to be responsive with a lot more data.
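For anyone who hasn’t seen one, a minimal devcontainer.json looks something like this (the image tag, feature, and extension IDs here are just illustrative):

    {
      // Base toolchain image; tag is an example
      "name": "java-service",
      "image": "mcr.microsoft.com/devcontainers/java:21",
      // Extra tooling layered in as "features"
      "features": {
        "ghcr.io/devcontainers/features/docker-in-docker:2": {}
      },
      // IDE-side setup (VS Code shown here)
      "customizations": {
        "vscode": {
          "extensions": ["vscjava.vscode-java-pack"]
        }
      },
      // Warm the dependency cache once the container is up
      "postCreateCommand": "./mvnw -q dependency:go-offline"
    }

The IDE reads this file, pulls the image, and drops you into a ready-made environment; the download of that image is exactly the wait I’m complaining about.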
Maybe don’t rely on cloud garbage for basic development?
Technically I don’t. I’m also the guy running CI/CD building devContainers for my engineers. They no longer have to worry about updating certificates, tools, versions, or security patches, and IT doesn’t have to worry about a lot of crap on their laptops that IT doesn’t manage. Engineers can use a standard laptop install and just get the latest of everything they need, scanned and verified, as soon as it’s available. And since it’s all automated, I can support many variations; yes, they can pull any older version from the repo if they need to, and every project can easily be on different versions of different tools and languages.
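The pipeline itself is nothing exotic. A sketch of the rebuild job, assuming the @devcontainers/cli and a private registry (the registry name, tag, and scanner are placeholders; use whatever your org runs):

    # Rebuild the image from the project's devcontainer config
    devcontainer build \
      --workspace-folder ./java-service \
      --image-name registry.example.com/devcontainers/java-service:latest

    # Scan before publishing (trivy shown as one example)
    trivy image registry.example.com/devcontainers/java-service:latest

    # Publish so every engineer's next IDE session picks it up
    docker push registry.example.com/devcontainers/java-service:latest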
At work, I’m on the same network, but working from home, I still need the responsiveness to do my job
There could be some new thing that no one has even bothered to think about because of the limitations. Imagine proposing streaming back when downloading a few kilobytes an hour was considered reasonable; people would have laughed at the very thought of it.
We’re not using the bandwidth we have. Many US cities have service with 1Gbps download speed available. I have it for my own reasons. Servers are the bottleneck; they rarely even reach half that speed.
If we’re not using 1Gbps, why should we believe something would pop up if we had 50Gbps?
Now, direct addressing where everyone can be a server and bandwidth utilization is spread more towards the edges of the network? Then you have something that could saturate 1Gbps. But you can’t do that on IPv4.
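The difference is easy to see in code. A listener like this (Python; the port is arbitrary) is reachable by any peer on the Internet when the host has a global IPv6 address, with no middlebox to traverse:

    # Minimal sketch: a service anyone can reach directly, given a
    # global IPv6 address and no NAT in the path.
    import socket

    srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("::", 8080))  # listen on every IPv6 address the host has
    srv.listen()
    while True:
        conn, peer = srv.accept()  # peers connect straight to us
        conn.sendall(b"hello from the edge\n")
        conn.close()

Behind IPv4 NAT the same code runs fine, but nobody outside can initiate a connection to it without port forwarding or hole punching.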
Unless you’re going to host your own YouTube…
This is exactly what PeerTube is struggling with. This bandwidth would solve the video federation problem.
See, you get it!
Except we need IPv6 before that’s at all viable.
We are not even filling the pipes we have to the home right now. “If you build it, they will come” does not apply when there’s already something there that isn’t being fully utilized.
Oh, maybe. I’m not familiar with bandwidth utilization in China.
How exactly does NAT prevent that? On good hardware it adds insignificant latency.
It has nothing to do with latency, and everything to do with not being able to directly address things behind NAT.
Edit: and please, nobody argue that NAT increases security. That dumbass argument should have died the moment it was first uttered.