Back in college we had 100Mbit internet connections in our dorm rooms when most people had 10Mbit cable or DSL at most. At the time it was considered ridiculously fast and certainly not something an average consumer would ever need.
> In 2023, there are very few uses for home users that will exceed what a 1G connection can provide
Video games are getting bigger all the time. The latest Call of Duty is apparently 200GB. On 1Gbit you are limited to 125MB/s downloads (assuming zero overhead), so that's almost half an hour to download. PCIe4 SSDs are capable of write speeds of about 7GB/s, and PCIe5 SSDs are just hitting the market with even faster speeds. At 10Gbit you can download that game in less than 3 minutes. In neither case are you even approaching the speed at which your PC can store that data.
When PCIe5 SSDs go mainstream, a home PC user would even be able to saturate a 100Gbit connection.
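If you want to sanity-check those figures yourself, here's a minimal back-of-envelope sketch (the 200GB install size and zero-overhead assumption are illustrative; real-world throughput will be a bit lower):

```python
# Back-of-envelope download times for a ~200GB game install,
# assuming zero protocol overhead (real throughput will be lower).
GAME_SIZE_GB = 200

def download_minutes(link_gbps: float) -> float:
    bytes_per_second = link_gbps * 1e9 / 8
    return GAME_SIZE_GB * 1e9 / bytes_per_second / 60

for link_gbps in (1, 2.5, 10, 100):
    print(f"{link_gbps:>5} Gbit/s -> {download_minutes(link_gbps):5.1f} min")

# 1 Gbit/s   -> ~26.7 min
# 10 Gbit/s  -> ~2.7 min
# 100 Gbit/s -> ~0.3 min, at which point a ~7GB/s PCIe4 SSD,
# not the link, becomes the bottleneck.
```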
Please, I implore you, read the line you quoted again, and then perhaps pull out the old Oxford English and look up what "very few" means.
I'll be generous and give you a hint: it doesn't mean none.
But your example also has great relevance to the "familiar" sentence in my original comment, which was:
> 2.5G is rapidly approaching, if not already past the point, for a lot of people where a single machine will never use all of that capacity, and the advantage of higher total bandwidth is to support multiple people doing high bandwidth tasks.
I italicised the part that I knew people would somehow ignore in my original comment and I've done it again, because obviously once wasn't enough.
Here, let me pull out the important words yet again just to make it really clear:
Not the commenter, but I’ve heard people make statements like that time and time again, only for those limits to be obliterated a few years later.
The thing is, the moment a new upper bound becomes available, developers find a way to use it. It’s like the freeway problem, where adding more roads ironically adds to congestion.
Take storage: as the storage capacity of media increased, game assets grew larger. The faster CPUs and system memory became, the heavier our operating systems and desktop software became.
Likewise, the faster our internet becomes, the more dependent we will become on streaming high fidelity content. 4k on a lot of streaming services is compressed to hell and back to work with people on slower internet connections. And even though Google Stadia was shut down, video game streaming services aren’t a failed experiment. Plus even with more traditional services, how many of us roll our eyes at multi-hour download times for new games?
Once gigabit internet becomes the norm (it’s commonplace in a lot of countries already, but it’s not quite the norm yet), you’ll see more and more services upscale to support it, and those on the cutting edge of the tech curve will find that gigabit internet isn’t quite fast enough any more. And that will happen sooner than you think.
> 4k on a lot of streaming services is compressed to hell and back to work with people on slower internet connections.
A 4K Ultra HD Blu-ray (that's 100GB for one movie) has a maximum bitrate of "just" 144Mbps. If you're suggesting online streaming services have some swathe of content that's (checks notes) in excess of 7x the bitrate used for 4K Blu-ray discs, I'd love to hear about it.
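For what it's worth, a rough sanity check of those disc numbers (the ~2.5 hour runtime below is an assumption for illustration, not a spec):

```python
# Rough sanity check: average bitrate of a 100GB disc over an
# assumed ~2.5 hour runtime, versus the 144Mbps peak quoted above.
disc_gb = 100
runtime_s = 2.5 * 3600          # assumed feature length, for illustration
avg_mbps = disc_gb * 8e9 / runtime_s / 1e6
print(f"average bitrate: ~{avg_mbps:.0f} Mbit/s")  # ~89 Mbit/s

# Even the 144Mbps peak is only around a seventh of a gigabit link,
# hence the "7x" comparison above.
```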
> video game streaming services aren’t a failed experiment
I'd have thought latency was a far bigger concern here, but even if not: it's still just sending you a 4K video stream... it just happens to be a stream that's reacting to your input.
> A 4K Ultra HD Blu-ray (that's 100GB for one movie) has a maximum bitrate of "just" 144Mbps. If you're suggesting online streaming services have some swathe of content that's (checks notes) in excess of 7x the bitrate used for 4K Blu-ray discs, I'd love to hear about it.
We are still a long way off parity with what our eyes can process, so there's plenty of room for bitrates to grow.
Plus the average internet connection isn't just streaming a video. It's kids watching online videos while adults are video conferencing and music is being streamed in the background. Probably with games being downloaded and software getting updated too.
A few hundred Mbps here, another few hundred there, and you quickly exceed 1 gigabit.
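As a rough illustration (the per-stream bitrates below are assumed round numbers for the sake of the example, not measurements of any particular service):

```python
# A hypothetical busy household; per-stream bitrates are assumed
# round numbers, not measurements of any specific service.
streams_mbps = {
    "4K video stream": 25,
    "second 4K video stream": 25,
    "video conference": 5,
    "music streaming": 0.3,
    "game download from a fast CDN": 300,
    "OS / software updates": 100,
}
total_mbps = sum(streams_mbps.values())
print(f"total: ~{total_mbps:.0f} Mbit/s")  # ~455 Mbit/s in this sketch

# Add another big download or two and you're past a gigabit.
```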
> I'd have thought latency was a far bigger concern here, but even if not: it's still just sending you a 4K video stream... it just happens to be a stream that's reacting to your input.
Latency and jitter matter too. But latency and bandwidth aren't mutually exclusive concerns.
Plus if you're streaming VR content, that's multiple 4K streams per device. And that's on top of all the other concurrent network operations (as mentioned above).
You're also still thinking purely about current tech. My point was that developers create new tech to take advantage of higher specs. It's easy to scoff at comments like this but I've seen this happen many times in my lifetime -- the history of tech speaks for itself.
> Plus the average internet connection isn't just streaming a video. It's kids watching online videos while adults are video conferencing and music is being streamed in the background. Probably with games being downloaded and software getting updated too.
That's exactly the scenario I gave where 2.5G WAN would be useful, but a 1G LAN to each machine is likely enough for most tasks, for most people: multiple users in simultaneous use.
> That's exactly the scenario I gave where 2.5G WAN would be useful, but a 1G LAN to each machine...
You're moving the goalposts now, because your original comment, the one that sparked this discussion, neither mentioned 2.5G WAN nor that your 1G comment was specific to each machine rather than internet connectivity as a whole.
> but a 1G LAN to each machine is likely enough for most tasks, for most people: multiple users in simultaneous use.
For today, yes. But you're demonstrating a massive failure of imagination by assuming those needs are going to be the same in a few years' time. For example, the 4K figures you're quoting are fine and dandy if you don't take into account that TV manufacturers are going to want to sell newer screens. Which means more emphasis on content with higher colour depths, refresh rates and resolutions. This isn't even a theoretical point: there are already 8K @ 120fps videos on YouTube.
Heck, I've already hit the 1GbE limit for a few specific bits of hardware in my home setup. Mainly my home server and some "backbone" wiring between two core switches which join two separate buildings on my property. But if I'm hitting that limit today, it's not going to be many more years before other people start hitting it for far less esoteric reasons than mine.
You're also overlooking the fact that if you have a router providing GbE to desktops and WiFi 6 to other devices, it's very unlikely to be powerful enough to switch all of those devices at gigabit speeds, let alone route at 2.5G to the WAN. And that's with regular IPv4 packets, never mind the additional overhead that IPv6 adds. Underpowered consumer networking equipment is already impacting home users right now. So again, we are seeing limits being hit already.
---
Let's also not forget all the other noise being introduced into homes. Smart speakers uploading voice recordings for speech-to-text analysis. Smart doorbells and other security devices uploading video. Smart lights, fridges, plugs, plant pots and whatever else phoning home. Set top TV boxes and other multimedia devices phoning home, downloading software updates and streaming adverts. In fact, have you run Wireshark on an average home network recently? There is a lot of noise these days, and that's only set to grow.
My original comment, in reply to someone saying "a lot of people already get more than 1G from their ISP" and implying that it's therefore worthwhile to have 2.5G Ethernet on all local devices, ends with:
> In this scenario a 2.5G (or 10G) router is all that's really required to get the benefit, while using the existing 20 year old wiring.
I'm sorry if the correlation between having a 2.5G router and having greater than 1G WAN wasn't obvious to you.
Complaining that a quasi-backbone link saturates gigabit Ethernet, when my entire point was that single computers are unlikely to need more, rather misses the point I was making, just to find an excuse to complain.
I never said no one needs more than gig for anything.
AFAIK we're still very far below the dynamic range human eyes are capable of seeing, so there's plenty of room to increase the bit depth (and bitrate) of video as displays improve. Our color gamuts also don't cover the full range of human vision.
So I had to use a calculator to help me here, and I used https://toolstud.io/video/bitrate.php, but apparently the raw bitrate for 4K@25fps/24bit is 4.98Gbps, which then obviously gets compressed by various codecs.
Taking the above 4K@25fps/24bit and pumping it up to 60fps and 36bit colour (i.e. 12 bits per channel, or 68 billion colours, 4096x as many colours as 24bit, and 64x as many colours as 30bit) the resulting raw video bitrate is 17.92Gbps... so it's an increase of <checks notes> about 3.6x.
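The arithmetic is simple enough to check without the calculator, if anyone wants to play with the numbers:

```python
# Raw (uncompressed) video bitrate: pixels per frame x frames per
# second x bits per pixel.
def raw_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

print(raw_gbps(3840, 2160, 25, 24))  # ~4.98 Gbit/s (4K @ 25fps, 24-bit)
print(raw_gbps(3840, 2160, 60, 36))  # ~17.92 Gbit/s (4K @ 60fps, 36-bit)
# 17.92 / 4.98 ≈ 3.6x
```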
It seems quite unlikely that we'll have every other aspect of 36bit/60fps video sorted out but somehow be stuck with codecs that perform worse than the ones already available today.
My understanding is that today's HDR sensors and displays can do ~13 stops of dynamic range, while humans can see at least ~20, though I'm not sure how to translate that into how much additional bit depth ought to be needed (naively, I might guess at 48 bits being enough).
I don't see why we'd stop at 60fps when 120 or even 240 Hz displays are already available. 8K displays already exist too. The codecs also have tunable quality, and obviously no one is sending lossless video, so we can always increase the quality level when encoding.
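Pushing the same back-of-envelope calculation to specs that already exist on the display side gives a sense of how much headroom is left (raw, pre-codec numbers, purely for scale):

```python
# Same arithmetic as above, at 8K, 120fps, 36-bit colour.
def raw_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

print(f"~{raw_gbps(7680, 4320, 120, 36):.0f} Gbit/s raw")  # ~143 Gbit/s before compression
```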
So it's true in 2023 (especially since no one will stream quality that high to you), but one can easily imagine boring incremental technology improvements that would demand more. There's plenty of room for video quality to increase before we reach the limitations of human eyes.
I guess I shouldn't be surprised that someone who believes the debunked Gates quote is real also can't comprehend the difference between "for anyone" and "for a lot of people".
Now where have I heard that before...