> 4k on a lot of streaming services is compressed to hell and back to work with people on slower internet connections.
A 4K UltraHD Bluray (that's 100GB for one movie) has a maximum bitrate of "just" 144Mbps. If you're suggesting online streaming services have some swathe of content that's (checks notes) in excess of 7x the bitrate used for 4K Bluray discs, I'd love to hear about it.
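For anyone who wants to check the arithmetic, here's a minimal sketch; the 1 Gbps figure is my assumed comparison point (the connection speed under discussion), not something from the quoted comment:

```python
# Sanity check on the "7x" figure -- the 1 Gbps comparison point is an assumption.
bluray_max_mbps = 144   # max total bitrate of a 4K UHD Blu-ray
line_speed_mbps = 1000  # a 1 Gbps connection

ratio = line_speed_mbps / bluray_max_mbps
print(f"1 Gbps is ~{ratio:.1f}x the UHD Blu-ray bitrate ceiling")  # ~6.9x
```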
> video game streaming services aren’t a failed experiment
I'd have thought latency was a far bigger concern here, but even if not: it's still just sending you a 4K video stream.. it just happens to be a stream that's reacting to your input.
> A 4K UltraHD Bluray (that's 100GB for one movie) has a maximum bitrate of "just" 144Mbps. If you're suggesting online streaming services have some swathe of content that's (checks notes) in excess of 7x the bitrate used for 4K Bluray discs, I'd love to hear about it.
We are still a long way off parity with what our eyes can process, so there's plenty of room for bitrates to grow.
Plus the average internet connection isn't just streaming a video. It's kids watching online videos while adults are video conferencing and music is being streamed in the background. Probably with games being downloaded and software getting updated too.
A few hundred Mbps here, another few hundred there, and you quickly exceed 1 gigabit.
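If it helps, here's a rough tally of that scenario as a sketch; every per-activity figure below is an assumption for the sake of illustration, not a measurement:

```python
# Illustrative concurrent household demand, in Mbps -- all figures assumed.
household_mbps = {
    "4K video stream":     25,
    "second video stream": 25,
    "video conference":    10,
    "music streaming":      2,
    "game download":      500,  # stores like Steam will use whatever is spare
    "software updates":   300,
}

total = sum(household_mbps.values())
print(f"Concurrent demand: ~{total} Mbps of a 1000 Mbps link")
# ~862 Mbps -- most of a gigabit link before anything else joins in
```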
> I'd have thought latency was a far bigger concern here, but even if not: it's still just sending you a 4K video stream.. it just happens to be a stream that's reacting to your input.
Latency and jitter matter too. But those and bandwidth aren't mutually exclusive concerns.
Plus if you're streaming VR content then that is multiple 4k streams per device. And that's on top of all the other concurrent network operations (as mentioned above).
You're also still thinking purely about current tech. My point was that developers create new tech to take advantage of higher specs. It's easy to scoff at comments like this, but I've seen this happen many times in my lifetime -- the history of tech speaks for itself.
> Plus the average internet connection isn't just streaming a video. It's kids watching online videos while adults are video conferencing and music is being streamed in the background. Probably with games being downloaded and software getting updated too.
That's exactly the scenario I gave where 2.5G WAN would be useful, but a 1G LAN to each machine is likely enough for most tasks, for most people - multiple users' simultaneous use.
> That's exactly the scenario I gave where 2.5G WAN would be useful, but a 1G LAN to each machine...
You're moving the goalposts now: your original comment, the one that sparked this discussion, neither mentioned 2.5G WAN nor made clear that your 1G comment was specific to each machine rather than to internet connectivity as a whole.
> but a 1G LAN to each machine is likely enough for most tasks, for most people - multiple users' simultaneous use.
For today, yes. But you're demonstrating a massive failure of imagination by assuming those needs are going to be the same in a few years' time. For example, the 4k figures you're quoting are fine and dandy only if you ignore that TV manufacturers are going to want to sell newer screens, which means more emphasis on content with higher colour depths, refresh rates and resolutions. This isn't even a theoretical point: there are already 8k @ 120FPS videos on YouTube.
Heck, I've already hit the 1GbE limit for a few specific bits of hardware in my home setup. Mainly my home server and some "backbone" wiring between two core switches that join two separate buildings on my property. But if I'm hitting that limit today, then it's not going to be many more years before other people start hitting it for far less esoteric reasons than mine.
You're also overlooking the fact that if you have a router providing GbE to desktops and WiFi 6 to other devices, it's very unlikely to be powerful enough to switch all of those devices at gigabit speeds, let alone route at 2.5G to the WAN. And that's with regular IPv4 packets, never mind the additional overhead that IPv6 adds. Underpowered consumer networking equipment is already impacting home users right now. So again, we are seeing limits being hit already.
---
Let's also not forget all the other noise being introduced into homes. Smart speakers uploading voice recordings for speech-to-text analysis. Smart doorbells and other security devices uploading video. Smart lights, fridges, plugs, plant pots and whatever else phoning home. Set-top TV boxes and other multimedia devices phoning home, downloading software updates and streaming adverts. In fact, have you run Wireshark on your average home network recently? There is a lot of noise these days, and it's only set to grow.
My original comment, in reply to someone saying "a lot of people already get more than 1G from their ISP" and implying that it's therefore worthwhile to have 2.5GEth on all local devices, ends with:
> In this scenario a 2.5G (or 10G) router is all that's really required to get the benefit, while using the existing 20 year old wiring.
I'm sorry if the correlation between having a 2.5G router and having greater than 1G WAN wasn't obvious to you.
Complaining that a quasi-backbone link saturates gigabit Ethernet, when my entire point was that single computers are unlikely to need more, misses the whole point I was making just for the sake of having an excuse to complain.
I never said no one needs more than gig for anything.
AFAIK we're still very far below the dynamic range human eyes are capable of seeing, so there's plenty of room for bit depth (and bit rate) to need to increase for video as displays improve. Our color gamuts also do not cover the full range of human vision.
So I had to use a calculator to help me here, and I used https://toolstud.io/video/bitrate.php, but apparently the raw bitrate for 4K@25fps/24bit is 4.98Gbps, which then obviously gets compressed by various codecs.
Taking the above 4K@25fps/24bit and pumping it up to 60fps and 36bit colour (i.e. 12 bits per channel, or 68 billion colours, 4096x as many colours as 24bit, and 64x as many colours as 30bit) the resulting raw video bitrate is 17.92Gbps... so it's an increase of <checks notes> about 3.6x.
It seems quite unlikely that we'll have every other aspect of 36bit/60fps video sorted out but somehow end up with codecs that perform worse than what's already available today.
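For anyone who wants to reproduce or tweak those numbers, the calculation is just the standard raw-bitrate formula (pixels per frame x frames per second x bits per pixel); this small sketch matches the calculator output quoted above:

```python
# Raw (uncompressed) bitrate: pixels/frame x frames/second x bits/pixel.
def raw_bitrate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

today  = raw_bitrate_gbps(3840, 2160, 25, 24)  # 4K @ 25fps, 24-bit colour
future = raw_bitrate_gbps(3840, 2160, 60, 36)  # 4K @ 60fps, 36-bit colour

print(f"4K/25fps/24bit: {today:.2f} Gbps raw")   # ~4.98 Gbps
print(f"4K/60fps/36bit: {future:.2f} Gbps raw")  # ~17.92 Gbps
print(f"increase: {future / today:.1f}x")        # ~3.6x
```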
My understanding is that today's HDR sensors and displays can do ~13 stops of dynamic range, while humans can see at least ~20, though I'm not sure how to translate that into how much additional bit depth ought to be needed (naively, I might guess at 48 bits being enough).
I don't see why we'd stop at 60fps when 120 or even 240 Hz displays are already available. Also 8k displays already exist. The codecs also have tunable quality, and obviously no one is sending lossless video. So we can always increase the quality level when encoding.
So it's true in 2023 (especially since no one will stream quality that high to you), but one can easily imagine boring, incremental technology improvements that would demand more. There's plenty of room for video quality to increase before we reach the limitations of human eyes.
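To put a rough number on that headroom, here's a back-of-the-envelope sketch using the same raw-bitrate formula as above; the 8K/120fps/36-bit inputs and the 100:1 compression ratio are purely illustrative assumptions:

```python
# Raw bitrate for an assumed 8K @ 120fps, 36-bit colour stream.
def raw_bitrate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

raw = raw_bitrate_gbps(7680, 4320, 120, 36)
print(f"8K/120fps/36bit: {raw:.0f} Gbps raw")                           # ~143 Gbps
print(f"at an assumed 100:1 compression: {raw / 100 * 1000:.0f} Mbps")  # ~1433 Mbps
```

Even at that assumed compression ratio, it's already past what a 1 Gbps line can carry.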