Note that this would be a non-issue if you could just connect a cable between your Mac and the Vision. You already have the huge battery pack dangling off the side. This is an Apple decision, not a technical one.
Do the math on how much bandwidth three windows/screens (because in this model each window is basically its own screen) at 4K/100 Hz/10-bit color each would take.
You hit the limits of TB4 _very quickly_.
You can compress the image, try something smart with foveated rendering (only stream the windows the user is looking at, though that breaks if you want to keep a window with logs in your peripheral vision), use chroma subsampling, etc., but all of those trade off against image quality.
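The back-of-the-envelope math is easy to sketch. Assuming 4K UHD (3840×2160) per window, 10 bits per RGB channel, and TB4's 40 Gbit/s raw link rate (the usable data rate is lower still):

```python
# Back-of-the-envelope bandwidth for the scenario above: three 4K
# windows at 100 Hz with 10-bit color, uncompressed RGB vs. 4:2:0.
PIXELS_4K = 3840 * 2160
FPS, WINDOWS = 100, 3
TB4_GBPS = 40  # Thunderbolt 4 raw link rate; usable data rate is lower

def gbps(bits_per_pixel):
    return PIXELS_4K * FPS * bits_per_pixel * WINDOWS / 1e9

print(f"RGB 10-bit (30 bpp):   {gbps(30):.1f} Gbit/s")  # ~74.6, nearly 2x TB4
# 4:2:0 subsampling keeps luma at full resolution and quarters both
# chroma planes: 10 + 2 * (10 / 4) = 15 bits per pixel.
print(f"4:2:0 10-bit (15 bpp): {gbps(15):.1f} Gbit/s")  # ~37.3, barely fits
```

Even with 4:2:0 (which visibly softens colored text edges, exactly what you don't want on a code or log window) three streams only just squeeze under the raw link rate.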
You don't need to send every pixel for every frame uncompressed.
It would be more like VNC, not re-sending pixels that aren't changing. And you don't really need 100 Hz either: you can't read that fast anyway, so the content could refresh at a lower rate as long as the window itself is moved at 100 Hz to avoid nausea.
I really doubt the Mac display streaming as it exists today refreshes at 100 Hz with full pixel fidelity and no compression. Wi-Fi can't handle those kinds of speeds reliably.
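The VNC-style idea above can be sketched with a simple dirty-tile pass: split each frame into tiles and resend only tiles that changed. This is a toy illustration over flat grayscale byte buffers (real protocols add encodings on top), with the tile size and window dimensions being arbitrary assumptions:

```python
TILE = 64  # compare frames in 64x64 tiles, as VNC-style protocols do

def dirty_tiles(prev, curr, width, height, tile=TILE):
    """Frames are flat grayscale byte buffers; return the (row, col)
    indices of tiles containing at least one changed pixel."""
    dirty = []
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            changed = any(
                prev[y * width + tx : y * width + tx + tile]
                != curr[y * width + tx : y * width + tx + tile]
                for y in range(ty, min(ty + tile, height))
            )
            if changed:
                dirty.append((ty // tile, tx // tile))
    return dirty

# A mostly static log window: one new line of text appears near the bottom.
W, H = 256, 256
prev = bytearray(W * H)
curr = bytearray(prev)
for y in range(192, 200):        # the new line occupies rows 192-199...
    for x in range(0, 128):      # ...spanning the left two tile columns
        curr[y * W + x] = 255
print(dirty_tiles(prev, curr, W, H))   # [(3, 0), (3, 1)]
```

For a log window that scrolls once a second, only 2 of 16 tiles need resending that frame; the bandwidth saving relative to full-frame streaming is proportional to how static the content is.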
Foveated rendering doesn't stop streaming when the user isn't looking at a window. It just streams it at lower quality.
If you just trivially streamed all windows to the Vision, of course you'd hit the limits of current technology at some point. But I would assume a company like Apple has the means to push the state of the art in transmission media (rather than just using a now four-year-old standard) and to come up with something smarter than "just stream all windows at maximum quality even when the user isn't looking at them".
I may be wrong, but it honestly doesn't seem realistic to expect seamless, latency-free foveated-rendering switching at those speeds while coordinating between two devices.
Since foveated rendering would only send the resolution the user can actually perceive, even logs in the peripheral space would be fine: they would just be sent at much lower resolution. I think the real challenge with smart foveated rendering would be latency.
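The "lower resolution, never zero" behavior could look something like this: pick a per-window resolution scale from the angular distance between the window and the gaze direction. The exponential falloff, the 10° constant, and the 0.25 floor are all made-up illustrative numbers, not anything Apple has published:

```python
import math

# Hypothetical foveation curve: full resolution where the user is
# looking, decaying to a floor (never zero) in the periphery, so a
# peripheral log window keeps streaming at reduced resolution.
def stream_scale(window_deg, gaze_deg, falloff_deg=10.0, floor=0.25):
    """Resolution scale for a window whose center sits at an angular
    position (in degrees) relative to the gaze direction."""
    eccentricity = math.dist(window_deg, gaze_deg)
    return max(floor, math.exp(-eccentricity / falloff_deg))

print(stream_scale((0, 0), (0, 0)))    # 1.0: window under the gaze
print(stream_scale((30, 0), (0, 0)))   # 0.25: peripheral, still streamed
```

The latency problem is that on a saccade to the 30°-away window, the sender has to be told the gaze moved and push a full-resolution frame before the eye lands, or the user briefly sees the low-resolution version.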
Another option would be handling rendering on the Vision Pro rather than the MacBook, so pixels wouldn't need to be streamed at all.
Exactly. Current systems that do real-time compositing of large channel counts of high-res, deep-colour video are also around 6U 19" rack-mount units and pull half a kilowatt of power. Not exactly ergonomic for strapping to one's face.
It'd break if you did the dumbest thing and just didn't stream the windows the user isn't looking at. Lowering the stream resolution on the fly could work, but that adds complexity (both sides have to coordinate when to adjust the resolution), and because it's not handled entirely on-device, it breaks the illusion of it being invisible.
I can also imagine it having some weird privacy implications, like Mac apps somehow monitoring this and tracking whether you're actually looking at them.
If you're OK with rendering everything at high resolution (and then choosing what quality to send over the wire), you shouldn't have any privacy issues, assuming that part is handled by the OS.
It's the product vision, but progress is reportedly slow because of physics and the current limits of the technology. Their product vision could take another 10–30 years.
No, it’s a processor and graphics card issue as well. You have to actually render a potentially unlimited number of 4K screens (at least the ones in your direct view) on the Mac Pro. It was never built to do that.
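The raw fill-rate side of that claim is easy to quantify. Assuming 4K UHD per virtual screen and the 100 Hz figure from earlier in the thread (illustrative numbers, not Apple's actual virtual-display resolution):

```python
# Rough pixel throughput just to render N virtual 4K displays at 100 Hz,
# before compositing, encoding, or streaming overhead is even counted.
PIXELS_4K = 3840 * 2160
FPS = 100

def gpixels_per_s(n_screens):
    return n_screens * PIXELS_4K * FPS / 1e9

for n in (1, 3, 8):
    print(f"{n} x 4K @ 100 Hz: {gpixels_per_s(n):.2f} Gpixel/s")
```

And each of those screens would also need to be encoded in real time for transport, which is its own fixed-function-hardware bottleneck: GPUs ship with a small number of hardware encoders, not an unlimited number.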