Hacker News

Not really?

Do the math on how much bandwidth three windows/screens (because with this model each window is basically its own screen) at 4K/100Hz/10-bit color each would take.

You're at limits of TB4 _very quickly_.
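
To make that concrete, here is the back-of-the-envelope arithmetic for the uncompressed case (assuming 4K means 3840x2160 and plain RGB at 10 bits per channel; real links also lose some capacity to protocol overhead):

```python
# Back-of-the-envelope bandwidth for three uncompressed 4K/100Hz/10-bit windows.
GBIT = 1e9

pixels = 3840 * 2160                        # pixels per 4K frame
bits_per_pixel = 3 * 10                     # RGB, 10 bits per channel
per_window = pixels * bits_per_pixel * 100  # bits/s at 100 Hz
total = 3 * per_window

print(f"one window:    {per_window / GBIT:.1f} Gbit/s")  # ~24.9 Gbit/s
print(f"three windows: {total / GBIT:.1f} Gbit/s")       # ~74.6 Gbit/s
print("TB4 link:      40.0 Gbit/s")
```

A single uncompressed window already eats more than half a Thunderbolt 4 link; three of them blow past it by nearly 2x.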

You can compress the image, try something smart with foveated rendering (only stream the windows the user is looking at; but that breaks if you want to keep a window with logs in your peripheral vision), use chroma subsampling, etc.; but all of those trade off against image quality.




> You're at limits of TB4 _very quickly_.

You don't need to send every pixel for every frame uncompressed.

It would be more like VNC, not resending pixels that aren't changing. And you don't really need 100Hz for the content either; you can't read that fast anyway, so the content could refresh at a lower rate as long as the window itself moves at 100Hz to avoid nausea.
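
A minimal sketch of that idea: split the frame into tiles and only mark tiles whose pixels changed since the last frame for resending (the 64-pixel tile size is an invented illustration; real codecs like H.264/HEVC use far smarter inter-frame prediction, but the bandwidth win comes from the same observation):

```python
# Dirty-tile sketch: a mostly static screen only needs a few tiles resent.
TILE = 64  # hypothetical tile edge in pixels

def dirty_tiles(prev, curr, width, height):
    """Yield (x, y) of tiles that differ between two flat pixel buffers."""
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            for y in range(ty, min(ty + TILE, height)):
                row = y * width
                end = row + min(tx + TILE, width)
                if prev[row + tx : end] != curr[row + tx : end]:
                    yield (tx, ty)
                    break  # one changed row is enough to mark the tile dirty

# A mostly static 256x256 frame where a single pixel changes.
w, h = 256, 256
prev = [0] * (w * h)
curr = prev.copy()
curr[5 * w + 7] = 1  # a blinking cursor, say

changed = list(dirty_tiles(prev, curr, w, h))
print(changed)  # only the one tile containing pixel (7, 5) needs resending
```

Out of 16 tiles, only one gets resent here; a terminal full of static log text costs almost nothing between updates.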

I really doubt the existing Mac display streaming refreshes at 100Hz with full pixel fidelity and no compression. WiFi can't handle those kinds of speeds reliably.


Foveated rendering doesn't stop streaming a window the user isn't looking at; it just streams it at lower quality.

If you just trivially streamed all windows to the Vision, of course it would at some point hit the limits of current technology. But I would assume a company like Apple has the means to push the state of the art in transmission media (rather than just using a now four-year-old standard) and to think of something smarter than "just stream all windows at maximum quality even if the user isn't looking at them".


I may be wrong, but it honestly doesn't seem realistic to expect seamless, latency-free foveated-rendering switching at those speeds while coordinating between two devices.


Since foveated rendering would only send the resolution the user could actually perceive, even logs in peripheral vision would be fine; they would just be sent at much lower resolution. I think the challenge with smart foveated rendering would likely be latency.
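
A toy model of that idea: scale each window's stream resolution by its angular distance from the gaze point. The falloff constant and 1/4-resolution floor here are made-up illustration values, not anything Apple has published:

```python
import math

def stream_scale(gaze, window_center, falloff_deg=30.0):
    """Toy foveation: full resolution at the gaze point, tapering with
    angular distance. `falloff_deg` is an invented tuning constant."""
    dx = window_center[0] - gaze[0]
    dy = window_center[1] - gaze[1]
    angle = math.hypot(dx, dy)  # degrees off the gaze axis, roughly
    return max(0.25, 1.0 - angle / (2 * falloff_deg))  # floor at 1/4 resolution

gaze = (0.0, 0.0)                       # looking straight ahead
print(stream_scale(gaze, (0.0, 0.0)))   # focused window: full resolution
print(stream_scale(gaze, (40.0, 0.0)))  # peripheral logs: about a third
```

Even a crude falloff like this cuts the peripheral windows' pixel budget by roughly an order of magnitude (a third of the resolution in each dimension is about a ninth of the pixels), which is why the hard part is reacting to gaze changes fast enough, not the savings themselves.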

Another option would be handling rendering on the Vision Pro rather than the MacBook so pixels don't need to be streamed at all.


Exactly. Current systems that do real-time compositing of large channel counts of high-res, deep-colour video are also around 6U 19" rack-mount units and pull half a kilowatt of power. Not exactly ergonomic for strapping to one's face.


Why would foveated rendering break down? It doesn't stop rendering where you aren't looking; it just lowers the resolution.


It'd break if you did the dumbest thing and just didn't stream the windows the user isn't looking at. Lowering the stream resolution on the fly could work, but it adds complexity (both sides have to communicate when to adjust the resolution), and because it's not handled entirely on-device, it breaks the illusion of being invisible.

I can also imagine some weird privacy implications, like Mac apps monitoring this to track whether you're actually looking at them, etc.


If you're OK with rendering everything at high resolution (and then choosing what quality to send over) then you shouldn't have any privacy issues, assuming that this part is done by the OS.


Foveated rendering doesn't work here, though not for the reason described in the GP: human eye movements are too fast relative to current latency and frame intervals for it to work.




