
> the technical VR "stuff that works" is basically written by Qualcomm.

I worked at Oculus from 2014-2019 on the product that became Quest, and this statement could not be more wrong. You need more than a Vulkan driver to make a VR headset, e.g.:

  * lenses that minimize distortion and chromatic aberration, that are lightweight and compact
  * astounding amounts of work on inside-out tracking to make it work within latency and power budgets with sufficient accuracy
  * latency reduction and prediction throughout the stack
  * foveated rendering
  * supporting all of those things in Unity and Unreal
And for every one of those there are 10 more that I'm forgetting.
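To make the latency/prediction item above concrete, here's a toy sketch (my own illustration, not actual Oculus code) of the basic idea: extrapolate head orientation by angular velocity so the frame is rendered for where the head will be at scan-out, not where it was when tracking was sampled.

```python
import math

def predict_yaw(yaw_rad, yaw_velocity_rad_s, latency_s):
    """Extrapolate one rotation axis forward by the expected
    motion-to-photon latency. Real trackers do this with full 3D
    quaternions and filtered IMU data; this is the one-axis version."""
    return yaw_rad + yaw_velocity_rad_s * latency_s

# Head turning at 200 deg/s with 20 ms of pipeline latency:
predicted = predict_yaw(0.0, math.radians(200), 0.020)
# Without prediction, the image would lag the head by ~4 degrees.
```

The numbers are illustrative; the point is that even a few milliseconds of unpredicted latency translates into visible angular error during fast head motion.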

> The user side Android system is, really badly done and irritating for people using it as people like Carmack have pointed out.

The user shell side we have today is entirely his creation, built in response to the less-than-stellar behavior of the Unity shell in the first release. I'm not sure Android is very material to the day-to-day user experience; it's just a particularly opinionated version of Linux.

There are plenty of missteps here, but none of them are things you've described.




The hand tracking has seen continuous improvement, and the depth-correct passthrough is pretty good; I'm sure it was quite a feat to accomplish.

It's worth noting that much of that $10 billion wasn't just spent on Horizon Worlds, as critics off-handedly suggest, but has been sunk into lots of acquisitions (hardware and software companies) and research, much of which has yet to appear in released products.

I'm very impressed with the ambition of what they are trying to do and their commitment to a longer-term vision, something that is often lacking at other companies. As an example, Nvidia worked on AI tooling and hardware for a decade before it really started to pay off big.


Just bought Meta's latest smart glasses, and I use them almost daily: for taking pics and short videos they work almost perfectly, and like most people I wear sunglasses most days anyway.

I'm not sure they will win this race, but the glasses are solid for the two use cases I mentioned. Other features aren't as solid; "Hey Meta" barely works.


> lenses that minimize distortion and chromatic aberration, that are lightweight and compact

That was Corning, not Meta.

> astounding amounts of work on inside-out tracking to make it work within latency and power budgets with sufficient accuracy

That was Valve, not Meta.

> latency reduction and prediction throughout the stack

That was Carmack, so technically Meta, but it was also 6+ years ago.

> foveated rendering

That was a waste of time.

> supporting all of those things in Unity and Unreal

I'm sorry, I haven't seen the Unreal code, but based on what's in the Unity code, calling this a "win" is laughable, and maybe it signals the same low expectations that let you think Android hasn't had a material impact on the day-to-day dev experience.


> That was Corning, not Meta.

I don't know who manufactures them, but Meta has had a team of optics engineers working on new lenses for years.

> That was Carmack, so technically Meta, but it was also 6+ years ago.

Timewarp was one of many improvements, done by many different people over the years.
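For readers unfamiliar with timewarp: just before scan-out, the already-rendered frame is reprojected to account for head rotation that happened since render time. Real implementations apply a full 3D rotation in a shader; for small yaw deltas it approximates a horizontal image shift, which this toy sketch (my simplification, not Oculus code) computes:

```python
import math

def timewarp_shift_px(render_yaw_rad, display_yaw_rad, fov_deg, width_px):
    """Approximate rotational timewarp as a pixel shift: the head
    rotated (display_yaw - render_yaw) between render and scan-out,
    and each degree of yaw maps to width_px / fov_deg pixels."""
    delta_deg = math.degrees(display_yaw_rad - render_yaw_rad)
    px_per_deg = width_px / fov_deg
    return delta_deg * px_per_deg

# Head turned 1 degree between render and scan-out; 90-degree FOV, 1440 px wide:
shift = timewarp_shift_px(0.0, math.radians(1.0), 90.0, 1440)  # 16 px
```

The trick is that this correction runs at display rate even when the app misses frames, which is why it was such a big deal for perceived latency.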

> That was Valve, not Meta.

Are you referring to the classic Valve room demo which used fiducials plastered on the wall for tracking, or the Valve lighthouse which projects lasers on the walls for tracking?

Whether you put fiducials on the wall and a camera on the headset, or vice versa (like the original Rift did), is not that relevant a distinction. Whatever "inside-out tracking" meant in 2012, today it means you get 6DoF without having to set up a pile of gear in your room.


> > astounding amounts of work on inside-out tracking to make it work within latency and power budgets with sufficient accuracy

> That was Valve, not Meta.

When did Valve make a headset with camera-based tracking instead of setting up lighthouses all around the room? The tracking of theirs that I'm aware of is all about the timing of light flashes sweeping across light sensors; it would be totally useless on Meta's self-contained headsets.
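For anyone unfamiliar, the lighthouse timing works roughly like this: the base station emits a sync flash, then a laser sweeps the room at a known rotation rate, and the time between the flash and the laser hitting a photodiode gives the angle from the base station to that sensor. A toy sketch (the 60 rev/s rotor rate is my assumption for illustration, not Valve's spec):

```python
def sweep_angle_deg(sync_to_hit_s, rotor_hz=60.0):
    """Angle to a photodiode: the rotor covers 360 * rotor_hz degrees
    per second, so elapsed time since the sync flash maps directly
    to an angle. Two sweep axes + multiple sensors give a full pose."""
    return sync_to_hit_s * 360.0 * rotor_hz

# Laser hits the sensor 2.5 ms after the sync flash:
angle = sweep_angle_deg(0.0025)  # 54 degrees
```

Which also shows why it's useless for a self-contained headset: the angles are meaningless without base stations bolted to the room.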

> > foveated rendering

> That was a waste of time.

If you're trying to sustain non-barf-inducing framerates on a mobile processor this seems useful to me?


Have you ever tried foveated rendering? I find it to be significantly more uncomfortable than low framerates.


I have a Quest 1, so yes, games that use it look terrible outside of the center area (including text being unreadable). If you want to look at something you have to turn your head toward it, since there's no eye tracking to move the high-detail area around. It's not great.

But the alternative is the games wouldn't sustain acceptable framerates, so I'll take that tradeoff. I'm not saying it looks good, but it's running on a Snapdragon 835 so you do what you have to.
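For context, fixed foveated rendering on these headsets just shades peripheral screen tiles at a reduced fragment density based on distance from the lens center, with no eye tracking involved. A toy sketch of that ring-based falloff (the thresholds and rates here are made up for illustration, not Meta's actual FFR curves):

```python
def fragment_density(dist_from_center):
    """Shading-rate multiplier per screen tile: full resolution near
    the lens center, progressively coarser toward the edges.
    dist_from_center is normalized: 0.0 = center, 1.0 = corner."""
    if dist_from_center < 0.4:
        return 1.0   # fovea region: full resolution
    if dist_from_center < 0.7:
        return 0.5   # mid periphery: half rate per axis
    return 0.25      # far periphery: quarter rate per axis

densities = [fragment_density(d) for d in (0.1, 0.5, 0.9)]
```

Since the rings are fixed to the lens rather than the eye, the tradeoff is exactly the one described above: the savings are real, but look away from center and you see the coarse tiles.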


I worked at Oculus from 2012-2019 on every headset from Rift DK1 to Quest 2, and this statement could not be more wrong.

> That was Corning, not Meta.

You need more than a glass maker to build novel optical designs; besides which, none of the headsets, at least within that time window, used any material from Corning. The lenses in DK1 and DK2 were essentially straight copies of an off-the-shelf magnifier. The first consumer Rift used a Fresnel lens design that was re-optimized internally from a design that started at Valve. The design was just the tip of the iceberg, though: a colossal amount of engineering work went into manufacturability, especially around coatings. Rift S, Quest, and Quest 2 all used an evolution of that design that incorporated further advancements made internally by both the product and research teams. One general note on all of this: "lens design" isn't a matter of drawing some curves and throwing them out to a factory.


Meta does a ton of ongoing work to optimize the stack in the headset. I was just reading about Application SpaceWarp (https://developer.oculus.com/blog/introducing-application-sp...) earlier today and it's impressive.
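The idea, as I understand it from that post: the app renders at half rate and submits per-pixel motion vectors, and the runtime synthesizes the in-between frames by advecting the previous frame along those vectors. A toy 1D sketch of that extrapolation (my own simplification using a backward-gather, not the actual algorithm, which reprojects in 2D with depth and app-provided vectors):

```python
def extrapolate_frame(frame, motion_px):
    """Synthesize the next frame from the previous one: each output
    pixel pulls from the previous frame, offset by its motion vector
    (in pixels per frame). Out-of-range lookups fall back to 0."""
    n = len(frame)
    return [frame[i - motion_px[i]] if 0 <= i - motion_px[i] < n else 0
            for i in range(n)]

# A bright pixel moving right by one position per frame:
nxt = extrapolate_frame([0, 9, 0, 0], [0, 1, 1, 0])  # -> [0, 0, 9, 0]
```

The payoff is the same as timewarp's: the app only pays for half the frames, and the runtime keeps the display fed at full rate.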



