> Also, a TV can be watched by multiple people, and a home cinema will obviously deliver better sound.
I was extremely surprised that shared reality was completely absent from the presentation. Apparently the sensors on these devices don't enable creating a coordinate system that multiple devices can collaborate on/in. You can't look at the same objects in space together.
This is hard stuff, but I'm stunned they're shipping it before solving that problem.
You’d think it would, given all that, but you’d also think that if it were supported they’d mention it, if only as a defense against the “only for friendless nerds who live alone” criticism, albeit at an absurd price point.
Got your point. I only saw it from a very limited technical understanding of how ARKit works and how shared experiences can be achieved on a framework level.
Not explicitly mentioning any shared experiences other than video calls could also indicate that this isn't how they want it framed (i.e., by focusing on the collaborative aspects that exist "today").
The price point sure prevents me and my family members from casually trying this experience.
I was merely suggesting that the technology for shared experiences already exists in the form of shared anchors.
You could be right that "if they didn't mention it explicitly, it's not part of their (currently) intended experience", but it might just as well be that "spatial computing" is something that will primarily be shaped by its adopters along the way, which is different from a corporation plotting out the experience up front (as might be the case with the metaverse?).
We're really missing the point here. Yes, the device can't do that today; you can't share it with your friends. But there are similar experiences in America right now that you can just jump into within a few minutes. So why bother with this kind of thing, this new tech, if it can't do that? Doing things together is good; I think that's the main selling point of these devices.
> I'm stunned they're shipping it before solving that problem
Are you really? Outside of everybody sitting on the couch watching a movie together, which will be an extremely marginal use case for this thing anyway—are you seriously going to buy all of (spouse, kids, friends) their own $3500 headset?—shared-reality seems very niche for consumer applications, which are clearly what they're targeting.
I think without shared reality in place, the public verdict on this device will be that it's a loneliness enabler, or has you wear your loneliness on your face. Or rather, on a screen strapped to your face. It's going to be undesirable, the most damning quality of any consumer item. Nobody will envy their peers for having one.
People say this about smartphones, too, and yet adoption is practically universal. If the product is worth using, people will use it, and the social friction will fade. The reason products like Google Glass never moved beyond pariah status is that they weren't really worth using, so they were only ever used by "tech bros" who were already cultural pariahs, and who in so using outed themselves as such.
Besides which, nobody is looking into my home and calling my various screens "loneliness enablers". Not that I would give a shit if they were, though I might invest in some blinds or drapes.
> The glasses are extremely expensive and they are not replacing anything.
Well, they're essentially pitched as a replacement for laptops, tablets, and for some users TVs too. No product category goes from zero to full adoption in a day (look how long it took for laptops!) but saying this headset isn't pitched as a computer replacement is flat out wrong.
> Someone with an ipad still needs the glasses and the other way around.
Why? You're losing the drawing tablet functionality, which I assume most iPad owners don't use, and what else?
> I'm saying the "Apple goggles" can so easily fall victim this perception because those qualities are so front-and-center with it.
And I'm saying nobody will ultimately give a crap if the tech works as well as Apple wants it to. Our social spaces have been utterly transformed by screens and networked technology in the last few decades, and while there is always some pushback, progress marches on for better or worse.
Especially with as much emphasis as they put on SharePlay in the iPhone presentation. Quite a neat feature. For the few households that will splurge $14,000 for a family of 4 to watch movies together once a month, I'd hope it would have this feature!
It's not the sensors. Meta headsets can do this with much worse sensors by using shared anchors, which as someone else mentioned is already a feature in ARKit. Why they didn't mention this or integrate it into the OS I don't know.
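For what it's worth, the shared-anchors mechanism mentioned above is exposed in ARKit as "collaborative sessions" (iOS 13+): each device opts in, ARKit periodically emits opaque collaboration data, and you relay it to peers over any transport you like so all sessions converge on a shared coordinate system. A minimal sketch, assuming you wire `sendToPeers` up to your own networking layer (MultipeerConnectivity is the usual choice):

```swift
import ARKit

// Minimal sketch of an ARKit collaborative session.
// `sendToPeers` is a hypothetical hook into your networking layer.
class SharedSessionController: NSObject, ARSessionDelegate {
    let session = ARSession()
    var sendToPeers: ((Data) -> Void)?

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.isCollaborationEnabled = true  // opt in to shared anchors
        session.delegate = self
        session.run(config)
    }

    // ARKit calls this when it has mapping data other devices need.
    func session(_ session: ARSession,
                 didOutputCollaborationData data: ARSession.CollaborationData) {
        if let encoded = try? NSKeyedArchiver.archivedData(
                withRootObject: data, requiringSecureCoding: true) {
            sendToPeers?(encoded)
        }
    }

    // Feed data received from a peer back into the local session.
    func receive(_ encoded: Data) {
        if let data = try? NSKeyedUnarchiver.unarchivedObject(
                ofClass: ARSession.CollaborationData.self, from: encoded) {
            session.update(with: data)
        }
    }
}
```

Anchors either participant adds after the sessions merge then show up on every device at the same physical location, which is exactly the "look at the same objects in space together" experience the parent is describing.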