> For personal/entertainment use it completely replaces the need for a TV, soundbar, or home cinema.
You're making a bold assumption, that someone wants to wear this headset when relaxing. Also, a TV can be watched by multiple people, and a home cinema will obviously deliver better sound.
> For business use, the days of multiple displays and screen management seem set to be a relic of the past. I look forward to coding in an IDE which isn't constrained to a physical device sat on my desk, or replying to emails "on the beach" versus under fluorescent lighting.
The IDE is an interesting perspective that I, as a developer, am also thinking about. But there's a reason you can be as productive in 2023 as in 1983 using Emacs or Vim. Because it's insanely hard to replace the simplicity of text buffers and a keyboard.
> In response to the obvious criticisms (high price, battery life, form factor)... this is Gen 1. Look how quickly the iPhone and Apple Watch evolved between generations 1 to 3, and look how the price changed as production capabilities and economies of scale evolved.
Smart watches have been anything but a groundbreaking technological revolution.
> Also, a TV can be watched by multiple people, and a home cinema will obviously deliver better sound.
I was extremely surprised that shared reality was completely absent from the presentation. Apparently the sensors on these devices don't enable creating a coordinate system that multiple devices can collaborate on/in. You can't look at the same objects in space together.
This is hard stuff, but I'm stunned they're shipping it before solving that problem.
You'd think it would, for exactly that reason. But you'd also think that if it were supported, they'd mention it, if only as a defense against the "only for friendless nerds who live alone" criticism, albeit at an absurd price point.
I take your point. I only saw it from a very limited technical understanding of how ARKit works and how shared experiences can be achieved at the framework level.
Not explicitly mentioning shared experiences other than video calls at all could also indicate this is not the way it should be framed (by focusing on the collaborative aspects that exist "today").
The price point sure prevents me and my family members from casually trying this experience.
I was merely suggesting that the technology for shared experiences already exists in the form of shared anchors.
You could be right that "if they didn't mention it explicitly, it's not part of their (currently) intended experience", but it might just as well be that "spatial computing" is something that will primarily be shaped by its adopters along the way, which is different from a corporation plotting out the experience up front (as might be the case with the metaverse?).
We're really missing the point here. Yes, this device can't do that yet; you can't use it with your friends together. But similar shared experiences already exist right now, ones you can just jump into in 2 or 4 minutes. Why bother with this new tech otherwise? Doing things together is good, and I think that's the main selling point of these devices.
> I'm stunned they're shipping it before solving that problem
Are you really? Outside of everybody sitting on the couch watching a movie together, which will be an extremely marginal use case for this thing anyway—are you seriously going to buy all of (spouse, kids, friends) their own $3500 headset?—shared-reality seems very niche for consumer applications, which are clearly what they're targeting.
I think without shared reality in place, the public verdict on this device will be that it's a loneliness enabler, or has you wear your loneliness on your face. Or rather, on a screen strapped to your face. It's going to be undesirable, the most damning quality of any consumer item. Nobody will envy their peers for having one.
People say this about smartphones, too, and yet adoption is practically universal. If the product is worth using, people will use it, and the social friction will fade. The reason products like Google Glass never moved beyond pariah status is that they weren't really worth using, so they were only ever used by "tech bros" who were already cultural pariahs, and who in so using outed themselves as such.
Besides which, nobody is looking into my home and calling my various screens "loneliness enablers". Not that I would give a shit if they were, though I might invest in some blinds or drapes.
> The glasses are extremely expensive and they are not replacing anything.
Well, they're essentially pitched as a replacement for laptops, tablets, and for some users TVs too. No product category goes from zero to full adoption in a day (look how long it took for laptops!) but saying this headset isn't pitched as a computer replacement is flat out wrong.
> Someone with an ipad still needs the glasses and the other way around.
Why? You're losing the drawing tablet functionality, which I assume most iPad owners don't use, and what else?
> I'm saying the "Apple goggles" can so easily fall victim to this perception because those qualities are so front-and-center with it.
And I'm saying nobody will ultimately give a crap if the tech works as well as Apple wants it to. Our social spaces have been utterly transformed by screens and networked technology in the last few decades, and while there is always some pushback, progress marches on for better or worse.
Especially with as much emphasis as they put on SharePlay in the iPhone presentation. Quite a neat feature. For the few households that will splurge $14,000 for a family of 4 to watch movies together once a month, I'd hope it would have this feature!
It's not the sensors. Meta headsets can do this with much worse sensors by using shared anchors, which as someone else mentioned is already a feature in ARKit. Why they didn't mention this or integrate it into the OS I don't know.
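For context, the shared-anchors feature referred to here is ARKit's collaborative session support. Below is a minimal sketch of how it is typically wired up, assuming the real `isCollaborationEnabled` and `didOutputCollaborationData` APIs; the `sendToPeers` helper and the choice of transport are hypothetical placeholders for whatever networking layer (e.g. Multipeer Connectivity) an app supplies:

```swift
import ARKit

// Sketch of an ARKit collaborative session ("shared anchors").
// Each device streams world-map data to its peers; ARKit then
// resolves anchors into a coordinate space the devices share.
class SharedSessionCoordinator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.isCollaborationEnabled = true  // opt in to collaboration data
        session.delegate = self
        session.run(config)
    }

    // ARKit periodically hands us data that must be forwarded to peers.
    func session(_ session: ARSession,
                 didOutputCollaborationData data: ARSession.CollaborationData) {
        if let encoded = try? NSKeyedArchiver.archivedData(
                withRootObject: data, requiringSecureCoding: true) {
            sendToPeers(encoded)  // placeholder: your networking layer
        }
    }

    // Data received from a peer is fed back into the local session.
    func receive(_ encoded: Data) {
        if let data = try? NSKeyedUnarchiver.unarchivedObject(
                ofClass: ARSession.CollaborationData.self, from: encoded) {
            session.update(with: data)
        }
    }

    func sendToPeers(_ data: Data) { /* e.g. MCSession.send(...) */ }
}
```

Once both sessions have exchanged enough map data, an `ARAnchor` added on one device appears on the other in the shared space, which is exactly the primitive a "watch together" feature would need.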
> ... Also, a TV can be watched by multiple people ...
I'll point to the article "Why Americans are lonelier and its effects on our health" [1], which claims that "some surveys reveal that around 60 percent of people in the U.S. right now report feeling lonely on a pretty regular basis. And that's pretty devastating from a public health perspective".
I don't know the real number, but it connects with the market potential. Also, Apple is really great at hitting the mark. Playing with words, I don't think Mark is as good as Apple.
But isn't this VR/AR kit going to contribute even more to the loneliness - contribute and feed itself on the trend?
Despite the positive, upbeat music, it was kind of sad to watch people alone in their sparsely furnished environments without a personal touch: viewing favorite pictures in a helmet instead of printed on the wall, a father with his face hidden behind the helmet during his kid's birthday party, etc.
Probably yes, but we were talking about the success of the product, not of society. That's another topic, where we could also include mobile phones, streaming services, etc.
> a home cinema will obviously deliver better sound.
Will it, though? Of course you could build a home theater with better sound, but I'd bet that the spatial audio built into AirPods delivers better sound than most people's home theater setups (which are generally just a TV with built-in sound or a mediocre soundbar).
I don't think we are at a point in human technology where any noise-cancelling headphones sound better than cheap wired counterparts… they feel amazing thanks to rather deceptive engineering, but only until you go back to standard non-cancelling speakers.
Except the people who have "mediocre soundbars" can't afford to buy a $3.5K VR headset. And those who have a spare $3.5K to enhance their TV-watching experience will invest the money in a better TV and a surround speaker setup.
I want an IMAX viewing experience with booming surround effects and I only have time to watch when the kids are all in bed. Compared with a house large enough to have a dedicated sound-proof theatre room, $3500 doesn’t sound too expensive.
Also not everyone lives in a house, I’m sure Manhattan condo owners can afford the price of the theatre gear, but cannot afford the space required for them to be used optimally. Wealthy people don’t all live in mansions.
> Except those people who have "mediocre soundbars" can't afford buying a $3.5K VR headset.
There are many, many people for whom money is not the limiting factor. Rather, they don't have the space, the technical wherewithal to set it up, or the motivation to make it happen, or some combination of the three.
I think you might be over-extrapolating your own POV. I have a mediocre soundbar and can afford the Vision Pro. I'm not likely to upgrade my soundbar anytime soon (I don't really care), but I'm very likely to buy a Vision Pro.
Can't imagine people will be buying it only for movie watching.
Like I said, "you could build a home theater with better sound." But most people haven't and won't. It requires time, technical expertise, and a lot of space. And with that you only get a home theater. And you can't travel with it (I love the idea of using one on a plane).
But I'm sympathetic to the social-watching issue. I don't love the idea of watching movies in a headset while my wife sits next to me on the sofa doing something else (or even watching the same movie on a screen). But I also don't love the idea of buying two. (And that's without even thinking about larger families.)
I think home theater will be a big part of the appeal, but it won't succeed if replacing a home theater is the only thing it does well.
There's also a physical limit on bass from small head-mounted speakers. Much of bass is felt in the chest as much as in the ear, and the little speakers on the device are limited there. AirPods are the same, of course, and they do okay with sound, but we accept a lot of limitations on portable devices.
Note that 65" is very different from 100" and most people's movie experience at home is far too small relative to the directors' intents.
A Sonos setup with a sub and rear surrounds plus an 85" OLED TV will put you at the price point of this thing.
If you're apart, both people need a room, a TV, and a Sonos system to share the experience. So each option has that "need one per person" problem, depending on whether you're colocated or not.
I'm more excited to use this for games and VRChat. My Valve Index needs my whole gaming desktop to power it which is actually pretty much the price of an Apple Vision Pro.
Home cinema implies (to me anyway) a true surround setup and not some crappy soundbar.
You can’t replicate true surround sound with stereo headphones. You can with binaural audio but that requires specialized recordings. I’m sure spatial audio sounds cool but it’s not true surround sound.
And then there’s the problem of low frequencies. You can’t beat a subwoofer.
The average American or British living room with even a cheap surround system is going to run rings around anything in-ear or on-ear.
There are many reasons people warn newbies not to mix or master on cans and to use speakers instead.
For people who care about going beyond stereo, budget is going to be a much larger problem for most folks than space or technical knowhow. And anybody who cares about going beyond stereo probably cares about quality.
> You're making a bold assumption, that someone wants to wear this headset when relaxing.
Hey - the world's biggest computer company just went up on stage along with the director of the world's biggest entertainment conglomerate and made that 'bold assumption'. They're probably pretty careful about these sorts of things.
It takes very little for Bob Iger to say he will make Disney+ available on the Vision Pro. It takes very little to deliver a streaming platform to a new device in general, but even less for one that uses the same frameworks as one of your primary existing devices. Most of what they showed was just showing you Disney+ content on a floating screen. I highly doubt they have invested that much into any sort of experience that is only possible on the Vision Pro (hence limiting anything that came close to that as a generic vaporware "What if?" trailer at the end).
With respect to the CEO of that company, I mean, sure. But you kind of take that as a given. It's not like he's only been right, and certainly his leadership so far has been business oriented, vs. "wave of the future" oriented. A good example is how the AirPods ended up being an arguably bigger success than the Watch (and how that hasn't really been fully capitalized on). The good news is that the world's biggest company is precisely the kind of place that can afford to iterate on something like this in the public. So if the theory is that the "dream" of AR is only possible by getting stuff out there to iterate on, then they certainly now have a good shot.
Hey, that’s a great point. Mine was more along the lines that they probably did not go up there on mere assumptions about what users might want. They’re not kids, they’re professionals on the tail-end and apex point of their career. Probably had an army of people do the homework to make sure that they don’t end up looking like complete fools a few years down the line.
Hasn’t there also been some executive overlap on the board level of these companies for a long time?
I was just watching a Steve Jobs keynote from 1998 the other day. And you can see exactly the same strategies implemented there, only with profit margins at <$100M and an inverse David-and-Goliath relationship with delegates from industry partners.
There are MANY examples of The Walt Disney Company making poor decisions. The most recent one would have to be the "Star Wars Hotel" that cost $1,200/night PER PERSON. In what world can enough Americans afford to fill up a hotel every night at that price? They did what all companies do: they got greedy. Now they have a $300M write-off as they tear it down.
DIS stock is taking a dump right now because Disney+, it turns out, isn't the savior we all thought it was (and were led to believe it was) during the pandemic when the Parks division wasn't bringing in the cash. ESPN is dead weight. They have more debt than ever thanks to the pandemic.
I did not say they’re making the right decision. I merely pointed out that the parent comment poked at this being a bold assumption.
They might be wrong, they might be making a bad play. But they’ve also probably devoted a reasonable amount of resources at finding answers to questions like whether people will want to use these or not. So, they probably didn’t make “bold assumptions”.
It's a question of semantics I suppose. So to me, it was a bold assumption on Disney's part to assume, regardless of what the data/research/surveys told them, that A) there were enough people on the planet who would travel to Orlando, FL to stay at this Star Wars-themed resort for over $1000/person/night, and B) there were enough people on the planet who would travel to Orlando, FL again and again to stay at this Star Wars-themed resort for over $1000/person/night.
It would have surprised me if people interested enough in the hotel had skipped the first few months after its opening and only booked some time thereafter. In that respect, Disney seems to have recognized that once that initial high demand drops off, it's GG: demand wouldn't magically (ha) go from ~50% occupancy to ~90-100% with no change to the hotel or the pricing (i.e., any factor external to the resort itself).
>Apple has produced plenty of devices that didn't pan out.
Have they? They've launched particular versions of existing products that didn't sell too well, but have they ever launched a device that was fundamentally new and didn't eventually sell a ton of units?
To be fair, that might as well be ancient history. I imagine more than 99% of Apple's value comes from post-iPod activity, and I can't think of anything post-OS X that failed, let alone post-iPod.
I don’t get this HomePod hate. I have the big (old gen) and small one, and they rock.
We're in allocated housing at the moment for my fiancée's work, and they have a shitty TV. I wouldn't be able to hear the thing if it wasn't playing through my HomePod. And that's before the benefits of it as a speaker.
Pure anecdata of course and I'm far from the average person, but I personally do not like to wear headphones if I can avoid it. (And yes, I have good headphones.) I can't even imagine having a screen strapped to my face.
I can imagine having a screen strapped to my face. I can't imagine a killer app that makes it worth the trouble & cost. I was hoping apple could help me out with my limited imagination, but they're pushing Apple Vision for watching movies, surfing the internet and facetime, so not really.
> You're making a bold assumption, that someone wants to wear this headset when relaxing.
Raises hand. I'm in for that. I'm a VR fan but my soapbox has always been that AR is the true future.
> Also, a TV can be watched by multiple people, and a home cinema will obviously deliver better sound.
In the same way that when the iPhone came out there were individual devices that could do each feature better than the iPhone could, yes :)
If you look at video consumption, "individual" devices (phones, tablets, laptops) make up about 50% of viewing time. TV the rest. I don't think the multiple people angle is going to kill this considering how much content is consumed individually already.
A home cinema also has to be researched, purchased separately, takes up space, etc. Any pair of $200 Bose headphones sounds better than the old iPhone ear pods...and yet...
> it's insanely hard to replace the simplicity of text buffers and a keyboard.
We're talking about replacing monitors, not text buffers or keyboards!
> it's insanely hard to replace the simplicity of text buffers and a keyboard
Totally agree. I just want to use this to replace my big bulky monitor that I can't take with me wherever I go and that makes my small place look a little more junky.
> You're making a bold assumption, that someone wants to wear this headset when relaxing.
Well Apple showed someone using the headset while lying down in bed. I’d say that Apple is making a bold statement about the comfort of their product. We’ll need to wait for hands on reviews to determine if it indeed is as comfortable as Apple implies it is.
> Well Apple showed someone using the headset while lying down in bed.
The box of Wheaties showed someone shooting the winning buzzer beating home run touch down in double overtime to win the world series of superbowl cups. Somehow I doubt that, due to the bowl of Wheaties I had for breakfast, my afternoon will look much like that.
> You're making a bold assumption, that someone wants to wear this headset when relaxing.
I've seen people pass out in VRChat with their headsets on. Some people on VRC are on there for 12+ hours a day. It's a fascinating subculture I was totally shocked to learn about. People drink and do drugs while listening to music at a virtual rave. Multiple rooms fill up every Friday and Saturday, with 80-120 people in a room hanging out. I found myself up till 6 am lost in the music.
I used to do exactly that when the pandemic started, but like, is that enough for a $3k headset? I paid $2k for my setup but I was already heavily committed to various simulation game genres.
120 people per room is not enough to sustain an entire headset division, and since the main reason most people don't do that is that they don't really enjoy having the headset on, I don't know what Apple can do to change that.
By Gen3 in 5 years, people will begin to buy these like iPhones and then multiple people will be able to watch via SharePlay.
3D images are probably coming to iPhone 15 or iPhone 16 so the posts about "who's gonna wear this to take pictures" are already moot.
This is a developer/enthusiast-focused niche release providing perfect beta-testing grounds while the technology shrinks this device to a smaller and more practical form factor.
In 3-4 years, the current Vision Pro will be the standard Apple Vision product with a smaller form factor while a new Pro product will have more advanced features and lose its external battery.
I also think that Apple Vision will be successful, but Gen 1 is not where it's at for the vast majority of users.
Using a tiny battery pack. I'm assuming (hoping) that you'll be able to use something bigger that provides USB-C power and get a correspondingly longer life.
Good catch about needing 4 of them for a family... will they implement multi-user in visionOS, or will you have to buy one for each member, as you're supposed to do with iPads?
I have to wonder about a device that won't allow you to watch a 2-hour movie without running out of battery and having to plug it in. I guess Oppenheimer is a no-go.
Give it a decade, and the demand for immersive escapism will be greater than ever, if anyone can afford it, as western civilization continues its decline/collapse.
> Smart watches have been anything but a groundbreaking technological revolution.
I think Apple has made a tactical error here. The days of the shrinking iPhone are long gone, but not forgotten. It was the iPhone 3G that was a turning point for people who hadn't bought an iPhone yet. It was smaller with better battery life.
If the Apple Watch 3 had followed a similar pattern, they would have had to skip adding the next sensor to the device, but I think in the long term that would only have delayed us one design cycle while still giving us a thinner and lighter watch, which we would have needed for a deeper impact.
Apple needs to do 3 things for the Vision Pro to be successful.
1. Convince enough people to buy one via halo use cases
2. Leverage or buy developer adoption
3. Create a decent enough developer experience to produce high quality apps
On 2 & 3, Apple has a proven track record, or at least a history of amassing enough market share to force developers to ignore deficiencies in 3.
Which means 1 is going to be make-or-break.
The Apple Watch is a great analogy here, because it was evolutionary rather than revolutionary.
It did not let you do anything you couldn't before. It did let you do it better.
Consequently, this won't be (and doesn't need to be) an iPhone level smash success. It just needs to be volume and financially self-sufficient enough to get them to iteration N+X.
Because iteration N+X is "We shrink the iPhone down to a minimally-screened compute/network node, and the Vision SE becomes everyone's must-have companion, and then Apple owns a better-than-iPhone platform."
I think Apple made the right move in trumpeting its non-work use cases, because Apple has let macOS atrophy for enterprise use. And priced-for-work is a trap market they don't need to pursue (see: Microsoft).
But I don't know if most people want a better consumptive device $3,500-badly... time will tell.
Regarding #2 and #3, Apple has been working on this for years. ARKit for instance, is hugely gimmicky if not silly on iPhones, and LIDAR on the same had incredibly limited real world utility for that device. Yet for years they've been deploying millions of equipped devices, building it out, expanding the SDKs, doing developer outreach, and so on. They've even built shared AR spaces when the viewport is just a phone, again despite it being pretty goofy and of limited value.
They've been building towards this for years.
I suspect for most apps supporting the Vision Pro will be supporting variable resolutions (for resizable windows) and clicking a checkbox on the targets.
Surely $3500 and "Pro" in the product name implies they think it'll mostly be used for work? I didn't quite understand why they branded it that way given the heavy consumer focus in the demos. It implies that they intended for it to be a consumer device for a long time and got cold feet at the end when they realized they couldn't reduce the price.
For me it could go either way. I am willing to put up with wearing a headset if it unlocks some new use cases. But we really need to know the specs. I haven't seen anyone mention a field of view, refresh rate, pixel per degree, etc. And even with these specs I would need to actually try it to get a holistic feel for the product and its software.
> Because it's insanely hard to replace the simplicity of text buffers and a keyboard.
And why would you think this would seek to replace that, rather than complementing it? Fill up your entire vision with forty text buffers. Use a keyboard on a twelve-inch-deep shelf on one side of your bedroom with no monitor taking up space behind it.