Lidar: Peek into the Future with iPad Pro (halide.cam)
218 points by 2bluesc on April 16, 2020 | 70 comments



I would be curious to know if this could be used (with a heavy dose of software and a rotating mount) to replace a Matterport scanner for real estate agents. That would be a $2495-$3995 cost for device, a fixed cost of $19-$29 per house walkthrough and a $49-$149/mo hosting fee. Seems like a boatload of cash could be made for a first-to-market product here.


100%. You won't even need any extra hardware to accomplish that.

However, I would encourage anyone going this route to think closely about the incentive structures behind the product they're building, and validate them with real estate agents, photographers, and sellers. It's a lot more complicated than many of us may give it credit for.


[flagged]


It actually sounds like they're saying it's definitely feasible and encouraging people to build the right product. Their comment from 7 months ago indicates this is something that's obviously important to them.

https://news.ycombinator.com/item?id=20753553


It reads more as cautious encouragement to me, but if you are correct, a disclosure would be appropriate.


Great question. In theory, yes, I think it would be possible. But there is a lot of software involved in producing and distributing such scans. Matterport does a great job of packaging and distributing these scans in a way that is easy for real estate agents to produce and share, and for buyers to view. There is probably a great business out there, but it does require a lot of development work.


How common are those? I've looked at a lot of properties and have yet to run across a 3D scan, but it may depend heavily on ___location and property value.


Very common in my area. When we recently put our previous home up for rent, the agent had somebody come in, scan and photograph the house, and it was up the next day with virtual 3D walkthroughs and everything. We kinda couldn't believe they would do that just for a rental.


Side effect - these become documents of record if you damage/modify a rental property.


After one ugly experience with a dirtbag landlord, I started taking my own photos and had the landlord sign/date them before moving in.

Always have your own documentation.


Given how absurdly touched up real estate photos can be, I really don't know if that would hold up in court.


If you haven't seen 3D scans yet, you will soon. Virtual walkthroughs are surging due to COVID.


I live in a large midwestern city. They aren't common by any means, but they definitely exist. I'm sure more expensive properties are more likely to have features like this, but I've seen starter homes in the mid-$100ks with scans.

They really helped me as a buyer, and while I don't know what they cost, I think I'd spring for a couple hundred dollars to have it whenever we sell our house.

I think the biggest problem is that as a seller, I want qualified leads, but I get the feeling realtors just want leads because there's always another house. This misalignment of incentives means that, in my experience, listing agents don't always want to take steps that would limit foot traffic to a property, even if it would only exclude parties that probably wouldn't buy.


Agreed, I can't say I've seen them in use, though I'm not deep into the housing market.

However, in the current climate and the social adjustment afterwards, such services will certainly see an increase in usage, and virtual viewings may well become the norm.


I've seen some. They aren't actually that useful, though, unless your prospective buyer has a VR headset. Even then, a good floorplan has all the info I need.


The longer Apple goes without releasing their AR headset, the weirder it becomes to keep adding AR to the iPads and iPhones as some sort of developer platform test bed.

It's fine for developers, but for end users these handheld AR experiences are so frustratingly poor: frustrating in the sense that the tracking and occlusion tech is getting really good, but poor in that you're experiencing the whole thing through this silly handheld porthole to the world.

It just gets stranger and stranger that every year, for around five years now, Apple has shown off these features, and we nod along knowing we'll never use them but understanding they're only there for developers, like this lidar sensor.


Apple's not going to launch a consumer feature until they think it's insanely great. Until then they need developers to keep working at it.


That's my issue: they keep launching these close-to-useless features because they're being developed for another product.

That's why this feels un-Apple.


Wish they'd taken that approach with Catalina...


Different teams, plus the macOS team at Apple is notorious for releasing buggy and half-baked software.


Are the demos impressive in a way I'm failing to spot? It still seems about on par with others and still has plenty of noise in the pop art chair and kitchen demos. I was hoping this would put the Magic Leap to shame, but it doesn't seem much better.


They're showing off the environment scanning capability. The visualisation system is just something they knocked together quickly with some basic APIs, so it would be a bit much to expect Magic Leap equivalence.


Hmm, I suppose you're right that the mesh generation system could be hiding some of the capability of the lidar. I expected the floating, mid-air meshes seen in the kitchen demo to go away, but it could be a software issue.


I think the meshes are what the LIDAR API provides to the application.
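
That lines up with how ARKit 3.5 exposes it: apps opt into scene reconstruction and receive mesh anchors rather than a raw depth stream. A minimal sketch of that flow (assuming a standard ARSession delegate setup):

    import ARKit

    final class MeshReceiver: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            // Scene reconstruction is only supported on lidar-equipped devices.
            guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }
            let config = ARWorldTrackingConfiguration()
            config.sceneReconstruction = .mesh
            session.delegate = self
            session.run(config)
        }

        // ARKit delivers the environment as ARMeshAnchor chunks, not raw depth.
        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let mesh as ARMeshAnchor in anchors {
                let geometry = mesh.geometry
                print("mesh chunk: \(geometry.vertices.count) vertices, \(geometry.faces.count) faces")
            }
        }
    }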


Yup, flash lidar has some ways to go before it can become usable, in my opinion. I don't think there's much hope. I shouldn't say more!

Instead of using the time to determine the distance as typical lidar does, it uses the phase difference, and requires a precise phase measurement. This unfortunately is not nearly as accurate, especially with such a low-powered laser.


> Instead of using the time to determine the distance as typical lidar does, it uses the phase difference, and requires a precise phase measurement. This unfortunately is not nearly as accurate, especially with such a low-powered laser.

To be honest, I'm a little curious about this because it seems exactly the opposite to me. In my experience, area scanners use phase shift methods to measure, but I know there's a lot of variety out there that I'm not exposed to. Low-powered lasers would, I expect, limit working distance, but I'm a little surprised to find it would have a significant impact on the measurement method.
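
For anyone wanting to put numbers on the trade-off: in a phase-shift (continuous-wave) system, distance falls out of the phase delay of the modulation envelope, and the modulation frequency caps the unambiguous range. A rough sketch with illustrative values (not actual iPad specs):

    import Foundation

    let c = 299_792_458.0   // speed of light, m/s
    let fMod = 20.0e6       // modulation frequency, Hz (illustrative value)

    // The round trip imprints a phase delay on the modulation envelope;
    // distance is recovered from that delay: d = c * phi / (4 * pi * f).
    func distance(phaseShiftRadians phi: Double) -> Double {
        return (c * phi) / (4 * .pi * fMod)
    }

    // Beyond this range the phase wraps and readings become ambiguous.
    let unambiguousRange = c / (2 * fMod)           // ~7.5 m at 20 MHz

    print(distance(phaseShiftRadians: .pi / 2))     // ~1.87 m
    print(unambiguousRange)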


I knew I recognized that name! Sebastiaan de With, who famously (in my mind) traveled from SF to the north shore of Alaska. I looked that post up the other day, from 2014, because I have a nice camera and a KLR - and I dream. [1]

When I read the iPad Pro announcement, the trifecta of Real Keyboard/iPadOS improvements/LiDAR got my attention. With the pandemic, I have found myself working with my laptop actually in my lap, so I don't know how stable an iPad+keyboard would be. Otherwise, I would get it today to be my next personal workbook.

The opportunities for AR are incredible - that demo of walking around a scan is amazing. Interior design certainly looks like one of the first places that could benefit.

[1] https://imgur.com/a/J7kZJ


Sebastiaan's original post about this album, including his replies to comments and questions: https://www.reddit.com/r/pics/comments/2gejnr/got_divorced_l...


Pretty excited to see what this does to the price of lidar tech. I used to hear (I'd need to validate with more research) that GPS receivers and gyros/IMUs are so cheap because of the need to make them small and production-friendly for so many mobile phones. I think the Pixel 4 has a related depth/motion sensor (Soli radar rather than lidar), so I hope this trend continues.


I think LIDAR could be very cool, but as the article alludes, it definitely has the "chicken and egg" problem: you need the hardware so developers can develop, and you need the software to justify having the hardware. Sounds like even the API isn't on tap quite yet.

It would definitely be cool if you could, e.g., walk around a room and scan it into a home re-modelling app. I know the stated aim is more AR, and that's fine, but I can see apps using LIDAR far beyond that, for example a clothing app scanning your body to determine the perfect size.

Usage, though, depends a lot on the level of detail, particularly when using it to scan people instead of surfaces.


>it definitely has the "chicken and egg" problem: you need the hardware so developers can develop, and you need the software to justify having the hardware.

Although it's worth pointing out that the "chicken and egg" problem actually has a real-life answer that applies equally to a lot of scenarios where it's posited. In terms of "which came first," the answer is definitively eggs, with macrolecithal eggs and birds themselves evolving from dinosaurs and other earlier ancestors. Evolution reuses "existing hardware," which then undergoes incremental change, the competitive variants of which are kept.

Similarly, in a developed platform you don't actually need software to justify incremental advances in hardware. The justification for the hardware is all the other already useful stuff it does and has inherited from previous generations. Apple, and other established platform companies, can get away with bootstrapping from the hardware side that way. Just as with other evolutionary processes, sometimes there are even things that just fail to stick (like Force Touch), for better or worse, but even dead ends don't kill the whole platform. I guess it's also another side of a certain degree of dominating market power. Often it's used for ill, but sometimes certain kinds of monopolies and lock-in can also give companies more freedom to experiment and look for longer-term gains, even if it's not justified in the next few quarters.

At any rate, work got a few of these, and it wasn't for LIDAR at all; the value wouldn't change if it never did anything. Except for an odd network issue I'm working through, they're nice upgrades from previous iPad Pros so far. But from this demo, LIDAR certainly could be put to real use in some architectural brainstorming, for example, so I'm happy to see it there.


I wonder if this is a reaction to Soli from Google. The use case isn't the same, but it's related. It also has a real chicken-and-egg problem. Typically they'd preview this for some developers to have at least demo apps available at launch, but idk what they're cooking up here.


I think LIDAR in the iPad/iPhone is going to be as useful as the touchbar on Mac laptops.


Ok, this is tangential, but there are a lot of people in this thread who seem to have good knowledge of lidar:

How close can you get to creating a decent topographical survey of your property with what's available to consumers? Could you walk around with an iPad Pro and build a decent 3D model? Is there a not-exorbitant drone solution?


In the geosciences there has been a bit of research in the past 10 years using Structure from Motion techniques on images collected from normal cameras suspended from balloons, if a drone isn't available.

Both of these papers discuss making 3D models of topography on the scale of a house and yard. You can also probably get colors mapped onto the surfaces.

(Note that I believe both use AgiSoft structure from motion software, which is probably expensive. I have no idea what the FOSS options are.)

[1]: https://pubs.geoscienceworld.org/gsa/geosphere/article/10/5/...

[2]: https://www.researchgate.net/profile/Steven_Micklethwaite/pu...


I work at a company that does lidar SLAM. You can actually produce really high-quality maps/SLAM with lidar/ToF sensors, and it's a lot more robust/dense than visual/IMU mapping.


How does the price compare to traditional survey methods?


Depends on what you're comparing to what. Big spinning lidars on cars are still expensive (a few thousand dollars) but are coming down in price. Handheld 3D scanners, on the other hand, might start to become obsoleted by a combination of cheap phone cameras + lidar (really ToF) sensors that are evidently cheap enough to put on phones. They can actually produce really high-quality (SLAM) maps; Apple has figured out that they can have a much faster initial mapping phase by not having to do monocular mapping for their AR. So I guess I'm saying it's cheaper and a bit worse, but it's rapidly getting better.
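
To give a flavor of what the front end of such a system does: the core of scan matching is solving for the rigid transform that best aligns one scan to another. A toy 2D sketch of the closed-form (Kabsch-style) alignment step, assuming point correspondences are already known (real ICP alternates this with nearest-neighbor matching):

    import Foundation

    struct Point { var x, y: Double }

    // Given corresponding points (source[i] <-> target[i]), find the rotation
    // angle and translation minimizing the sum of squared distances.
    func align(source: [Point], target: [Point]) -> (theta: Double, tx: Double, ty: Double) {
        let n = Double(source.count)
        let cx = source.reduce(0) { $0 + $1.x } / n   // source centroid
        let cy = source.reduce(0) { $0 + $1.y } / n
        let dx = target.reduce(0) { $0 + $1.x } / n   // target centroid
        let dy = target.reduce(0) { $0 + $1.y } / n

        // Cross-covariance terms of the centered point sets.
        var sxx = 0.0, syy = 0.0, sxy = 0.0, syx = 0.0
        for (p, q) in zip(source, target) {
            let px = p.x - cx, py = p.y - cy
            let qx = q.x - dx, qy = q.y - dy
            sxx += px * qx; syy += py * qy
            sxy += px * qy; syx += py * qx
        }
        // Optimal rotation angle in closed form.
        let theta = atan2(sxy - syx, sxx + syy)
        // Translation maps the rotated source centroid onto the target centroid.
        let tx = dx - (cx * cos(theta) - cy * sin(theta))
        let ty = dy - (cx * sin(theta) + cy * cos(theta))
        return (theta, tx, ty)
    }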


Has anyone seen specs for the lidar sensor? I'm coming from a GIS background (geography), where lidar scanning is very often used for ground measurements, and I would like to compare specs with the gear I know. I couldn't find any on Apple's website, and those "tech blogs" don't tell much either.


Wonder how well it would work as night vision?

Also, my biggest wish for this tech is to be able to use it with a 3D printer when you need to make a replacement part or some sort of shim.


One of the "real" applications of lidar, aerial photogrammetry, is mapping terrain from the sky. It is very often done at night, as there is very little traffic. You should be fine measuring objects in the dark with this iPad lidar, as far as I can tell.

About 3D printing - the iPad lidar seems to have a _very_ low resolution, so for that use case you should wait for the next generations of micro-lidars.


I just got one of these, and it's very strange to me that Apple did not include any apps on the iPad itself that demonstrate the lidar hardware. They go to all of the trouble of putting it in there, and then when you turn it on, it's as if it doesn't even exist. Why not just include an app that lets me play with the data stream?


As a previous comment called it: a chicken-and-egg situation. But at least we have the chicken, so the eggs will come.

Though having nothing built in that uses this tech in any way does seem a bit surprising. Now that the HW is there, they will certainly have the software eventually.

The cynic in me says that Apple will wait for some wonder app and then buy or copy it. I say cynic as there have been a few instances of that transpiring previously with Apple - https://www.telegraph.co.uk/technology/apple/8565673/Apple-a...


I mean, the whole reason I bought it was to use it to make neat apps, so I guess they figured that's what everyone would do. But even with something like the Kinect, Microsoft provided a suite of apps that at least demoed the APIs and allowed you to look at the raw sensor data. Nothing like that is even available on the App Store, as far as I can tell. Apple even demoed a number of apps using it (some anatomy app, a home scanning app, some AR game with lava), but you can't even download those.


Honestly, if there is nowt there atm, then knock up a basic app to show what it can do, even something very simple - pop it on the App Store for something like a dollar or less and lap up the sales from the curious like everybody else.

As you have the device and plan on doing app stuff, I'd go for that gold rush.


Isn't there an RF communication widget in the iPhone 11 too that sounds really promising for positioning and whatnot, but still isn't used or marketed?


There is. They also didn't put it in the new SE for some reason. The thought is that it might be used for a future "AirTags" product that you can stick in your wallet or on your keychain, with your iPhone helping you track it.


Something tells me that one has to wait for those now already mythologised Apple VR glasses.


The Measure app has special features that use the lidar sensor. It will tell you how tall someone is if you point it at them, and it will also resolve to a ruler when you get close to a measurement.


Lidar tech sounds really nice. But as someone who is still trying to figure stuff out: what is the difference between a time-of-flight sensor and a lidar sensor, and what are the advantages of each?


Lidar is a time-of-flight sensor. "Time of flight" is the name for the method in which they time the pulses to sense depth. In practice, when someone says "time-of-flight camera", they're referring to a broader class of devices that use similar methods. Lidar usually refers to a specific class of time-of-flight sensors.
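
To make that concrete, a pulsed (direct) time-of-flight sensor times the round trip of a light pulse, which demands very fine timing resolution. A quick sketch of the arithmetic:

    import Foundation

    let c = 299_792_458.0   // speed of light, m/s

    // Direct ToF: distance is half the round-trip time times the speed of light.
    func distance(roundTripSeconds t: Double) -> Double {
        return c * t / 2
    }

    // A target 3 m away returns the pulse in about 20 nanoseconds...
    let roundTrip = 2 * 3.0 / c        // ~2.0e-8 s
    // ...and resolving 1 cm of depth means timing to roughly 67 picoseconds.
    let timingFor1cm = 2 * 0.01 / c    // ~6.7e-11 s

    print(distance(roundTripSeconds: roundTrip), timingFor1cm)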


Thank you so much. Makes total sense now.


There are a great many lidars around; ToF sensors are the entry-level category, usually with short range.

Serious sensors usually use continuous-wave operation and some modulation scheme.


If anyone from Halide is reading this there's a slight typo in the copy:

> This is the first time a new imagining capture technology has appeared on iPad before iPhone.

I believe that should read "... imaging ..."


You can let them know directly: [email protected]


If this is using a projected pattern, then it's not lidar; it's probably a form of projected stereo, i.e., not a true time-of-flight measurement.


This is a laser based ToF sensor, not pattern projection.


My mistake. Thanks for the clarification.


Nice article. Reading about Lidar’s capability when paired with a mobile device makes Google Pixel’s AR sensing seem obsolete overnight.


... the name LIDAR is a play on RADAR, but instead of using dangerous RAdio waves, it uses infrared light. hmmm...


For people downvoting this, this is actually a quote from the article. I'm honestly curious if this was serious, or a joke that went over my head.


I've had a play around with the new iPad. The big problem is (as far as I can see) that there's absolutely no way for a developer to get access to the underlying depth data.

I'm assuming it's coming in the next version of iOS, because you _can_ get access to the Face ID depth data in a useful format.


Is it not available through `AVCaptureDepthDataOutput`? My understanding is that depth data is a separate channel stored in photos.


Maybe I screwed it up; I'm not the best developer ever. I took their sample code to extract the depth data from the front camera and this worked - switching to .back caused it to return a nil device, so I didn't really know where to go after that.
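
For reference, the front-camera path that did work looks roughly like this (a minimal sketch using the standard AVFoundation depth APIs; at the time there was no rear lidar depth device type exposed through this API, which would explain the nil):

    import AVFoundation

    final class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
        let session = AVCaptureSession()
        let depthOutput = AVCaptureDepthDataOutput()

        func start() throws {
            // Only the front TrueDepth module exposes depth here; asking for a
            // .back depth device on this iPad returns nil.
            guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                       for: .video,
                                                       position: .front) else { return }
            session.beginConfiguration()
            session.addInput(try AVCaptureDeviceInput(device: device))
            session.addOutput(depthOutput)
            depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
            session.commitConfiguration()
            session.startRunning()
        }

        func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                             didOutput depthData: AVDepthData,
                             timestamp: CMTime,
                             connection: AVCaptureConnection) {
            // depthData.depthDataMap is a CVPixelBuffer of per-pixel depth.
            print(CVPixelBufferGetWidth(depthData.depthDataMap))
        }
    }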


Unfortunately, for someone like me who's not enthusiastic about photography at all and takes maybe a dozen photos a year, every time I buy a new iPhone, or in this case an iPad Pro, I feel like I'm wasting hundreds of bucks on a camera I'm not really gonna take advantage of.


If they built a version without cameras for the small subset of people who don't even want a camera for FaceTime/video conferencing, chances are it would be more expensive than what you paid now. Apple would have to make separate backplanes, boxes, marketing material, etc., and would have to create a separate iOS release (they couldn't ship a camera app with the system, for example, and would have to change the UI that allows picking a photo from either storage or camera).

I guess it also would break many apps. I don’t think there ever was an iOS device without a camera, so nobody is checking for its presence.

For example, your banking app that wants you to take a photo of a QR code for 2FA would break.


There’s a huge chasm between no camera and pouring huge amounts of R&D into shipping a cutting edge camera every year and marketing entirely around that. I’m not sure how you jumped from my post to “a version without cameras”, as I did say I take a nonzero number of photos (mostly utility shots). One hell of a straw man.


They could delete the camera and related sensors and it would not appreciably affect the price of the tablet. The camera is probably something like 5 bucks in quantity, and I doubt the "LIDAR" sensor is particularly expensive either.


Actually, that's not true: some quick searching reveals iPhone cameras cost $30-50, generally about 10% of the total BOM. Which means that for a ~$1,000 tablet, it would be ~$900 if it didn't have cameras.

So it's not doubling it or anything... but it's not nothing.

(On the other hand, the vastly lower-quality webcam cameras built into Macbooks are probably closer to the $5 you're suggesting.)


The iPad cameras in this article are significantly lower-quality than iPhone cameras, other than the LiDAR sensor. So it's definitely cheaper than the larger sensors and optics in new iPhones.


The cost of something isn't just the parts; you're forgetting all the R&D, labor, and premium. The iPhone 11 and 11 Pro differ by $300, and the better camera is one of the two (?) major selling points; not to mention the iPhone 11 has a pretty good camera too.



