Engineers produce a fisheye lens that’s completely flat (news.mit.edu)
143 points by chmaynard on Sept 18, 2020 | 38 comments



You'd expect at least a passing reference to Fresnel lenses in a piece that talks about flat lenses.

https://en.wikipedia.org/wiki/Fresnel_lens


Here is a video explaining the difference between a Fresnel lens and a photon sieve: "Photon sieves used to visualise optical wave front propagation"

https://www.youtube.com/watch?v=TshYfYIxR9E


I agree, this lens looks like a Fresnel lens taken to microscopic extremes:

> the new fisheye lens consists of a single flat, millimeter-thin piece of glass covered on one side with tiny structures that precisely scatter incoming light to produce panoramic images


It’s not. The mechanism by which it acts is quite different. The lens is a metasurface lens, which has small structures that cause the phases of light to constructively or destructively interfere. By carefully picking how the light interferes, one can form an image on the other side of the lens. This differs from a Fresnel lens (or other classical lenses), which essentially forms an image by having rays of light that emerge from one spot on one side of the lens converge to another spot on the other side.

(In particular, wave theory is not needed to predict the behavior of classical lenses.)
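
To make the "carefully picking how light interferes" idea concrete, here's a minimal sketch, assuming the textbook hyperbolic phase profile for an ideal flat focusing lens (the article doesn't give the paper's actual fisheye profile, and the wavelength and focal length below are made up):

    import numpy as np

    # Sketch only: the phase delay a flat focusing metasurface must impose at
    # radius r so that light from every point of the aperture arrives at the
    # focus in phase. Each "meta-atom" is then sized to apply phi(r) locally.
    wavelength = 5.2e-6   # assumed mid-IR wavelength in metres (illustrative)
    focal_len = 0.5e-3    # assumed focal length in metres (illustrative)

    def metalens_phase(r):
        return -(2 * np.pi / wavelength) * (np.sqrt(r**2 + focal_len**2) - focal_len)

    r = np.linspace(0, 0.5e-3, 5)
    print(np.mod(metalens_phase(r), 2 * np.pi))  # phase only matters modulo 2*pi

A classical lens gets the equivalent delay from glass thickness; the metasurface gets it from subwavelength structure, which is why the wave picture is needed.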


> [...] which has small structures that cause the phases of light to constructively or destructively interfere

This is slightly unclear. I meant more specifically:

[...] which has small structures that change the phase of the light, which, in turn, causes it to constructively or destructively interfere [...]


Well, a Fresnel lens (or any lens) also just changes the phase of the light, which in turn causes it...

But I will read the paper later.


Yeah, I should have said "subwavelength" structures, since I really didn't specify what "small" meant, but thought it was getting too technical.


>> this lens looks like a Fresnel lens taken to microscopic extremes:

Except I don't think it is. It's probably more like a diffraction grating or a hologram. You can actually make a flat lens by recording a hologram of a real lens; the result is flat.


That sounds like it would make it ridiculously easy to make VR headsets that are as thin as the prop glasses used in Westworld.

What are the downsides?


Wavelength specificity. You only get the original focal length at the wavelength you used to make the hologram, and the focal length scales inversely with the wavelength. When you consider that blue light is about 2/3 the wavelength of red light, you can see that this is chromatic aberration's big brother.
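
Back-of-the-envelope, assuming the simple zone-plate/hologram relation f(lambda) = f0 * lambda0 / lambda (numbers made up, not from the article):

    # Illustrative only: focal length of a fixed diffractive lens vs. wavelength,
    # assuming it was "recorded" at 532 nm with a 50 mm focal length.
    f0_mm, lambda0 = 50.0, 532e-9
    for name, lam in [("red", 650e-9), ("green", 532e-9), ("blue", 450e-9)]:
        print(name, round(f0_mm * lambda0 / lam, 1), "mm")
    # red ~40.9 mm, green 50.0 mm, blue ~59.1 mm -- a huge spread compared
    # with the chromatic aberration of an ordinary refractive lens.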


Diffractive elements are very wavelength dependent, so something like this would only work for monochromatic light (like a laser), not for multi-wavelength light. For that you would need some kind of color diversity.


You can use the Fresnel lens in an old projection TV as a laser.

https://www.youtube.com/watch?v=OmlXR2nsis4


Laser stands for light amplification by stimulated emission of radiation. A Fresnel lens doesn't amplify light; it focuses it. Nothing is stimulated; it's just a passive component.


This is true, but I don't think many people think of laser as anything other than "dangerous light." That's how I used it. Colloquial meanings can be frustrating when they clash with the technical meaning, but trying to fight them is typically futile.

https://www.merriam-webster.com/dictionary/laser

>> "2 : something resembling a laser beam in accuracy, speed, or intensity "

I gave up on fighting "CPU means the entire computer" a long time ago.


> This is true, but I don't think many people think of laser as anything other than "dangerous light."

"Many people" have taken first year physics, or at least read a wikipedia article, and understand that the defining feature of a laser is its coherence.

But, sure. It's 2020. We live in a post-factual society, words have no meaning and it's impossible to know anything. Why resist?


I suspect "nanoscale structures" means it's going to be highly sensitive to the wavelength of light. I'm interested in seeing what kind of chromatic aberration comes out of this.


I'm glad somebody pointed this out! Metamaterials and diffractive structures are not known for being low dispersion. The article mentions nothing about bandwidth. However, I think the writer assumes that the intended reader knows this already.

Also, in the video you can see spatial ringing, which is highly indicative of coherent light (single wavelength, like in a laser).

This is still a cool discovery and could definitely have a lot of important applications, but it's important to clarify what this is, and what it isn't.


In the video and the top image, the caption says

> 3D artistic illustration of the wide-field-of-view metalens capturing a 180° panorama of MIT’s Killian Court and producing a high-resolution monochromatic flat image.

My guess is that it is only good for monochromatic light.


>My guess is that it is only good for monochromatic light.

Does anyone know how much of a limitation that'd be in practice though? CCDs are monochromatic too, which we deal with using either Bayer filters (usually RGGB IIRC) or, in fancier high-end stuff, a splitter prism leading to 3 separate CCDs. Modern computational photography is also increasingly able to do sensor fusion, even between different cameras. I can see how making use of this would still be an extra challenge in a small form factor, because normally the split/filter happens after the lens, which simplifies things a lot. Having to do 3x multi-sensor fusion with only a metamaterial fisheye would definitely be more effort and bulkier. Seems like it might still be pretty useful though, depending on final cost? Otherwise maybe it'll end up niche, but it's really cool research anyway.
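
As a rough sketch of what that fusion step might look like (purely hypothetical names and shapes, assuming three already-registered monochromatic captures, one per filter or per wavelength-specific lens):

    import numpy as np

    # Hypothetical: three monochromatic sensor reads, one behind each narrowband
    # filter (or each wavelength-specific metalens), stacked into an RGB frame.
    # Assumes the captures are already aligned to the same pixel grid; in a real
    # multi-camera setup the registration is the hard part.
    h, w = 480, 640
    red_capture = np.random.rand(h, w)     # stand-ins for real sensor data
    green_capture = np.random.rand(h, w)
    blue_capture = np.random.rand(h, w)

    rgb = np.stack([red_capture, green_capture, blue_capture], axis=-1)
    print(rgb.shape)  # (480, 640, 3)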


CCDs aren't monochromatic though: they detect a wide band of frequencies; they just can't tell the difference. High dispersion in the lens is a problem because a blue photon and a red photon originating from the same spot on the same object, hitting the same spot on the lens, will be deflected to different spots on the CCD. This causes fringing and fuzziness in the image.
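
A rough geometric estimate of how bad that is for a diffractive lens, assuming the zone-plate scaling f = f0 * lambda0 / lambda and a sensor fixed at the design-wavelength focus (all numbers made up):

    # Illustrative only: blur-spot diameter for red/blue light when the sensor
    # sits at the green (design) focal plane of a diffractive lens of aperture D.
    D_mm, f0_mm, lambda0 = 2.0, 50.0, 532e-9
    sensor_mm = f0_mm                         # sensor placed at the design focus
    for name, lam in [("red", 650e-9), ("blue", 450e-9)]:
        f = f0_mm * lambda0 / lam             # where this colour actually focuses
        blur = D_mm * abs(sensor_mm - f) / f  # similar-triangles blur diameter
        print(name, round(blur, 2), "mm")
    # Fractions of a millimetre of blur on the sensor: that's the fringing.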


Definitely very cool. Metamaterials in general are fascinating to me, and when I was deep into photography, optics was particularly interesting.


Well, you 'only' need 1 octave.


Absolutely. Would be nice to know the bandwidth.


It looks impressive, but given the standard exaggerated claims of university press offices, it's hard to say if it's really groundbreaking.


Yeah. It currently works in infrared (though they say it can be adapted to visible light), and the video from the lab shows images that look quite noisy and blurry, honestly.


Depending on the wavelength of the infrared, even noisy and blurry may be notable, or it may be mediocre for their setup. All the important details are in the paywalled paper, so which type of infrared they calibrated for is unknown.

There's just too much missing from the article to actually judge its technical merit.

Plus, making one for visible light is likely the main challenge, as micro-machining precision optics is not subject to Moore's-law-type scaling. Though I wouldn't put it past Apple to sink billions into it if it helps make camera modules a few mm thinner.


> a few mm thinner

Isn't a typical consumer fisheye lens for an iPhone a couple of cm in depth? Sounds like we're talking more than a few mm here.


At the claimed 180 degree field of view? Yes. Fisheyes that wide have lenses that are not only incredibly thick, but also bulbous enough that protecting the lens is quite difficult.


For an add-on lens, yes. It's probable Apple already has prototype lenses optimized for thinness that are less than a cm thick, since a lot of the bulk is unnecessary if it's directly integrated with the sensor.


Definitely still in the preliminary research phase; Canon won't be releasing a lens using this anytime soon.


There's a YouTube channel doing experiments on this in his garage: https://www.youtube.com/user/huygensoptics


Sounds like this solves a problem in VR too, or at least contributes to solving it, if the lens works for visible light?


360 cams currently use two back-to-back fisheye lenses... It would be great to make them even smaller.


Does sound impressive. Even if it were UV-light-only, military uses seem to abound.

If it works for visible light too, Apple seems like a ripe buyer. It supports a thin enclosure and an obvious camera upgrade, thus a win for them.

It will be interesting to see if it's a gross exaggeration.


> covered on one side with tiny structures

Clickbait. Not "completely flat" at all.


I believe the "flat" was meant in contrast to "curved", as in typical fisheye/ultra-wide-angle lenses.

> the new fisheye lens consists of a single flat, millimeter-thin piece of glass covered on one side with tiny structures that precisely scatter incoming light to produce panoramic images.

At a thickness of a millimeter, I'd say that's flat. The "tiny structures" are in the form of a film, which further qualifies as flat.

> Their new metalens is a single transparent piece made from calcium fluoride with a thin film of lead telluride deposited on one side. The team then used lithographic techniques to carve a pattern of optical structures into the film.

> Each structure, or “meta-atom,” as the team refers to them, is shaped into one of several nanoscale geometries, such as a rectangular or a bone-shaped configuration, that refracts light in a specific way.


You should look at glass (something 99% of people would call "completely flat") under a microscope.

The 3µm-wide structures in TFA deviate <1µm from the surface, so it's even flatter than glass, and your objection sounds rather petty.


"The lens appears completely flat" would be a much better description, because the unflatness of the lens is the very thing that makes it work.



