Researchers have developed a nanophotonic light field camera based on the eye of the long-extinct trilobite Dalmanitina socialis, and it delivers an extreme depth of field, spanning from the centimeter scale to the kilometer scale.
The camera uses a multiscale convolutional neural network-based reconstruction algorithm to eliminate optical aberrations introduced by its 'metalens' array. The project is described in excellent detail in a new research paper, 'Trilobite-inspired neural nanophotonic light-field camera with extreme depth-of-field,' which is available in full at Nature.
Dalmanitina socialis has been extinct for several hundred million years. However, investigations of fossil remains have revealed that this trilobite was one of the earliest arthropods to have compound eyes. Many organisms today have compound eyes, including insects and crustaceans, but Dalmanitina socialis had a unique compound eye visual system. It had 'two optically homogeneous lens units of different refractive indices, an upper lens unit with a central bulge made of calcite and a lower lens unit made of an organic compound. As a result, each compound eye of Dalmanitina socialis is able to simultaneously focus incident light to a near and a far point, analogous to a coaxial bifocal lens…' This distinct eye structure allowed the organism to see nearby prey and distant predators at the same time, whereas modern arthropods have single-focal vision systems.
Fig. 1a shows the unique eye structure of Dalmanitina socialis.
Turning to light field cameras: they capture information about a scene's light field, measuring both the intensity of light and the direction in which that light travels. Conventional cameras also record light intensity, but not light direction. The research paper explains: 'Light-field cameras could measure a rich 4D representation of light that encodes color, depth, specularity, transparency, refraction, and occlusion. DoF and spatial resolution are two key system parameters in light-field photography.'
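To make the 'intensity plus direction' idea concrete, a light field is commonly stored as a grid of sub-aperture images, one per viewing direction, and refocusing then becomes a simple shift-and-sum over that grid. The short Python sketch below is not code from the paper; it is a minimal illustration, with made-up array sizes and random data, of how a 4D light field can be refocused computationally.

```python
import numpy as np

def refocus(light_field: np.ndarray, shift: float) -> np.ndarray:
    """Synthetically refocus a 4D light field by shift-and-sum.

    light_field: array of shape (U, V, H, W) holding one sub-aperture
    (directional) image per (u, v) sample; grayscale for simplicity.
    shift: per-unit-aperture displacement in pixels; 0 keeps the
    captured focal plane, larger values refocus to other depths.
    """
    U, V, H, W = light_field.shape
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Displace each sub-aperture view in proportion to its offset
            # from the aperture centre, then accumulate. np.roll wraps at
            # the image edges, which is a simplification for this sketch.
            du = int(round(shift * (u - (U - 1) / 2)))
            dv = int(round(shift * (v - (V - 1) / 2)))
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)

# Toy example: a 5x5 grid of 64x64 directional views (random data here).
lf = np.random.rand(5, 5, 64, 64)
focused_at_capture_plane = refocus(lf, shift=0.0)
refocused_nearer = refocus(lf, shift=1.5)
```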
a Conceptual sketch of the proposed light-field imaging camera. b Schematic diagram of the working principle of the system, with the metalens array achieving spin-dependent bifocal light-field imaging; either the LCP component of a close object or the RCP component of a distant object can be focused well on the identical imaging plane. The nominal distance between the primary lens and metalens array is […]. c The captured PSFs at different depths for LCP, RCP, and UP (unpolarized) incident light. d Demonstration of the working range for different polarization states; the light-blue and light-red regions represent the working ranges of the LCP and RCP components, respectively. The vertical axis represents the PSF rank, where a smaller value corresponds to better imaging quality. The uncertainties are standard deviations over repeated measurements (six in total).
There are different approaches to lenses for light field cameras; one option is to place a microlens array at the focal plane, which is the approach Lytro took with its light field camera. The microlens approach works well for depth of field but limits spatial resolution, and moving the lenses further from the focal plane improves resolution at the cost of depth of field. The researchers, inspired by Dalmanitina socialis, describe their design:
Here, inspired by the optical structure of bifocal compound eyes found in Dalmanitina socialis, we demonstrate a nanophotonic camera incorporating a spin-multiplexed metalens array able to achieve high-resolution light-field imaging with a record DoF. The proposed spin-multiplexed metalens array provides two completely decoupled transmission modulations to a pair of orthogonal circular polarization inputs, and thus can simultaneously capture light-field information for both close and distant depth ranges while maintaining high lateral spatial resolution. Consequently, light-field information over large DoF can be computationally reconstructed from a single exposure. In addition, inspired by the biological neural aberration compensation mechanism, we introduce a distortion-correction neural network to eliminate the aberrations, which significantly relaxes the design and performance limitations on metasurface optics. As a result, the proposed camera system is capable of achieving full-color light-field imaging with a continuous DoF ranging from 3 cm to 1.7 km.
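A quick way to see why giving the two circular polarizations different focal lengths extends the usable depth range is the thin-lens relation 1/f = 1/o + 1/i: with the sensor plane fixed, each focal length brings a different object distance into sharp focus. The Python sketch below runs that arithmetic with purely illustrative numbers; the focal lengths and lens-to-sensor spacing are assumptions, not the paper's design values.

```python
# Minimal sketch of the bifocal idea using the thin-lens equation,
# 1/f = 1/object_distance + 1/image_distance.
# All numbers below are illustrative, not the paper's parameters.

def in_focus_object_distance(f_mm: float, image_mm: float) -> float:
    """Object distance sharply imaged onto a sensor placed image_mm
    behind a thin lens of focal length f_mm."""
    return 1.0 / (1.0 / f_mm - 1.0 / image_mm)

sensor_distance_mm = 40.0   # fixed lens-to-sensor spacing (assumed)
f_lcp_mm = 39.0             # focal length seen by LCP light (assumed)
f_rcp_mm = 39.9             # focal length seen by RCP light (assumed)

print(in_focus_object_distance(f_lcp_mm, sensor_distance_mm))  # 1560 mm: LCP favours nearer objects
print(in_focus_object_distance(f_rcp_mm, sensor_distance_mm))  # 15960 mm: RCP favours distant objects
```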
Returning to the term 'metalens' mentioned earlier: this refers to a special optical design that is flat but uses 'metasurfaces' to focus light. The metasurfaces operate at a sub-wavelength level, using extremely tiny structures to scatter light. Millions of these nanostructures, each about two-hundredths the diameter of a human hair, are arranged so that light travels through different parts of the lens. The nanostructures, inspired by the ancient trilobite's compound eyes, bend light from near and far objects onto a single focal plane.
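One common recipe for this kind of spin-multiplexed metasurface, described in the metasurface polarization-optics literature, treats every nanopillar as a tiny half-wave plate: its two birefringent phases and its in-plane rotation angle together encode one phase profile for left-circularly polarized light and an independent profile for right-circularly polarized light. The sketch below computes two hyperbolic lens phases with different focal lengths and the per-pillar parameters that recipe prescribes; all dimensions are assumed for illustration and this is not the paper's actual design.

```python
import numpy as np

wavelength = 530e-9            # design wavelength, metres (assumed)
f_lcp, f_rcp = 1.0e-3, 2.0e-3  # two focal lengths, metres (assumed)
pitch = 400e-9                 # nanopillar spacing (assumed)
n = 201                        # pillars per side

x = (np.arange(n) - n // 2) * pitch
X, Y = np.meshgrid(x, x)
r2 = X**2 + Y**2

def lens_phase(f):
    """Hyperbolic phase profile that focuses a normally incident plane
    wave to a focal distance f."""
    return (2 * np.pi / wavelength) * (f - np.sqrt(r2 + f**2))

phi_lcp = lens_phase(f_lcp)    # phase profile imposed on LCP light
phi_rcp = lens_phase(f_rcp)    # phase profile imposed on RCP light

# Standard Jones-matrix recipe for spin-decoupled metasurfaces: each
# pillar acts as a half-wave plate whose birefringent phases and
# rotation encode both target profiles simultaneously.
delta_x = (phi_lcp + phi_rcp) / 2   # phase along the pillar's fast axis
delta_y = delta_x - np.pi           # slow axis, half-wave condition
theta = (phi_lcp - phi_rcp) / 4     # in-plane rotation angle (radians)

print(theta.shape)  # (201, 201): one rotation angle per nanopillar
```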
a PSF capture and training-data generation. b Aberration removal with the proposed multiscale deep convolutional neural network. The distance between the primary lens and the Matryoshka nesting dolls: 0.3 […]
The raw image on its own offers decent results, but once the multiscale convolutional neural network corrects for aberrations, the final image is impressively sharp. Optics and software working hand in hand is typical of light field cameras; what is atypical, compared with prior cameras, is the impressive resolution paired with extreme depth of field.
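The paper details its own multiscale network, which is not reproduced here. As a rough illustration of the general idea (restore a downsampled copy of the image first, then let a finer-scale branch refine the result, with each branch predicting a residual correction), here is a minimal PyTorch sketch with hypothetical layer sizes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleBranch(nn.Module):
    """Small convolutional branch predicting a residual correction for
    the aberrated image at one spatial scale."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # residual learning: predict the correction

class MultiScaleDeblur(nn.Module):
    """Toy multiscale aberration-correction network: restore a coarse
    (downsampled) copy first, upsample it, then refine at full
    resolution. The paper's actual architecture differs."""
    def __init__(self):
        super().__init__()
        self.coarse = ScaleBranch()
        self.fine = ScaleBranch()

    def forward(self, aberrated):
        half = F.interpolate(aberrated, scale_factor=0.5,
                             mode="bilinear", align_corners=False)
        restored_half = self.coarse(half)
        upsampled = F.interpolate(restored_half, size=aberrated.shape[-2:],
                                  mode="bilinear", align_corners=False)
        # Blend the coarse estimate with the full-resolution input
        # before the fine-scale refinement.
        return self.fine((aberrated + upsampled) / 2)

model = MultiScaleDeblur()
dummy_subimage = torch.rand(1, 3, 128, 128)  # stand-in for one light-field subimage
corrected = model(dummy_subimage)
print(corrected.shape)  # torch.Size([1, 3, 128, 128])
```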
a, b Captured light-field subimages of the whole scene under natural light (a) before and (b) after aberration correction. c, d Zoomed-in subimages of different objects corresponding to the marked ones shown in (a, b), respectively. e Aberration-corrected all-in-focus image after rendering. The reconstructed NJU characters have been reasonably shifted and scaled for easy viewing.
There is much more detail in the full research paper at Nature. The research paper is titled 'Trilobite-inspired neural nanophotonic light-field camera with extreme depth-of-field' and its authors are Qingbin Fan, Weizhu Xu, Wenqi Zhu, Tao Yue, Cheng Zhang, Feng Yan, Lu Chen, Henri J. Lezec, Yanqing Lu, Amit Agrawal and Ting Xu.
All figures and images are from the research paper, 'Trilobite-inspired neural nanophotonic light-field camera with extreme depth-of-field.'
dpreview.com, 2022-05-05