This new imaging technology breaks the rules of optics
- Date: January 11, 2026
- Source: University of Connecticut
- Summary: Scientists have unveiled a new way to capture ultra-sharp optical images without lenses or painstaking alignment. The approach uses multiple sensors to collect raw light patterns independently, then synchronizes them later using computation. This sidesteps long-standing physical limits that have held optical imaging back for decades. The result is wide-field, sub-micron resolution from distances that were previously impossible.
Imaging tools have dramatically reshaped how scientists study the world, from charting faraway galaxies with radio telescope networks to revealing intricate structures inside living cells. Even with decades of progress, one major obstacle has remained. At optical wavelengths, it has been extremely difficult to capture images that are both highly detailed and cover a wide area without relying on bulky lenses or ultra-precise physical alignment.
A newly published study in Nature Communications offers a possible way forward. The work was led by Guoan Zheng, a biomedical engineering professor and director of the UConn Center for Biomedical and Bioengineering Innovation (CBBI), along with his research team at the University of Connecticut College of Engineering. Their findings introduce a new imaging approach that could reshape how optical systems are designed and used across science, medicine, and industry.
Why Synthetic Aperture Imaging Falls Short in Optics
"At the heart of this breakthrough is a longstanding technical problem," said Zheng. "Synthetic aperture imaging -- the method that allowed the Event Horizon Telescope to image a black hole -- works by coherently combining measurements from multiple separated sensors to simulate a much larger imaging aperture."
This strategy has been highly successful in radio astronomy because radio waves have long wavelengths, making it feasible to precisely synchronize signals collected by widely spaced sensors. Visible light, however, operates on a much smaller scale. At those wavelengths, the physical precision required to keep multiple sensors perfectly synchronized becomes extraordinarily difficult, if not impossible, to achieve using conventional methods.
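To see why wavelength matters, consider the path-length stability needed to keep widely separated sensors phase-coherent. A rough Python sketch makes the scales concrete; the lambda/10 tolerance used here is a common rule of thumb, not a figure from the study:

```python
# Illustrative comparison of the path-length stability needed for coherent
# combination of signals from separated sensors. The lambda/10 criterion is
# a standard rule of thumb, not a number taken from the paper.

def coherence_tolerance_m(wavelength_m: float, fraction: float = 0.1) -> float:
    """Path-length stability (meters) needed to stay phase-coherent."""
    return wavelength_m * fraction

radio_mm_wave = 1.3e-3   # ~1.3 mm, the wavelength used by the Event Horizon Telescope
visible_green = 500e-9   # ~500 nm, a typical visible wavelength

tol_radio = coherence_tolerance_m(radio_mm_wave)
tol_visible = coherence_tolerance_m(visible_green)

print(f"radio tolerance:   {tol_radio * 1e6:.1f} micrometers")
print(f"optical tolerance: {tol_visible * 1e9:.1f} nanometers")
print(f"optical is {tol_radio / tol_visible:.0f}x tighter")
```

The same fractional-wavelength requirement that translates to comfortable micrometer-scale tolerances at radio wavelengths demands tens-of-nanometers stability at optical wavelengths, which is why hardware-based coherence has been so hard to achieve.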
MASI and a Software-First Approach to Synchronization
The Multiscale Aperture Synthesis Imager (MASI) takes a fundamentally different approach to this challenge. Instead of demanding that optical sensors remain in exact physical alignment, MASI allows each sensor to collect light independently. Advanced computational algorithms are then used to synchronize the data after the measurements are complete.
Zheng compares the idea to a group of photographers capturing the same scene. Rather than taking traditional pictures, each photographer records raw information about how light waves behave. Software then combines these separate measurements into a single, extremely high-resolution image.
By handling phase synchronization computationally, MASI avoids the rigid interferometric setups that have long limited the practicality of optical synthetic aperture systems.
How Lens-Free Imaging Works in MASI
MASI departs from traditional optical imaging in two major ways. First, it eliminates lenses altogether. Instead of focusing light through glass, the system uses an array of coded sensors placed at different locations within a diffraction plane. Each sensor records diffraction patterns, which describe how light waves spread after interacting with an object. These patterns contain both amplitude and phase information that can later be recovered using computational techniques.
After each sensor's complex wavefield is reconstructed, the system digitally extends the data and mathematically propagates the wavefields back to the object plane. A computational phase synchronization process then adjusts the relative phase differences among the sensors. This iterative optimization increases coherence and concentrates energy in the final reconstructed image.
This software-based alignment is the central innovation. By replacing physical precision with computational optimization, MASI sidesteps the aperture-size and working-distance constraints that have traditionally governed optical imaging systems: resolution is set by the large synthetic aperture rather than by any single physical element.
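The propagation and synchronization steps described above can be sketched in Python. This is a heavily simplified illustration, not the paper's algorithm: each sensor is reduced to an already-recovered complex wavefield on a shared grid, and synchronization is reduced to finding one unknown global phase per sensor. All function and variable names are our own.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex wavefield a distance z
    using the standard angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                  # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1.0 / wavelength**2 - FX**2 - FY**2, 0.0)  # drop evanescent modes
    kz = 2.0 * np.pi * np.sqrt(arg)
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def synchronize_phases(fields, sweeps=10):
    """Coordinate-ascent search for one global phase per field that
    maximizes the energy of the coherent sum -- a toy stand-in for
    iterative phase synchronization across sensors."""
    phases = np.zeros(len(fields))
    for _ in range(sweeps):
        for i, f in enumerate(fields):
            others = sum(g * np.exp(1j * p)
                         for j, (g, p) in enumerate(zip(fields, phases)) if j != i)
            phases[i] = np.angle(np.vdot(f, others))  # best alignment with the rest
    return phases
```

Each coordinate update has a closed form (the phase that aligns one field with the sum of the others), so the coherent-sum energy increases monotonically, mirroring the description of an iterative optimization that concentrates energy in the reconstruction.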
A Virtual Aperture With Sub-Micron Resolution
The outcome is a virtual synthetic aperture that is far larger than any individual sensor. This enables imaging with sub-micron resolution while still covering a wide field of view, all without the use of lenses.
Traditional lenses used in microscopes, cameras, and telescopes force engineers to make trade-offs. Achieving higher resolution usually means placing the lens extremely close to the object, sometimes just millimeters away. That short working distance can make imaging difficult, impractical, or even invasive in certain applications.
MASI removes that limitation by capturing diffraction patterns from distances measured in centimeters. The system can still reconstruct images with sub-micron detail. Zheng likens this to examining the tiny ridges of a human hair from across a desk instead of holding it just inches from your eye.
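A back-of-the-envelope estimate shows how a larger synthetic aperture buys resolution at a distance. The Abbe-style formula and the specific numbers below are illustrative assumptions, not values reported in the paper:

```python
# Rough link between aperture size, standoff distance, and achievable
# resolution. Formula and numbers are illustrative, not from the paper.

def abbe_resolution_m(wavelength_m, aperture_m, distance_m):
    """Abbe-style estimate: resolution = wavelength / (2 * NA),
    with NA the half-aperture angle as seen from the object."""
    half = aperture_m / 2.0
    na = half / (half**2 + distance_m**2) ** 0.5
    return wavelength_m / (2.0 * na)

wavelength = 500e-9   # green light
distance = 0.05       # 5 cm standoff

single_sensor = abbe_resolution_m(wavelength, 0.01, distance)  # 1 cm aperture
synthetic = abbe_resolution_m(wavelength, 0.05, distance)      # 5 cm synthetic aperture

print(f"1 cm aperture at 5 cm standoff: {single_sensor * 1e6:.2f} um")
print(f"5 cm synthetic aperture:        {synthetic * 1e6:.2f} um")
```

Under these assumed numbers, a single centimeter-scale aperture at a 5 cm standoff resolves only a few microns, while a synthetic aperture several times wider crosses into sub-micron territory, which is the regime the study describes.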
Scalable Imaging Across Science and Industry
"The potential applications for MASI span multiple fields, from forensic science and medical diagnostics to industrial inspection and remote sensing," said Zheng. "But what's most exciting is the scalability -- unlike traditional optics that become exponentially more complex as they grow, our system scales linearly, potentially enabling large arrays for applications we haven't even imagined yet."
The Multiscale Aperture Synthesis Imager points to a new direction for optical imaging. By separating measurement from synchronization and replacing heavy optical components with software-driven sensor arrays, MASI demonstrates how computation can overcome limits imposed by physical optics. The result is an imaging framework that is flexible, scalable, and capable of delivering high resolution in ways that were previously out of reach.
Story Source:
Materials provided by University of Connecticut. Note: Content may be edited for style and length.
Journal Reference:
- Ruihai Wang, Qianhao Zhao, Tianbo Wang, Mitchell Modarelli, Peter Vouras, Zikun Ma, Zhixuan Hong, Kazunori Hoshino, David Brady, Guoan Zheng. Multiscale aperture synthesis imager. Nature Communications, 2025; 16 (1) DOI: 10.1038/s41467-025-65661-8