Advancements in Coherent Imaging Through Large-Scale Complex-Domain Neural Networks

Computational imaging holds the promise of transforming optical imaging, providing expansive field-of-view and high-resolution capabilities. The joint reconstruction of amplitude and phase, referred to as “coherent imaging” or “holographic imaging,” extends the optical system’s throughput to billions of resolvable spots. This groundbreaking development facilitates crucial insights into cellular and molecular structures for biomedical research.

Despite its potential, current large-scale coherent imaging methods face obstacles to widespread clinical adoption. Many of these techniques require multiple scanning or modulation steps, prolonging data collection in order to achieve high resolution and signal-to-noise ratio. The result is a slow imaging process whose practicality in clinical settings is limited by inherent tradeoffs between speed, resolution, and quality.

Recent advancements in image denoising offer a possible solution: applying denoising algorithms during iterative reconstruction to enhance imaging quality from sparse data. However, traditional methods are computationally intensive, and deep learning-based techniques often generalize poorly and sacrifice image detail.

Complex-domain neural network empowers large-scale coherent imaging. Credit: Xuyang Chang.

In a study featured in Advanced Photonics Nexus, a team of researchers from the Beijing Institute of Technology, the California Institute of Technology, and the University of Connecticut showcased a sophisticated neural network operating in the complex domain, substantially elevating the capabilities of large-scale coherent imaging. This breakthrough introduces new avenues for achieving high-quality coherent imaging with low sampling in diverse modalities. The approach leverages latent coupling information between amplitude and phase components, generating multidimensional representations of complex wavefronts.
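To make the amplitude-phase coupling concrete, here is a minimal sketch (hypothetical code, not from the paper) of how amplitude and phase combine into a single complex-valued wavefront that a complex-domain network can process jointly, rather than as two independent real channels:

```python
import torch

# Hypothetical amplitude and phase maps (any H x W works).
amplitude = torch.rand(1, 1, 256, 256)             # normalized amplitude
phase = torch.rand(1, 1, 256, 256) * 2 * torch.pi  # phase in [0, 2*pi)

# Euler's formula: A * exp(i*phi) keeps amplitude and phase entangled
# in one complex64 tensor, preserving their latent coupling.
wavefront = amplitude * torch.exp(1j * phase)
print(wavefront.dtype)  # torch.complex64
```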

The framework exhibits robustness and excellent generalization across a spectrum of coherent imaging modalities. The researchers devised a network incorporating a two-dimensional complex convolution unit and a complex activation function. Additionally, they formulated a comprehensive multi-source noise model tailored for coherent imaging, encompassing speckle noise, Poisson noise, Gaussian noise, and super-resolution reconstruction noise.
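A standard way to build such layers (following the deep complex networks construction of Trabelsi et al.; the paper's exact layer definitions may differ) is to expand one complex convolution into four real convolutions and apply the nonlinearity to the real and imaginary parts separately. A minimal PyTorch sketch:

```python
import torch
import torch.nn as nn

class ComplexConv2d(nn.Module):
    """(W_r + i*W_i) * (x_r + i*x_i), expanded into real convolutions."""
    def __init__(self, in_ch, out_ch, kernel_size, padding=0):
        super().__init__()
        self.conv_r = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)
        self.conv_i = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)

    def forward(self, x):  # x: a complex64 tensor
        real = self.conv_r(x.real) - self.conv_i(x.imag)
        imag = self.conv_r(x.imag) + self.conv_i(x.real)
        return torch.complex(real, imag)

class CReLU(nn.Module):
    """One common complex activation: ReLU on real and imaginary parts."""
    def forward(self, x):
        return torch.complex(torch.relu(x.real), torch.relu(x.imag))

# Usage: one complex conv block on a 256x256 complex wavefront.
x = torch.randn(1, 1, 256, 256, dtype=torch.complex64)
block = nn.Sequential(ComplexConv2d(1, 16, 3, padding=1), CReLU())
y = block(x)  # complex64, shape (1, 16, 256, 256)
```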

This multi-source noise model enhances the network's domain adaptation from synthetic to real data. The technique has been applied to various coherent imaging modalities, including Kramers-Kronig relations holography, Fourier ptychographic microscopy, and lensless coded ptychography. Through extensive simulations and experiments, it maintained high-quality, efficient reconstructions while cutting exposure time and data volume by an order of magnitude.
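As an illustration of what such a multi-source noise model might look like, here is a hedged sketch for generating synthetic training measurements. All parameter names and values are assumptions, not the authors' published settings, and the super-resolution reconstruction noise term is omitted because it requires modality-specific downsampling:

```python
import torch

def add_multisource_noise(field, peak=100.0, sigma=0.01, speckle_std=0.1):
    """Corrupt a clean complex field's intensity with speckle, Poisson,
    and Gaussian noise, emulating a low-exposure coherent acquisition."""
    intensity = field.abs() ** 2                       # detector records intensity
    # Multiplicative speckle noise from coherent interference.
    speckle = 1.0 + speckle_std * torch.randn_like(intensity)
    intensity = intensity * speckle.clamp(min=0)
    # Poisson shot noise, scaled by the photon budget (exposure time).
    intensity = torch.poisson(intensity * peak) / peak
    # Additive Gaussian read noise from the sensor electronics.
    intensity = intensity + sigma * torch.randn_like(intensity)
    return intensity.clamp(min=0)

# Usage with a random complex field standing in for a clean wavefront:
noisy = add_multisource_noise(torch.randn(1, 1, 64, 64, dtype=torch.complex64))
```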

These high-quality reconstructions carry substantial implications for subsequent high-level semantic analyses, including precise cell segmentation and virtual staining, potentially catalyzing the advancement of intelligent medical care. The prospect of swift, high-resolution imaging with diminished exposure time and data volume holds potential for real-time cell observation. Furthermore, when integrated with artificial intelligence diagnostics, this technology has the capacity to unveil the intricacies of complex biological systems, pushing the boundaries of medical diagnostics.

This article is republished from Phys.org under a Creative Commons license. Read the original article.
