Learning time-multiplexed phase-coded apertures for snapshot spectral-depth imaging

Opt Express. 2023 Nov 20;31(24):39796-39810. doi: 10.1364/OE.501096.

Abstract

Depth and spectral imaging are essential technologies for a myriad of applications but have conventionally been studied as separate problems. Recent efforts have been made to optically encode spectral-depth (SD) information jointly in a single image sensor measurement, which is subsequently decoded by a computational algorithm. The performance of single-snapshot SD imaging systems depends mainly on the optical modulation function, referred to as codification, and on the computational methods used to recover the SD information from the coded measurement. The optical modulation has conventionally been realized using coded apertures (CAs), phase masks, prisms or gratings, and active illumination, among other elements. In this work, we propose an optical modulation (codification) strategy that employs a color-coded aperture (CCA) in conjunction with a time-varying phase-coded aperture and a spatially-varying pixel shutter, thus yielding an effective time-multiplexed coded aperture (TMCA). We show that the proposed TMCA yields a spatially-variant point spread function (PSF) for a given depth in the scene, which in turn makes depth-dependent blur more distinguishable and, therefore, improves the recovery of depth information. Further, the selective filtering of specific spectral bands by the CCA encodes relevant spectral information that is disentangled using a reconstruction algorithm. We leverage deep learning techniques to jointly learn the optical modulation and the computational decoding algorithm in an end-to-end (E2E) framework. We demonstrate via simulations and with a real testbed prototype that the proposed TMCA strategy outperforms state-of-the-art snapshot SD imaging alternatives in both spectral and depth reconstruction quality.
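
To illustrate the E2E idea described above, the following is a minimal, simplified sketch (not the authors' implementation): a differentiable, learnable coded-aperture layer standing in for the TMCA modulation, followed by a small convolutional decoder, optimized jointly. The forward model, tensor shapes, layer sizes, and loss are all illustrative assumptions; the real system also models the phase-coded aperture, the pixel shutter, and depth recovery.

```python
# Minimal E2E sketch: learnable coded-aperture modulation + decoder, trained jointly.
# All shapes, the forward model, and the loss are illustrative assumptions.
import torch
import torch.nn as nn

class TMCALayer(nn.Module):
    """Toy learnable optical modulation: per-shot, per-band coded-aperture weights."""
    def __init__(self, shots=4, bands=8, size=64):
        super().__init__()
        # Logits for a color-coded aperture per time-multiplexed shot (assumption).
        self.logits = nn.Parameter(torch.randn(shots, bands, size, size))

    def forward(self, scene):
        # scene: (batch, bands, H, W) toy spectral cube (assumption).
        code = torch.sigmoid(self.logits)        # constrain code values to [0, 1]
        coded = scene.unsqueeze(1) * code        # apply code per shot: (B, shots, bands, H, W)
        return coded.sum(dim=2)                  # integrate over bands -> (B, shots, H, W)

class Decoder(nn.Module):
    """Toy convolutional decoder recovering the spectral cube from the coded shots."""
    def __init__(self, shots=4, bands=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(shots, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, bands, 3, padding=1),
        )

    def forward(self, meas):
        return self.net(meas)

# Joint (E2E) optimization of the optics (coded aperture) and the decoder.
optics, decoder = TMCALayer(), Decoder()
opt = torch.optim.Adam(list(optics.parameters()) + list(decoder.parameters()), lr=1e-3)
scene = torch.rand(2, 8, 64, 64)                 # toy spectral cube for illustration
for _ in range(10):
    opt.zero_grad()
    loss = nn.functional.mse_loss(decoder(optics(scene)), scene)
    loss.backward()
    opt.step()
```

Because the modulation parameters sit in the same computational graph as the decoder, gradients from the reconstruction loss update both, which is the essence of jointly learning the codification and the recovery algorithm.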