CMOS Computational Camera Sensors for Imaging Through Scattering Media
Computational imaging has exploded as a field over the past three decades, as new methods and techniques have been developed to overcome the limitations of traditional digital imaging. Most of this progress has focused either on pre-sensor optics and illumination or on post-sensor software and data processing; little attention has been given to the sensor itself. This work aims to fill that gap. I present two novel CMOS image sensors for computational cameras, each designed to detect time as a descriptive dimension of light. The first sensor exploits the wave nature of light and the ability to build nano-scale diffraction gratings directly into the integrated circuit. The prototype demonstrates a technique for directly capturing phase shifts of a light field, which can be used in holography or optical coherence tomography applications. The second is based on the particle nature of light and addresses the practical challenges of data bandwidth and silicon footprint in developing megapixel arrays of single-photon sensors. This work builds toward the goal of complete imaging systems that can, in a compact format, see through scattering media such as fog or human tissue.