Unitary Transforms Using Time-Frequency Warping for Digital Holograms of Deep Scenes

This publication appears in: IEEE Transactions on Computational Imaging
Authors: D. Blinder, C. Schretter, H. Ottevaere, A. Munteanu and P. Schelkens
Volume: 4
Issue: 2
Pages: 206-218
Publication Date: Jun. 2018
Abstract: With the advent of ultrahigh-resolution holographic displays, viewing macroscopic deep scenes with large viewing angles becomes a possibility. These deep holograms possess different signal properties than those in common applications, where the scene content is assumed to lie around a single planar slice. Therefore, the conventional approach of refocusing at a fixed depth is ineffective. There is a need for an efficient invertible transformation that can account for the wide depth range of macroscopic three-dimensional scenes. To this end, we derive necessary invertibility conditions for diffraction from nonplanar surfaces under symmetric light propagation kernels, such as Fresnel diffraction. We construct a unitary transform for modeling deep holographic scenes using a generalization of linear canonical transforms. From the symplectic properties of the time-frequency domain, we obtain invertibility conditions for the transforms depending on surface shape, hologram bandwidth, and wavelength. These transforms can subsequently be combined with other sparsifying transforms for compression. Experiments demonstrate one application in lossy hologram coding by implementing a computationally efficient subset of the transforms for piecewise depth profiles, combined with the JPEG 2000 codec. Results show improved reconstruction quality. A significant visual gain is observed, as the depth information is well preserved at identical encoding rates, in contrast to using Fresnel propagation at a fixed depth. This paper shows that it is possible to effectively represent holograms of variable-depth scenes, and that our locally adaptive transform leads to a practical holographic compression framework.
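For context, the sketch below illustrates the fixed-depth Fresnel baseline that the abstract contrasts against, together with a naive block-wise refocusing loop for a piecewise depth profile. This is a minimal illustration under assumed conditions (uniform pixel pitch, paraxial regime); the function names, block size, and tile-stitching scheme are assumptions for exposition and do not reproduce the paper's unitary time-frequency warping transform. The Fresnel kernel itself is unitary (|H| = 1), which is the property the paper generalizes to nonplanar surfaces.

```python
import numpy as np

def fresnel_propagate(field, wavelength, z, pitch):
    """Propagate a complex wavefield over distance z [m] using the
    Fresnel transfer function in the Fourier domain."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)  # spatial frequencies [cycles/m]
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Paraxial (Fresnel) transfer function; |H| == 1, so this operator is
    # unitary and is exactly inverted by propagating over -z.
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def refocus_piecewise(hologram, block_depths, wavelength, pitch, block=64):
    """Naive block-wise refocusing: each output tile is taken from a
    full-frame propagation at that tile's depth. Illustrative only; unlike
    the paper's transform, stitching tiles this way is not globally unitary."""
    out = np.empty_like(hologram)
    for (i, j), z in block_depths.items():
        refocused = fresnel_propagate(hologram, wavelength, z, pitch)
        out[i:i + block, j:j + block] = refocused[i:i + block, j:j + block]
    return out
```

As a usage sketch, a 2048x2048 hologram with 1 um pixel pitch at 633 nm could be refocused at a single depth with `fresnel_propagate(holo, 633e-9, 0.05, 1e-6)`, or per tile with `refocus_piecewise` given a dictionary mapping tile corners to depths; a depth-adaptive transform of this kind is what enables the compression gains reported in the abstract.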