Depth-Based View Synthesis Using Pixel-level Image Inpainting
Host Publication: 18th International Conference on Digital Signal Processing (DSP 2013)
Authors: S. Lu, J. Hanca, A. Munteanu and P. Schelkens
Publisher: IEEE
Publication Date: Jul. 2013
Number of Pages: 6
ISBN: 978-1-4673-5806-4
Abstract: Depth-based view synthesis can produce novel realistic images of a scene by view warping and image inpainting. This paper presents a depth-based view synthesis approach that performs pixel-level image inpainting. The proposed approach offers fine-grained flexibility in pixel manipulation and prevents random artifacts in texture propagation. By analyzing how image holes are generated during view warping, we first classify such areas into simple holes and disocclusion areas. Based on depth constraints and distinct randomized propagation strategies, a pixel-level inpainting method based on approximate nearest-neighbor matching is introduced to complete holes of both classes. Experimental results demonstrate that the proposed view synthesis method effectively produces smooth textures and plausible structure propagation. The proposed depth-based pixel-level inpainting is well suited to multi-view video and other higher-dimensional view synthesis settings.
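To make the hole taxonomy in the abstract concrete, the sketch below shows one plausible way to separate simple warping cracks from disocclusion areas using the depth map, followed by a deliberately naive background-biased fill that stands in for the approximate nearest-neighbor matching the authors describe. The function names, the depth-jump threshold, and the convention that larger depth values mean farther surfaces are assumptions made for illustration, not the paper's implementation.

```python
import numpy as np

def classify_holes(depth, hole_mask, depth_jump=0.1):
    """Split warping holes into simple holes and disocclusion areas.

    Assumed heuristic (not the paper's exact rule): a hole pixel whose
    known neighbours span a large depth discontinuity is treated as a
    disocclusion; the remaining hole pixels are simple holes (cracks).
    """
    h, w = depth.shape
    simple = np.zeros_like(hole_mask, dtype=bool)
    disocc = np.zeros_like(hole_mask, dtype=bool)
    for y, x in zip(*np.nonzero(hole_mask)):
        nbr = [depth[yy, xx]
               for yy, xx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
               if 0 <= yy < h and 0 <= xx < w and not hole_mask[yy, xx]]
        if len(nbr) >= 2 and max(nbr) - min(nbr) > depth_jump:
            disocc[y, x] = True   # depth jump across the hole: uncovered background
        else:
            simple[y, x] = True   # thin crack inside a single surface
    return simple, disocc

def fill_holes(image, depth, simple, disocc):
    """Naive pixel-level fill standing in for the ANN-matching inpainting
    of the paper: simple holes take the mean of their known neighbours,
    while disocclusion pixels copy their farthest-depth (background)
    neighbour so foreground colours do not bleed into uncovered areas.
    """
    out, dep = image.astype(float), depth.copy()
    h, w = depth.shape
    remaining = simple | disocc
    while remaining.any():
        progressed = False
        for y, x in zip(*np.nonzero(remaining)):
            nbrs = [(dep[yy, xx], yy, xx)
                    for yy, xx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= yy < h and 0 <= xx < w and not remaining[yy, xx]]
            if not nbrs:
                continue
            if disocc[y, x]:
                _, yy, xx = max(nbrs)   # farthest (background) neighbour
                out[y, x], dep[y, x] = out[yy, xx], dep[yy, xx]
            else:
                out[y, x] = np.mean([out[yy, xx] for _, yy, xx in nbrs], axis=0)
                dep[y, x] = np.mean([d for d, _, _ in nbrs])
            remaining[y, x] = False
            progressed = True
        if not progressed:
            break   # hole region with no known neighbours at all
    return out, dep
```

The background bias for disocclusion pixels reflects the intuition, stated in the abstract, that such holes expose surfaces behind the foreground object; the paper's actual method replaces the per-pixel copy above with approximate nearest-neighbor matching under depth constraints.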