Dense View Interpolation on Mobile Devices using Focal Stacks

Parikshit Sakurikar, P. J. Narayanan; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2014, pp. 138-143

Abstract


Light field rendering is a widely used technique for generating views of a scene from novel viewpoints. Interpolative methods for light field rendering require a dense description of the scene in the form of closely spaced images. In this work, we present a simple method for dense view interpolation over general static scenes using commonly available mobile devices. We capture an approximate focal stack of the scene from adjacent camera locations and interpolate intermediate images by shifting each focal region according to its disparity. We do not rely on focus distance control to capture focal stacks; instead, we describe an automatic method for estimating the focal textures and the blur and disparity parameters required for view interpolation.
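
As a rough illustration of the interpolation step described in the abstract, the following Python sketch shifts each focal-stack slice by a fraction of its disparity and composites the shifted slices using per-slice in-focus weights. The names (focal_slices, focus_masks, disparities, alpha) and the simple wrap-around shift are hypothetical choices for the sketch, not the paper's actual implementation; the paper estimates focal textures, blur, and disparity parameters automatically, which is not reproduced here.

import numpy as np

def interpolate_view(focal_slices, focus_masks, disparities, alpha):
    """Sketch of disparity-based view interpolation from a focal stack.

    focal_slices : list of HxWx3 float arrays, one per focal region (assumed)
    focus_masks  : list of HxW float arrays giving each slice's in-focus weight (assumed)
    disparities  : list of per-slice horizontal disparities, in pixels, between
                   the two captured camera locations (assumed)
    alpha        : interpolation factor in [0, 1] for the intermediate viewpoint
    """
    h, w, _ = focal_slices[0].shape
    out = np.zeros((h, w, 3), dtype=np.float64)
    weight = np.zeros((h, w), dtype=np.float64)

    for slice_img, mask, d in zip(focal_slices, focus_masks, disparities):
        # Shift this focal region proportionally to its disparity.
        shift = int(round(alpha * d))
        # np.roll wraps around at the border; a real implementation would
        # pad or inpaint the disoccluded strip instead.
        shifted = np.roll(slice_img, shift, axis=1)
        shifted_mask = np.roll(mask, shift, axis=1)
        out += shifted * shifted_mask[..., None]
        weight += shifted_mask

    # Normalize by the accumulated in-focus weights.
    return out / np.maximum(weight, 1e-6)[..., None]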

Related Material


[pdf]
[bibtex]
@InProceedings{Sakurikar_2014_CVPR_Workshops,
author = {Sakurikar, Parikshit and Narayanan, P. J.},
title = {Dense View Interpolation on Mobile Devices using Focal Stacks},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2014}
}