Deblurring by Example Using Dense Correspondence

Yoav Hacohen, Eli Shechtman, Dani Lischinski; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2013, pp. 2384-2391

Abstract


This paper presents a new method for deblurring photos using a sharp reference example that contains some shared content with the blurry photo. Most previous deblurring methods that exploit information from other photos require an accurately registered photo of the same static scene. In contrast, our method aims to exploit reference images where the shared content may have undergone substantial photometric and non-rigid geometric transformations, as these are the kind of reference images most likely to be found in personal photo albums. Our approach builds upon a recent method for example-based deblurring using non-rigid dense correspondence (NRDC) [11] and extends it in two ways. First, we suggest exploiting information from the reference image not only for blur kernel estimation, but also as a powerful local prior for the non-blind deconvolution step. Second, we introduce a simple yet robust technique for spatially varying blur estimation, rather than assuming spatially uniform blur. Unlike the above previous method, which has proven successful only in simple deblurring scenarios, we demonstrate that our method succeeds on a variety of real-world examples. We provide quantitative and qualitative evaluation of our method and show that it outperforms the state of the art.
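
The abstract's first contribution is to use the reference not only for kernel estimation but also as a local prior during non-blind deconvolution. The sketch below is a rough illustration of that general idea, not the paper's actual formulation (which relies on NRDC correspondences and spatially weighted, region-wise priors): a frequency-domain least-squares deconvolution with a quadratic pull toward a globally aligned reference image. The function names, the lam weight, and the assumption of a pre-aligned reference are all illustrative.

import numpy as np

def psf2otf(kernel, shape):
    # Pad the blur kernel to image size and circularly shift it so its
    # center sits at the origin; the FFT then matches circular convolution.
    otf = np.zeros(shape)
    kh, kw = kernel.shape
    otf[:kh, :kw] = kernel
    otf = np.roll(otf, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.fft.fft2(otf)

def deconvolve_with_reference(blurry, kernel, reference, lam=0.02):
    # Closed-form minimizer of ||k * x - b||^2 + lam * ||x - r||^2,
    # i.e. non-blind deconvolution regularized toward a reference image r.
    K = psf2otf(kernel, blurry.shape)
    B = np.fft.fft2(blurry)
    R = np.fft.fft2(reference)
    X = (np.conj(K) * B + lam * R) / (np.abs(K) ** 2 + lam)
    return np.real(np.fft.ifft2(X))

if __name__ == "__main__":
    # Toy usage: blur a synthetic image, then restore it with the prior.
    rng = np.random.default_rng(0)
    sharp = rng.random((64, 64))
    kernel = np.ones((5, 5)) / 25.0  # box-blur PSF
    blurry = np.real(np.fft.ifft2(psf2otf(kernel, sharp.shape) * np.fft.fft2(sharp)))
    restored = deconvolve_with_reference(blurry, kernel, sharp)
    print("MSE:", np.mean((restored - sharp) ** 2))

In this toy setup the reference is the sharp image itself, so the prior simply demonstrates the mechanism; in the paper's setting the reference shares only partial content and the prior must be applied locally, only where correspondences are found.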

Related Material


[pdf]
[bibtex]
@InProceedings{Hacohen_2013_ICCV,
author = {Hacohen, Yoav and Shechtman, Eli and Lischinski, Dani},
title = {Deblurring by Example Using Dense Correspondence},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2013}
}