Slice Sampling Particle Belief Propagation

Oliver Muller, Michael Ying Yang, Bodo Rosenhahn; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2013, pp. 1129-1136

Abstract


Inference in continuous label Markov random fields is a challenging task. We use particle belief propagation (PBP) for solving the inference problem in continuous label space. Sampling particles from the belief distribution is typically done by using Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) methods, which involve sampling from a proposal distribution. This proposal distribution has to be carefully designed depending on the particular model and input data to achieve fast convergence. We propose to avoid dependence on a proposal distribution by introducing a slice sampling based PBP algorithm. The proposed approach shows superior convergence performance on an image denoising toy example. Our findings are validated on a challenging relational 2D feature tracking application.
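To illustrate the core idea of avoiding a hand-tuned proposal distribution, the sketch below shows a generic univariate slice sampling update (stepping-out and shrinkage, following Neal's standard scheme), not the paper's exact algorithm. The function name, the bracket width w, and the use of NumPy are assumptions for illustration; within PBP, log_density would correspond to the (unnormalised) log belief of a node given the incoming messages.

import numpy as np

def slice_sample_step(x, log_density, w=1.0, max_steps=50, rng=None):
    # One univariate slice sampling update.
    # x           : current sample (float)
    # log_density : callable returning the unnormalised log density
    # w           : initial bracket width (a tuning parameter, but the
    #               sampler remains valid for any positive choice)
    rng = rng or np.random.default_rng()

    # 1. Draw the auxiliary "height" variable defining the slice.
    log_y = log_density(x) + np.log(rng.uniform())

    # 2. Step out: place a bracket [left, right] around x and widen it
    #    until both ends fall outside the slice.
    left = x - w * rng.uniform()
    right = left + w
    for _ in range(max_steps):
        if log_density(left) < log_y:
            break
        left -= w
    for _ in range(max_steps):
        if log_density(right) < log_y:
            break
        right += w

    # 3. Shrinkage: sample uniformly from the bracket, shrinking it
    #    towards x whenever the candidate falls outside the slice.
    while True:
        x_new = rng.uniform(left, right)
        if log_density(x_new) >= log_y:
            return x_new
        if x_new < x:
            left = x_new
        else:
            right = x_new

In contrast to an MH update, the only parameter here is the bracket width w, and the update remains correct regardless of its value, which is the practical advantage the abstract refers to when it argues for removing the dependence on a model-specific proposal distribution.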

Related Material


[pdf]
[bibtex]
@InProceedings{Muller_2013_ICCV,
author = {Muller, Oliver and Yang, Michael Ying and Rosenhahn, Bodo},
title = {Slice Sampling Particle Belief Propagation},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2013}
}