HybridFusion: Real-Time Performance Capture Using a Single Depth Sensor and Sparse IMUs

Zerong Zheng, Tao Yu, Hao Li, Kaiwen Guo, Qionghai Dai, Lu Fang, Yebin Liu; Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 384-400

Abstract


We propose a light-weight and highly robust real-time human performance capture method based on a single depth camera and sparse inertial measurement units (IMUs). The proposed method combines non-rigid surface tracking and volumetric surface fusion to simultaneously reconstruct challenging motions, detailed geometries, and the inner human body shape of a clothed subject. The proposed hybrid motion tracking algorithm and efficient per-frame sensor calibration technique enable non-rigid surface reconstruction for fast motions and challenging poses with severe occlusions. Significant fusion artifacts are reduced using a new confidence measurement for our adaptive TSDF-based fusion. These contributions are mutually beneficial in our reconstruction system, which enables practical human performance capture that is real-time, robust, low-cost, and easy to deploy. Our experiments show how extremely challenging performances and loop-closure problems can be handled successfully.
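For readers unfamiliar with TSDF-based fusion, the sketch below illustrates the generic confidence-weighted voxel update that this style of adaptive fusion builds on. The truncation band, weight cap, and the confidence term itself are placeholder assumptions for illustration only; the paper's actual confidence measurement and fusion scheme are not reproduced here.

```python
import numpy as np

TRUNC = 0.03   # truncation band in meters (assumed value)
W_MAX = 64.0   # cap on the accumulated weight (assumed value)

def fuse_voxel(tsdf, weight, sdf_obs, confidence):
    """Confidence-weighted running average for a single voxel.

    tsdf, weight : current truncated signed distance and accumulated weight.
    sdf_obs      : signed distance from the voxel to the newly observed surface.
    confidence   : per-observation confidence in [0, 1]; in HybridFusion this
                   would come from the paper's confidence measurement, which
                   is not reproduced here.
    """
    if sdf_obs < -TRUNC:           # voxel far behind the observed surface: skip
        return tsdf, weight
    d = min(1.0, sdf_obs / TRUNC)  # truncate and normalize to [-1, 1]
    w = confidence                 # low-confidence observations contribute less
    new_weight = weight + w
    new_tsdf = (tsdf * weight + d * w) / max(new_weight, 1e-6)
    return new_tsdf, min(new_weight, W_MAX)

# Example: a voxel 1 cm in front of the surface, observed with high confidence.
print(fuse_voxel(tsdf=0.0, weight=0.0, sdf_obs=0.01, confidence=0.9))
```

Down-weighting low-confidence observations in the running average is what keeps unreliable depth samples (e.g., from fast motion or poor alignment) from corrupting the fused surface.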

Related Material


[bibtex]
@InProceedings{Zheng_2018_ECCV,
  author    = {Zheng, Zerong and Yu, Tao and Li, Hao and Guo, Kaiwen and Dai, Qionghai and Fang, Lu and Liu, Yebin},
  title     = {HybridFusion: Real-Time Performance Capture Using a Single Depth Sensor and Sparse IMUs},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  month     = {September},
  year      = {2018}
}