BLADE: Single-view Body Mesh Estimation through Accurate Depth Estimation

Shengze Wang, Jiefeng Li, Tianye Li, Ye Yuan, Henry Fuchs, Koki Nagano, Shalini De Mello, Michael Stengel; Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR), 2025, pp. 21991-22000

Abstract


Single-image human mesh recovery is a challenging task due to the ill-posed nature of simultaneously estimating body shape, pose, and camera parameters. Existing estimators work well on images taken from afar, but they break down as the person moves close to the camera. Moreover, current methods fail to achieve accurate 3D pose and 2D alignment at the same time. The error is mainly introduced by inaccurate perspective projection parameters that are heuristically derived from orthographic ones. To resolve this long-standing challenge, we present BLADE, a method that accurately recovers perspective parameters from a single image without heuristic assumptions. We start from the inverse relationship between perspective distortion and the person's Z-translation T_z, and we show that T_z can be reliably estimated from the image. We then discuss the important role of T_z in accurate human mesh recovery from close-range images. Finally, we show that, once T_z and the 3D human mesh are estimated, one can accurately recover the focal length and full 3D translation. Extensive experiments on standard benchmarks and real-world close-range images show that our method accurately recovers projection parameters from a single image and consequently attains state-of-the-art accuracy on both 3D pose estimation and 2D alignment for a wide range of images.
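
To make the final step of the abstract concrete, the sketch below (not the authors' implementation) shows how, under a pinhole-camera assumption, a known depth T_z plus 2D-3D correspondences on the estimated mesh reduce the focal length and the remaining translation (t_x, t_y) to a linear least-squares problem. The function name and interface are illustrative assumptions.

```python
import numpy as np

def recover_focal_and_translation(pts3d, pts2d, t_z, principal_point):
    """Illustrative recovery of focal length f and translation (t_x, t_y),
    given body-frame 3D points, their 2D image projections, and a known t_z.

    Assumes a pinhole camera:
        u = f * (X + t_x) / (Z + t_z) + c_x
        v = f * (Y + t_y) / (Z + t_z) + c_y
    Rearranging gives a system that is linear in (f, f*t_x, f*t_y).
    """
    cx, cy = principal_point
    X, Y, Z = pts3d[:, 0], pts3d[:, 1], pts3d[:, 2]
    u, v = pts2d[:, 0], pts2d[:, 1]
    depth = Z + t_z

    n = pts3d.shape[0]
    A = np.zeros((2 * n, 3))
    b = np.zeros(2 * n)
    # Row pairs: (u - c_x) * (Z + t_z) = f*X + f*t_x
    #            (v - c_y) * (Z + t_z) = f*Y + f*t_y
    A[0::2, 0], A[0::2, 1] = X, 1.0
    A[1::2, 0], A[1::2, 2] = Y, 1.0
    b[0::2] = (u - cx) * depth
    b[1::2] = (v - cy) * depth

    (f, ftx, fty), *_ = np.linalg.lstsq(A, b, rcond=None)
    return f, np.array([ftx / f, fty / f, t_z])
```

This only illustrates the abstract's closing claim that, with T_z and the 3D mesh fixed, focal length and full 3D translation become recoverable; the paper's actual estimation procedure may differ. In practice one would use many mesh vertices or joints and a robust solver to handle noisy 2D keypoints.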

Related Material


[bibtex]
@InProceedings{Wang_2025_CVPR,
    author    = {Wang, Shengze and Li, Jiefeng and Li, Tianye and Yuan, Ye and Fuchs, Henry and Nagano, Koki and De Mello, Shalini and Stengel, Michael},
    title     = {BLADE: Single-view Body Mesh Estimation through Accurate Depth Estimation},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {21991-22000}
}