Accurate 3D Hand Pose Estimation for Whole-Body 3D Human Mesh Estimation

Gyeongsik Moon, Hongsuk Choi, Kyoung Mu Lee; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 2308-2317

Abstract


Whole-body 3D human mesh estimation aims to reconstruct the 3D human body, hands, and face simultaneously. Although several methods have been proposed, accurate prediction of 3D hands, which consist of 3D wrists and fingers, remains challenging for two reasons. First, the human kinematic chain has not been carefully considered when predicting the 3D wrists. Second, previous works use body features to predict the 3D fingers, although the body feature contains almost no finger information. To address these limitations, we present Hand4Whole, which has two strong points over previous works. First, we design Pose2Pose, a module that utilizes joint features for 3D joint rotations. Using Pose2Pose, Hand4Whole predicts 3D wrists from hand MCP joint features, as the MCP joints largely determine 3D wrist rotations in the human kinematic chain. Second, Hand4Whole discards the body feature when predicting 3D finger rotations. Our Hand4Whole is trained in an end-to-end manner and produces much better 3D hand results than previous whole-body 3D human mesh estimation methods. The code is available here (https://github.com/mks0601/Hand4Whole_RELEASE).
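The feature-routing idea described above can be illustrated with a minimal, stdlib-only Python sketch. This is a hypothetical toy, not the paper's implementation: all dimensions, layer shapes, and names (`mcp_feats`, `wrist_rot6d`, etc.) are assumptions chosen for illustration. It shows the two design choices in the abstract: the wrist head consumes only the hand MCP joint features, and the finger head consumes only hand-joint features, with the body feature deliberately left unused.

```python
# Hypothetical sketch of the Hand4Whole / Pose2Pose feature routing:
# - wrist rotation is regressed from concatenated MCP joint features
#   (MCP joints largely determine wrist rotation in the kinematic chain),
# - finger rotations are regressed from hand-joint features only,
#   discarding the body feature.
# All sizes are toy values; the real model samples these features from
# image feature maps at predicted joint locations.

import random

FEAT_DIM = 8        # per-joint feature size (toy value)
MCP_JOINTS = 5      # one MCP joint per finger
FINGER_JOINTS = 15  # remaining hand joints (toy value)

def linear(x, w, b):
    """Plain fully connected layer: y = W x + b."""
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(w, b)]

def make_layer(out_dim, in_dim, seed):
    """Random weights standing in for a trained regressor head."""
    rng = random.Random(seed)
    w = [[rng.uniform(-0.1, 0.1) for _ in range(in_dim)]
         for _ in range(out_dim)]
    b = [0.0] * out_dim
    return w, b

# Toy per-joint features (random placeholders for sampled image features).
rng = random.Random(0)
mcp_feats = [rng.gauss(0, 1) for _ in range(MCP_JOINTS * FEAT_DIM)]
finger_feats = [rng.gauss(0, 1) for _ in range(FINGER_JOINTS * FEAT_DIM)]
body_feat = [rng.gauss(0, 1) for _ in range(FEAT_DIM)]  # unused by hand heads

# Wrist head: MCP joint features -> one 6D rotation representation.
wrist_w, wrist_b = make_layer(6, MCP_JOINTS * FEAT_DIM, seed=1)
wrist_rot6d = linear(mcp_feats, wrist_w, wrist_b)

# Finger head: hand-joint features only (no body feature) -> 15 x 6D rotations.
finger_w, finger_b = make_layer(FINGER_JOINTS * 6, FINGER_JOINTS * FEAT_DIM, seed=2)
finger_rot6d = linear(finger_feats, finger_w, finger_b)

print(len(wrist_rot6d), len(finger_rot6d))  # 6 90
```

The point of the sketch is the input routing, not the regressor itself: `body_feat` never reaches either hand head, mirroring the paper's claim that body features carry little finger information.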

Related Material


[bibtex]
@InProceedings{Moon_2022_CVPR,
    author    = {Moon, Gyeongsik and Choi, Hongsuk and Lee, Kyoung Mu},
    title     = {Accurate 3D Hand Pose Estimation for Whole-Body 3D Human Mesh Estimation},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2022},
    pages     = {2308-2317}
}