Toward Describing Human Gaits by Onomatopoeias

Hirotaka Kato, Takatsugu Hirayama, Yasutomo Kawanishi, Keisuke Doman, Ichiro Ide, Daisuke Deguchi, Hiroshi Murase; Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops, 2017, pp. 1573-1580

Abstract


Native Japanese speakers can distinguish gaits based on their appearance and briefly describe them using various onomatopoeias that express their impressions intuitively. Japanese onomatopoeias are said to have sound-symbolism, and their phonemes are strongly related to the impression of a motion. Thus, we considered that if a phonetic space based on sound-symbolism could be associated with the kinetic feature space of gaits, subtle differences between gaits could be expressed as differences in phonemes. This framework is expected to make human-computer interaction more intuitive. In this paper, we propose a method to convert relative body-part movements into onomatopoeias using a deep-learning-based regression model. Through experiments, we confirmed the effectiveness of the proposed method and discussed the potential of describing an arbitrary gait not only by existing onomatopoeias but also by novel ones.
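
The sketch below illustrates the general idea described in the abstract, not the authors' actual implementation: a small regression network maps kinetic gait features to a point in a phonetic (sound-symbolic) space, and the nearest onomatopoeia in that space is returned as the description. All dimensions, the network architecture, the feature definition, and the vocabulary entries are illustrative assumptions, not values taken from the paper.

# A minimal sketch, assuming PyTorch and made-up dimensions/vocabulary.
import torch
import torch.nn as nn

KINETIC_DIM = 36    # size of the relative body-part movement feature (assumed)
PHONETIC_DIM = 8    # dimensionality of the sound-symbolic phonetic space (assumed)

class GaitToPhoneticRegressor(nn.Module):
    """Regresses a gait feature vector onto a phonetic embedding."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(KINETIC_DIM, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, PHONETIC_DIM),
        )

    def forward(self, x):
        return self.net(x)

# Hypothetical onomatopoeia vocabulary with coordinates in the phonetic space.
vocabulary = {
    "suta-suta": torch.randn(PHONETIC_DIM),    # brisk walk (illustrative)
    "yota-yota": torch.randn(PHONETIC_DIM),    # unsteady walk (illustrative)
    "noshi-noshi": torch.randn(PHONETIC_DIM),  # heavy walk (illustrative)
}

def describe_gait(model, kinetic_features):
    """Map a gait feature vector to the nearest known onomatopoeia."""
    with torch.no_grad():
        phonetic = model(kinetic_features)
    return min(vocabulary, key=lambda w: torch.dist(phonetic, vocabulary[w]).item())

if __name__ == "__main__":
    model = GaitToPhoneticRegressor()
    gait = torch.randn(KINETIC_DIM)  # stand-in for extracted gait features
    print(describe_gait(model, gait))

The nearest-neighbor lookup covers only existing onomatopoeias; describing an arbitrary gait by a novel onomatopoeia, as discussed in the paper, would instead require decoding the regressed point in the phonetic space back into a phoneme sequence.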

Related Material


[pdf]
[bibtex]
@InProceedings{Kato_2017_ICCV,
author = {Kato, Hirotaka and Hirayama, Takatsugu and Kawanishi, Yasutomo and Doman, Keisuke and Ide, Ichiro and Deguchi, Daisuke and Murase, Hiroshi},
title = {Toward Describing Human Gaits by Onomatopoeias},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2017}
}