Style-ERD: Responsive and Coherent Online Motion Style Transfer

Tianxin Tao, Xiaohang Zhan, Zhongquan Chen, Michiel van de Panne; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022, pp. 6593-6603

Abstract


Motion style transfer is a common method for enriching character animation. Motion style transfer algorithms are often designed for offline settings where motions are processed in segments. However, for online animation applications, such as real-time avatar animation from motion capture, motions need to be processed as a stream with minimal latency. In this work, we realize a flexible, high-quality motion style transfer method for this setting. We propose a novel style transfer model, Style-ERD, to stylize motions in an online manner with an Encoder-Recurrent-Decoder structure, along with a novel discriminator that combines feature attention and temporal attention. Our method stylizes motions into multiple target styles with a unified model. Although our method targets online settings, it outperforms previous offline methods in motion realism and style expressiveness and provides significant gains in runtime efficiency.
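As a rough illustration of the Encoder-Recurrent-Decoder idea described in the abstract, the sketch below stylizes a motion stream frame by frame while carrying recurrent state between frames, which is what makes low-latency online processing possible. All names, layer sizes, the one-hot style conditioning, and the residual connection are assumptions made for illustration; this is not the paper's actual architecture, and the attention-based discriminator used for training is omitted.

import torch
import torch.nn as nn

# Minimal sketch of an Encoder-Recurrent-Decoder generator for frame-by-frame
# (online) motion style transfer. Layer sizes, the style-conditioning scheme,
# and all names (StyleERDSketch, pose_dim, ...) are illustrative assumptions,
# not the published Style-ERD architecture or hyperparameters.
class StyleERDSketch(nn.Module):
    def __init__(self, pose_dim=69, num_styles=8, hidden_dim=256):
        super().__init__()
        self.num_styles = num_styles
        self.encoder = nn.Sequential(
            nn.Linear(pose_dim + num_styles, hidden_dim),
            nn.ReLU(),
        )
        # The GRU carries temporal state so each frame can be stylized as it arrives.
        self.recurrent = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.decoder = nn.Sequential(
            nn.ReLU(),
            nn.Linear(hidden_dim, pose_dim),
        )

    def forward(self, frame, style_id, hidden=None):
        # frame: (batch, pose_dim) -- a single incoming motion frame.
        style = nn.functional.one_hot(style_id, self.num_styles).float()
        x = self.encoder(torch.cat([frame, style], dim=-1))
        out, hidden = self.recurrent(x.unsqueeze(1), hidden)
        # Residual connection keeps the stylized pose close to the input motion.
        stylized = frame + self.decoder(out.squeeze(1))
        return stylized, hidden


# Streaming usage: process frames one at a time, reusing the recurrent state,
# so only the current frame (not a full segment) is needed at each step.
if __name__ == "__main__":
    model = StyleERDSketch()
    hidden = None
    style_id = torch.tensor([3])          # target style index (illustrative)
    for _ in range(10):                    # stand-in for a live motion-capture stream
        frame = torch.randn(1, 69)         # one pose frame
        stylized, hidden = model(frame, style_id, hidden)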

Related Material


@InProceedings{Tao_2022_CVPR,
    author    = {Tao, Tianxin and Zhan, Xiaohang and Chen, Zhongquan and van de Panne, Michiel},
    title     = {Style-ERD: Responsive and Coherent Online Motion Style Transfer},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {6593-6603}
}