Visual Rhythm and Beat

Abe Davis, Maneesh Agrawala; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2018, pp. 2532-2535

Abstract


We present a visual analogue for musical rhythm derived from an analysis of motion in video, and show that alignment of visual rhythm with its musical counterpart results in the appearance of dance. Central to our work is the concept of visual beats --- patterns of motion that can be shifted in time to control visual rhythm. By warping visual beats into alignment with musical beats, we can create or manipulate the appearance of dance in video. Using this approach we demonstrate a variety of retargeting applications that control musical synchronization of audio and video: we can change what song performers are dancing to, warp irregular motion into alignment with music so that it appears to be dancing, or search collections of video for moments of accidentally dance-like motion that can be used to synthesize musical performances. (This paper is a workshop preview of Davis et al., SIGGRAPH 2018.)
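To make the core alignment idea concrete, the sketch below illustrates one simple way to retime a video so that detected visual beats land on musical beats: a piecewise-linear time warp that maps each musical beat to its paired visual beat and stretches or compresses the video uniformly in between. This is a minimal illustration of the general technique, not the method from the paper; the function names and the assumption that visual and musical beats have already been matched one-to-one are ours.

```python
import numpy as np

def build_time_warp(visual_beats, music_beats):
    """Return a function mapping output (music) time to source (video) time.

    visual_beats, music_beats: 1-D sequences of beat times in seconds,
    assumed here to be already matched one-to-one (e.g. each musical beat
    paired with a corresponding visual beat). Hypothetical helper, not the
    paper's implementation.
    """
    visual_beats = np.asarray(visual_beats, dtype=float)
    music_beats = np.asarray(music_beats, dtype=float)
    assert len(visual_beats) == len(music_beats)

    def warp(t_out):
        # Piecewise-linear interpolation: at each musical beat the warp
        # samples the video exactly at the paired visual beat; between
        # beats the video is uniformly sped up or slowed down.
        return np.interp(t_out, music_beats, visual_beats)

    return warp

# Example: three visual beats retimed onto a steady 0.5 s musical grid.
warp = build_time_warp(visual_beats=[0.10, 0.72, 1.05],
                       music_beats=[0.50, 1.00, 1.50])
print(warp(1.25))  # which source time to sample for output time 1.25 s
```

Resampling the video along `warp` (and leaving the audio untouched) is the basic retargeting operation suggested by the abstract: the same machinery supports changing the song a performer dances to or pulling accidentally dance-like motion into time with music.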

Related Material


[bibtex]
@InProceedings{Davis_2018_CVPR_Workshops,
  author    = {Davis, Abe and Agrawala, Maneesh},
  title     = {Visual Rhythm and Beat},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2018},
  pages     = {2532-2535}
}