- [pdf] [supp]
Let the Beat Follow You - Creating Interactive Drum Sounds From Body Rhythm
Human body movements often contain rhythmic patterns. A camera-based system that captures these patterns and responds to them with rhythmic sounds or music as they happen could create a unique interactive experience. Creating such an experience is challenging and cannot be achieved with existing methods, since it requires translating the relevant visual cues into in-rhythm sounds in real time. In this work, we propose a novel learning-based system, called 'InteractiveBeat', which generates an evolving interactive soundtrack from a camera input that captures a person's movements. InteractiveBeat infers body skeleton keypoints and translates them into drum rhythms using a series of sequence models. It then applies a conditional drum generation network to produce polyphonic drum sounds from these rhythms. To guarantee real-time operation, the models are integrated into a time-evolving pipeline with update rules. To train and evaluate InteractiveBeat, in addition to training on a well-annotated, large-scale dance database, we collected a dataset of in-the-wild videos of people performing movements from various activities in correspondence with background music. We evaluate InteractiveBeat in two scenarios: i) a laboratory setting and ii) prerecorded in-the-wild videos of movements, and we develop a 'live' demo prototype of the system. Our evaluations show that the system generates interactive rhythmic drums with higher accuracy than existing methods and achieves a non-cumulative latency of 34 ms, allowing InteractiveBeat to stay synchronized with the video stream and react to movements in real time.
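The abstract describes a three-stage streaming pipeline: pose keypoint inference, keypoints-to-rhythm translation via sequence models, and rhythm-conditioned drum generation, all updated per frame so latency does not accumulate. The sketch below illustrates that data flow only; every function name, buffer size, and heuristic here is an assumption for illustration, since the paper's actual components are learned networks whose details are not given in the abstract.

```python
# Minimal sketch of the three-stage InteractiveBeat pipeline described in the
# abstract. All names, sizes, and heuristics are hypothetical stand-ins; the
# real system uses learned pose, sequence, and drum-generation models.

import time
from collections import deque

import numpy as np

NUM_KEYPOINTS = 17  # assumption: COCO-style body skeleton
WINDOW = 30         # assumption: ~1 s of pose history at 30 fps


def estimate_keypoints(frame: np.ndarray) -> np.ndarray:
    """Stage 1 stand-in: infer (x, y) body keypoints from a camera frame.

    The real system uses a learned pose estimator; random coordinates are
    returned here so the sketch runs end to end.
    """
    return np.random.rand(NUM_KEYPOINTS, 2)


def keypoints_to_rhythm(history: np.ndarray) -> np.ndarray:
    """Stage 2 stand-in: map a window of keypoints to a drum-onset rhythm.

    The paper uses a series of sequence models; this placeholder thresholds
    frame-to-frame motion energy to produce a binary onset pattern.
    """
    motion = np.linalg.norm(np.diff(history, axis=0), axis=(1, 2))
    return (motion > motion.mean()).astype(np.float32)


def rhythm_to_drums(rhythm: np.ndarray) -> np.ndarray:
    """Stage 3 stand-in: render polyphonic drum activations from the rhythm.

    The paper conditions a drum generation network on the rhythm; here each
    onset simply triggers a random subset of 9 drum instruments.
    """
    drums = np.zeros((len(rhythm), 9), dtype=np.float32)
    drums[rhythm > 0] = np.random.rand(int(rhythm.sum()), 9) > 0.5
    return drums


def run_stream(num_frames: int = 90) -> None:
    """Time-evolving pipeline: a per-frame update over a sliding pose buffer
    keeps latency bounded (non-cumulative) regardless of stream length."""
    pose_buffer: deque = deque(maxlen=WINDOW)
    latency_ms = 0.0
    for _ in range(num_frames):
        frame = np.zeros((480, 640, 3), dtype=np.uint8)  # camera-frame stub
        t0 = time.perf_counter()
        pose_buffer.append(estimate_keypoints(frame))
        if len(pose_buffer) == WINDOW:
            rhythm = keypoints_to_rhythm(np.stack(pose_buffer))
            drums = rhythm_to_drums(rhythm)  # would be sent to audio output
        latency_ms = (time.perf_counter() - t0) * 1e3
    print(f"last per-frame latency: {latency_ms:.2f} ms")


if __name__ == "__main__":
    run_stream()
```

The sliding window mirrors the abstract's "time-evolving pipeline with update rules": only the newest frame is processed each step, which is what makes a bounded per-frame latency (reported as 34 ms in the paper) possible.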