This paper presents Query-by-Dancing, a dance music retrieval system that enables a user to retrieve music using dance motions. When dancers search for music to dance to, they sometimes find it by browsing online dance videos in which the dancers use motions similar to their own. However, previous music retrieval systems could not support retrieval specialized for dancing because they did not accept dance motions as a query. Therefore, we developed our Query-by-Dancing
system, which uses a video of a dancer (the user) as the input query to search a database of dance videos. The query video is recorded with an ordinary RGB camera that captures no depth information, such as a smartphone camera. The poses and
motions in the query are then analyzed and used to retrieve dance videos with similar poses and motions. The system then enables the user to browse the music attached to the videos it retrieves so that the user can find a piece that is appropriate
for their dancing. An interesting problem here is that a naive search for the most similar videos based on motion alone sometimes returns results that do not match the intended dance genre. We solved this by using a novel measure, similar to tf-idf, that weights the importance of dance motions when retrieving videos. We conducted comparative experiments with four dance genres and confirmed that the system received average ratings of 3 or more evaluation points for three of the genres (waack, pop, break) and that our proposed method can deal with different dance genres.
S. Tsuchida, S. Fukayama, M. Goto: Query-by-Dancing: A Dance Music Retrieval System Based on Body-Motion Similarity, Proc. of the 25th International Conference on Multimedia Modeling (MMM 2019), pp. 251–263 (Dec. 2018).
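The tf-idf-like weighting mentioned in the abstract can be illustrated with a minimal sketch. The sketch below is an assumption, not the paper's actual method: it supposes each dance video has been quantized into a bag of discrete "motion words", so that motions shared by nearly all videos (e.g. a basic step) receive near-zero weight while genre-distinctive motions dominate the similarity score.

```python
import math
from collections import Counter

def tfidf_weights(docs):
    """Weight each hypothetical 'motion word' per video by tf-idf.

    docs: list of videos, each a list of hashable motion-word tokens.
    Returns one {word: weight} dict per video.
    """
    n = len(docs)
    df = Counter()                      # document frequency of each motion word
    for doc in docs:
        for w in set(doc):
            df[w] += 1
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        # tf * idf; a word appearing in every video gets log(n/n) = 0
        weights.append({w: (c / total) * math.log(n / df[w])
                        for w, c in tf.items()})
    return weights

def cosine(a, b):
    """Cosine similarity between two sparse {word: weight} vectors."""
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

With this weighting, a motion word present in every video contributes nothing to similarity, so retrieval is driven by the motions that distinguish one dance (or genre) from another, which is the intuition behind the paper's genre-mismatch fix.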