This is the official repository accompanying the paper "Feel The Music: Automatically Generating A Dance For An Input Song", by Purva Tendulkar, Abhishek Das, Aniruddha Kembhavi & Devi Parikh.
Full text available at: https://arxiv.org/abs/2006.11905
Create a new Python 3.7 virtual environment. Then clone the repository and install the requirements:

```
git clone https://github.com/purvaten/feel-the-music.git
cd feel-the-music
pip install -r requirements.txt
```
Generate a dance (example below):

```
python generate_dance.py \
    --songpath './audio_files/flutesong.mp3' \
    --songname 'flutesong' \
    --steps 100 \
    --type "action" \
    --visfolder './vis_num_steps_20/dancing_person_20'
```
A folder named `plots` will be created in the current directory containing frames of the dance and the final combined output as `<songname>.mp4`. The music and dance matrices will be saved as `music.png` and `dance.png` in the current directory.
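To render dances for several songs in one go, the same command can be wrapped in a small driver script. The sketch below is illustrative, not part of the repository; it assumes only the command-line flags shown above, and the commented-out second song entry is a hypothetical placeholder.

```python
# Sketch: batch-generate dances by invoking generate_dance.py once per song.
# Only the CLI flags documented above are assumed to exist.
import subprocess

songs = [
    # (songpath, songname, steps)
    ("./audio_files/flutesong.mp3", "flutesong", 100),
    # ("./audio_files/my_song.mp3", "my_song", 50),  # hypothetical entry
]

for songpath, songname, steps in songs:
    subprocess.run(
        [
            "python", "generate_dance.py",
            "--songpath", songpath,
            "--songname", songname,
            "--steps", str(steps),
            "--type", "action",
            "--visfolder", "./vis_num_steps_20/dancing_person_20",
        ],
        check=True,  # stop on the first failing song
    )
```

Note that each run writes `music.png` and `dance.png` to the current directory, so successive runs will overwrite them unless you move them between runs.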
NOTE: `<visfolder>` should contain `GRID_SIZE` images of the agent, smoothly transitioning into one another, with the files numbered as `1.png`, `2.png`, ..., `<GRID_SIZE>.png`. In our experiments, `GRID_SIZE=20`.
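Before launching a long run, it can help to confirm that a custom agent folder matches this naming scheme. The check below is a minimal sketch; `check_visfolder` is a hypothetical helper, and the default path and `GRID_SIZE` are simply the values used in the example above.

```python
# Sketch: verify that a visfolder holds the frames 1.png ... GRID_SIZE.png.
import os
import sys

def check_visfolder(visfolder, grid_size=20):
    """Exit with an error if any expected agent frame is missing."""
    missing = [
        f"{i}.png"
        for i in range(1, grid_size + 1)
        if not os.path.isfile(os.path.join(visfolder, f"{i}.png"))
    ]
    if missing:
        sys.exit(f"Missing frames in {visfolder}: {', '.join(missing)}")
    print(f"{visfolder} contains all {grid_size} frames.")

check_visfolder("./vis_num_steps_20/dancing_person_20")
```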
| Song | Type | Number of Steps | Agent | Video |
|---|---|---|---|---|
| flutesong | action | 100 | Stick figure | |
| Oe oe oe oa | action | 50 | Stretchy leaves | |
| It's the time to disco (karaoke) | action | 100 | Floating leaves | |
For more examples, see https://sites.google.com/view/dancing-agents.