A new version of PianoBART will be published at PianoBART2.
Article: Xiao Liang, Zijian Zhao, Weichao Zeng, Yutong He, Fupeng He, Yiyi Wang, Chengying Gao*, "PianoBART: Symbolic Piano Music Generation and Understanding with Large-Scale Pre-Training", ICME 2024
Some parts of our code are borrowed from microsoft/muzic (musicbert) on GitHub [1] and from wazenmai/MIDI-BERT, the official repository for the paper "MidiBERT-Piano: Large-scale Pre-training for Symbolic Music Understanding" [2].
The datasets utilized in our paper are as follows:
- Pretrain: POP1K7, ASAP, POP909, Pianist8, EMOPIA
- Generation: Maestro, GiantMidi
- Composer Classification: ASAP, Pianist8
- Emotion Classification: EMOPIA
- Velocity Prediction: GiantMidi
- Melody Prediction: POP909
You can generate the data using the repositories mentioned in [1] and [2]. The process of organizing the data is the same as described in [2]. Additionally, you can use the --datasets and --dataroot options to specify the name and root path of your dataset.
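For example, once a dataset has been processed, you can point the model at it as shown below. The dataset name comes from the lists above (check the exact spelling expected by the code), and the root path is a placeholder you should replace with the directory where you stored the processed data.
python main.py --datasets POP909 --dataroot <root path of POP909>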
We provide a conda-based environment. To install it, run the following command:
conda env create -f environment.yml
This environment has been tested and is working properly.
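After creating the environment, activate it before running any of the commands below. Judging from the example path used later in this README (~/anaconda3/envs/Bart), the environment name defined in environment.yml should be Bart:
conda activate Bart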
To run the model, please refer to the code at the bottom of main.py, shown as follows.
if __name__ == '__main__':
    pretrain()
    #finetune()
    #finetune_generation()
    #abalation()
You can uncomment the corresponding function to perform the desired task.
Uncomment "pretrain()" in main.py and run it.
python main.py
Note: Before running the code, please complete the following steps to patch it.
- Locate the file shapesimilarity.py, which is probably at your_env/lib/python{version}/site-packages/shapesimilarity/shapesimilarity.py.
- Apply the patch we provide by running the following command in the terminal:
patch <path of shapesimilarity.py> < patches/shapesimilarity.patch
For example, if you use the conda-based environment we provide and it is located at ~/anaconda3/envs/Bart, you can run the following command:
patch ~/anaconda3/envs/Bart/lib/python3.8/site-packages/shapesimilarity/shapesimilarity.py < patches/shapesimilarity.patch
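If you are unsure exactly where the file lives in your environment, one quick way to locate it (with the environment activated, and assuming the package layout shown above) is to ask Python for the module's location:
python -c "import shapesimilarity.shapesimilarity as m; print(m.__file__)"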
Uncomment finetune_generation() in main.py and run it.
python main.py
Some parameters you may need to change:
- --ckpt: the path of the model checkpoint you want to load.
- --datasets: the name of the dataset you want to use.
- --dataroot: the root path of the dataset you want to use.
- --cuda_devices: the GPU(s) you want to use.
- --class_num: the number of classes for the task.
For example, if you want to use the GiantMIDI1k dataset as we did, you can run:
python main.py --datasets GiantMIDI1k --dataroot Data/output_generate/GiantMIDI1k/gen_method --ckpt <model path> --cuda_devices <GPU ids>
For the following tasks, uncomment finetune() in main.py and run the corresponding command; a concrete example is given after the list.
python main.py --datasets <dataset name> --dataroot <root path of dataset> --class_num <class number> --task composer
python main.py --datasets <dataset name> --dataroot <root path of dataset> --class_num <class number> --task emotion
python main.py --datasets <dataset name> --dataroot <root path of dataset> --class_num <class number> --task velocity
python main.py --datasets <dataset name> --dataroot <root path of dataset> --class_num <class number> --task melody
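As a concrete illustration, an emotion classification run on EMOPIA (which has four emotion classes) could look like the command below. The dataset root and checkpoint path are placeholders, and the flags simply follow the parameter list above.
python main.py --datasets EMOPIA --dataroot <root path of EMOPIA> --class_num 4 --ckpt <model path> --task emotion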
In this section, you can provide an intro (a MIDI file) as input to PianoBART, and it will use its trained model to generate a new MIDI file inspired by the input, with a similar style and tone.
python demo.py --ckpt <model path> --input <input path> --output <output path>
Please note that there is a bug in demo.py that restricts usage to a single CUDA device or the CPU.
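If your machine has several GPUs, a common workaround (general to PyTorch programs, not specific to this repository) is to expose only one device to the process via the CUDA_VISIBLE_DEVICES environment variable, for example:
CUDA_VISIBLE_DEVICES=0 python demo.py --ckpt <model path> --input <input path> --output <output path>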
You can also use eval_generation.py to generate music in one go, with the output in numpy.array format. However, you must set the batch size to 1 and use only one GPU or the CPU.
python eval_generation.py --ckpt <model path> --dataset_path <dataset_path> --dataset_name <dataset_name> --output <output path>
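If you want to quickly inspect the generated array, and assuming the output file is written with numpy.save (this is an assumption; check the actual format produced by eval_generation.py), a one-liner like the following prints its shape:
python -c "import numpy as np; a = np.load('<output path>', allow_pickle=True); print(a.shape)"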
@INPROCEEDINGS{10688332,
author={Liang, Xiao and Zhao, Zijian and Zeng, Weichao and He, Yutong and He, Fupeng and Wang, Yiyi and Gao, Chengying},
booktitle={2024 IEEE International Conference on Multimedia and Expo (ICME)},
title={PianoBART: Symbolic Piano Music Generation and Understanding with Large-Scale Pre-Training},
year={2024},
volume={},
number={},
pages={1-6},
keywords={Codes;Semantics;Music;Transformers;Information leakage;Automatic Music Generation;Music Understanding;Symbolic Piano Music;Bidirectional and Auto-Regressive Transformers (BART)},
doi={10.1109/ICME57554.2024.10688332}}