SymphonyNet is an open-source project aiming to generate complex multi-track, multi-instrument music such as symphonies.
Our method is fully compatible with other types of music, such as pop, piano, and solo music.
Have fun with SymphonyNet!
We highly recommend running this project in a conda environment. Conda sets up an isolated environment where you can install libraries without affecting the overall system.
After you install miniconda (or the larger anaconda), create an environment:
```shell
conda create -n sym_net python=3.7
conda activate sym_net
git clone ...SymphonyNet.git
cd SymphonyNet
```
When you install the Python libraries, your system will also need a number of C++ libraries and build dependencies. You can run one of the following to install all the needed pieces (these commands require tools like Homebrew, which you should already have installed):
```shell
make setup_osx
```
or
```shell
make setup_linux
```
If you want to install the dependencies yourself, or if you already have them, you can try installing just the Python libraries:
```shell
pip install -r requirements.txt
```
Note: The reason for this convoluted process is that the `pytorch-fast-transformers` package must be built against an already-installed version of `torch`, but installing everything directly from the pip requirements does not finish installing `torch` before `pytorch-fast-transformers` needs it, so the build fails. Also note that building `pytorch-fast-transformers` takes a while.
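One way to sidestep this chicken-and-egg problem manually is to confirm that `torch` is importable before building `pytorch-fast-transformers`. A minimal pre-flight check (my own sketch, not part of the project's scripts):

```python
import importlib.util

def has_package(name: str) -> bool:
    """Return True if `name` is importable in the current environment."""
    return importlib.util.find_spec(name) is not None

# Hypothetical pre-flight check before building pytorch-fast-transformers:
if not has_package("torch"):
    print("torch not found: install it first, then pytorch-fast-transformers")
```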
Once everything installs correctly, you can run `make test_run`, and you should get an output file named `output.mid` in the project root directory. By default this runs on the CPU, not on GPUs.
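If you want to sanity-check the generated file without opening a MIDI player, every Standard MIDI File begins with the 4-byte header chunk ID `MThd`. An illustrative, stdlib-only check (not part of the project; the demo writes a stand-in file, but after `make test_run` you would pass `"output.mid"`):

```python
import os
import tempfile
from pathlib import Path

def looks_like_midi(path: str) -> bool:
    """A Standard MIDI File always begins with the chunk ID b'MThd'."""
    with open(path, "rb") as f:
        return f.read(4) == b"MThd"

# Demo on a stand-in file with a minimal MIDI header:
demo = os.path.join(tempfile.mkdtemp(), "demo.mid")
Path(demo).write_bytes(b"MThd\x00\x00\x00\x06\x00\x01\x00\x01\x01\xe0")
ok = looks_like_midi(demo)  # True
```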
Put your MIDI files into `data/midis/`, then run `python3 src/preprocess/preprocess_midi.py` from the project root.
Note: `preprocess_midi.py` processes all the MIDI files in parallel and converts them into a `raw_corpus.txt` file. In this file, each line of encoded text represents a full song.
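Because each non-empty line of `raw_corpus.txt` encodes one full song, counting songs (and spotting stray empty lines) is a one-liner. A small illustration (the sample contents below are made up; only the filename comes from the preprocessing step):

```python
import os
import tempfile

def count_songs(corpus_path: str) -> int:
    """Count non-empty lines; each non-empty line encodes one full song."""
    with open(corpus_path, encoding="utf-8") as f:
        return sum(1 for line in f if line.strip())

# Demo with a stand-in for raw_corpus.txt:
path = os.path.join(tempfile.mkdtemp(), "raw_corpus.txt")
with open(path, "w", encoding="utf-8") as f:
    f.write("song one tokens\nsong two tokens\n\n")

n = count_songs(path)  # 2
```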
- Run `python3 src/preprocess/get_bpe_data.py` if you want to train the model with Music BPE. More details about the fast BPE implementation can be found in Music BPE.
- Set `BPE=1` in the `config.sh` file.
Note: We only provide the `music_bpe_exec` binary for Linux. If you are using macOS or Windows, please re-compile `music_bpe_exec` here by following the instructions.
Run `python3 src/fairseq/make_data.py` to convert `raw_corpus.txt` into a binary file for fairseq and create the four vocabularies mentioned in the paper.
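As a rough illustration of what building a token vocabulary from such a corpus involves (a sketch of the general technique only; this is not the project's actual implementation or its four-vocabulary split):

```python
from collections import Counter

def build_vocab(lines, min_count=1):
    """Map each token seen at least `min_count` times to an integer id."""
    counts = Counter(tok for line in lines for tok in line.split())
    tokens = sorted(t for t, c in counts.items() if c >= min_count)
    return {tok: i for i, tok in enumerate(tokens)}

# Toy corpus: two "songs", one per line, with made-up note tokens.
corpus = ["C4 E4 G4", "C4 G4"]
vocab = build_vocab(corpus)  # {'C4': 0, 'E4': 1, 'G4': 2}
```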
Run `sh train_linear_chord.sh` to train your own model.
- Put your checkpoint file into `ckpt/`. You can download our pretrained model here.
- Run `python3 src/fairseq/gen_batch.py test.mid 5 0 1` to generate one symphony MIDI conditioned on the first 5 measures of `test.mid`, with no constraints on the chord progression.
- Or replace `test.mid` with your own prime MIDI and set how many measures of chords from the prime MIDI you want to keep.
- We provide a Google Colab notebook, `play_symphonynet.ipynb`, where you can follow the generation guide.
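From the description above, the first three positional arguments in `gen_batch.py test.mid 5 0 1` appear to be the prime MIDI path, the number of conditioning measures, and the number of chord measures to keep. A hypothetical argparse sketch of that interface (the argument names, and the meaning of the fourth argument, are my guesses, not the script's actual definitions):

```python
import argparse

# Hypothetical parser mirroring `gen_batch.py test.mid 5 0 1`.
parser = argparse.ArgumentParser(description="Sketch of the generation CLI")
parser.add_argument("prime_midi", help="prime MIDI file to condition on")
parser.add_argument("prime_measures", type=int,
                    help="measures of the prime MIDI used as conditioning")
parser.add_argument("chord_measures", type=int,
                    help="measures of chords kept from the prime MIDI (0 = none)")
parser.add_argument("fourth_arg", type=int,
                    help="fourth positional argument in the example command "
                         "(meaning not documented in this README)")

args = parser.parse_args(["test.mid", "5", "0", "1"])
```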
SymphonyNet is released under the MIT license.