Music FaderNets

Music FaderNets is a controllable MIDI generation framework that models high-level musical qualities such as arousal, an emotional attribute. Drawing inspiration from the sliding faders on a mixing console, the model offers intuitive, continuous control over these qualities: given an input MIDI file, Music FaderNets can produce multiple variations at different levels of arousal, adjusted according to the fader position.

Year: 2020

Website: https://music-fadernets.github.io/

Input types: MIDI

Output types: MIDI

Output length:

AI Technique: VAE (Gaussian Mixture VAE)

Dataset: VGMIDI, Yamaha Piano-e-Competition

License type: MIT

Real time:

Free: Yes

Open source: Yes

Checkpoints: Yes

Fine-tune:

Train from scratch:

#MIDI #open-source #free #checkpoints

Guide to using the model

Code accompanying the ISMIR 2020 paper "Music FaderNets: Controllable Music Generation Based On High-Level Features via Low-Level Feature Modelling" can be found on GitHub: https://github.com/gudgud96/music-fader-nets
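
Conceptually, each fader corresponds to a dimension of the model's VAE latent space that is regularized to track a low-level feature (such as rhythm or note density), which in turn drives the high-level quality. The minimal PyTorch sketch below illustrates that idea only; encoder, decoder, and slide_fader are hypothetical stand-ins, not the actual API of the repository above:

    import torch

    def slide_fader(encoder, decoder, midi_tokens, fader_dim, fader_value):
        """Re-decode a MIDI sequence with one latent "fader" moved.

        encoder/decoder stand in for a trained Music FaderNets model
        (hypothetical interface, not the repo's real API). fader_dim is
        the latent dimension regularized to track a low-level feature;
        fader_value is the new slider position.
        """
        with torch.no_grad():
            z = encoder(midi_tokens)       # latent code of the input piece
            z[:, fader_dim] = fader_value  # move the fader
            return decoder(z)              # decode a variation at the new level

    # Sweeping the fader yields variations at increasing levels, e.g.:
    # variations = [slide_fader(enc, dec, x, 0, v) for v in (-2.0, 0.0, 2.0)]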
