# Interactive music co-creation with PyTorch and TensorFlow.js

This tutorial is a start-to-finish demonstration (click here for result) of building an interactive music co-creation system in two parts:

1. Training a generative model of music in Python (via PyTorch).
2. Deploying it in JavaScript for interaction (via TensorFlow.js).

This demonstration was prepared by Chris Donahue as part of an ISMIR 2021 tutorial on Designing generative models for interactive co-creation, co-organized by Anna Huang and Jon Gillick.

The example generative model we will train and deploy is Piano Genie (Donahue et al.). Piano Genie allows anyone to improvise on the piano by mapping performances on a miniature 8-button keyboard to realistic performances on a full 88-key piano in real time. We train Piano Genie by autoencoding expert piano performances: an encoder maps 88-key piano performances into 8-button "button performances", and a decoder attempts to reconstruct the piano performance from the button performance. At interaction time, we replace the encoder with a user performing on the buttons. At a low level, the decoder is an LSTM that operates on symbolic music data (i.e., MIDI) and is lightweight enough for real-time performance on mobile CPUs.

## Part 1: Training the model in PyTorch

This part of the tutorial involves training a music generative model (Piano Genie) from scratch in PyTorch, in the form of a self-contained Google Colab notebook. The instructions for this part are embedded in the Colab, and the model takes about an hour to train on Colab's free GPUs. The outputs of this part are: (1) a model checkpoint, and (2) serialized inputs and outputs for a test case, which we will use to check the correctness of our JavaScript port in the next part.

## Part 2: Deploying the model in JavaScript

This part of the tutorial involves porting the trained generative model from PyTorch to TensorFlow.js and hooking the model up to a simple UI so that users can interact with it. The static files for this demo are located in the part-2-js-interaction directory. The final result is this simple web demo. We use JavaScript as the target language for interaction because, unlike Python, it allows for straightforward prototyping and sharing of interactive UIs.
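To make the autoencoding idea concrete, here is a minimal PyTorch sketch of a Piano Genie-style model. This is an illustrative assumption rather than the tutorial's actual implementation: the class and variable names are hypothetical, and the real model includes additional inputs and training losses. The key idea shown is the bottleneck: the encoder's continuous output is rounded to one of 8 button values, with a straight-through estimator so gradients still flow to the encoder.

```python
import torch
import torch.nn as nn

NUM_KEYS = 88     # full piano keyboard (decoder vocabulary)
NUM_BUTTONS = 8   # miniature controller (bottleneck size)

class PianoGenieSketch(nn.Module):
    """Hypothetical sketch of a Piano Genie-style autoencoder."""

    def __init__(self, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(NUM_KEYS, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.enc_out = nn.Linear(hidden, 1)   # continuous "button" signal
        self.dec_in = nn.Linear(1, hidden)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.dec_out = nn.Linear(hidden, NUM_KEYS)

    def quantize(self, z):
        # Squash to (-1, 1), round to one of 8 evenly spaced button values,
        # and use a straight-through estimator for the gradient.
        z = torch.tanh(z)
        buttons = torch.round((z + 1) / 2 * (NUM_BUTTONS - 1))  # ids in 0..7
        z_q = buttons / (NUM_BUTTONS - 1) * 2 - 1
        return z + (z_q - z).detach(), buttons.long().squeeze(-1)

    def forward(self, keys):
        # keys: (batch, time) int64 piano key indices in [0, NUM_KEYS)
        h, _ = self.encoder(self.embed(keys))
        z, buttons = self.quantize(self.enc_out(h))
        d, _ = self.decoder(self.dec_in(z))
        return self.dec_out(d), buttons  # logits over 88 keys, button ids

model = PianoGenieSketch()
keys = torch.randint(0, NUM_KEYS, (2, 16))
logits, buttons = model(keys)
# logits: (2, 16, 88); buttons: (2, 16) with values in [0, 8)
```

At interaction time, the encoder branch is discarded: the user's button presses are mapped to the same 8 quantized values and fed directly into `dec_in`, so only the lightweight LSTM decoder needs to run in the browser.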