This article is a technical blog post compiled by the Lei Feng subtitle group, translated from the original "Make Music and Art Using Machine Learning" by the Magenta team.
Translation: Wang Xingyu, Wang Fei, Wei Honggui | Editing: Fan Jiang
A major goal of the Magenta project is to demonstrate that machine learning can be used to increase everyone's creative potential.
The demonstrations and applications on this page come from contributors both inside and outside Google. They include fun toys, creative applications, research notebooks, and professional tools that can help many people.
This section lists browser-based applications, many of which are built with TensorFlow.js and use WebGL-accelerated inference.
This is an interactive demo built by Google Creative Lab on top of MusicVAE, using the MusicVAE.js API. You can use it to generate a two-dimensional palette of drum beats and draw paths through the latent space to create continuously evolving rhythms. The four corners can be edited manually, replaced with presets, or re-sampled from the latent space to regenerate the palette.
Catherine McCurry (GitHub: Currycurry)
Zach Schwartz (GitHub: Zischwartz)
Harold Cooper (GitHub: Hrldcpr)
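The two-dimensional palette described above can be pictured as bilinear blending of four corner latent vectors; every cell of the grid is then decoded back into a drum beat. This is a minimal sketch of that blending in plain JavaScript (in MusicVAE.js itself the library does this for you — `interpolate()` can take corner sequences and return the interpolated grid — so the function names below are purely illustrative):

```javascript
// Blend four corner latent vectors a, b, c, d (top-left, top-right,
// bottom-left, bottom-right) at grid position (x, y), with x and y in [0, 1].
function blendCorners(a, b, c, d, x, y) {
  return a.map((_, i) =>
    (1 - x) * (1 - y) * a[i] +
    x * (1 - y) * b[i] +
    (1 - x) * y * c[i] +
    x * y * d[i]);
}

// Build an n x n palette of blended latent vectors. Each entry would be
// handed to the MusicVAE decoder to produce one beat of the palette.
function latentPalette(a, b, c, d, n) {
  const grid = [];
  for (let row = 0; row < n; row++) {
    const r = [];
    for (let col = 0; col < n; col++) {
      r.push(blendCorners(a, b, c, d, col / (n - 1), row / (n - 1)));
    }
    grid.push(r);
  }
  return grid;
}
```

Drawing a path through the palette then amounts to walking a sequence of (x, y) cells and playing each decoded beat in turn.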
This is an interactive demo from Google's Pie Shop, built on MusicVAE using the MusicVAE.js API. Latent Loops lets you select tunes on a matrix of different scales and generate looping melodies, which can then be used to build a longer tune. Musicians can use this interface to create a complete song and easily move it into their own digital audio workstation.
This demo lets you draw together with SketchRNN.
An experiment in interacting with AI, built on NSynth in cooperation with Google Creative Lab. You can choose two instruments and combine them into a new sound.
An interactive AI experiment based on MelodyRNN, built in cooperation with Google Creative Lab, which lets you make music through machine learning. A neural network was trained on many MIDI examples so that it could learn musical concepts, building a map of notes and timings. You play a few notes and see how it responds.
Colaboratory is a Google research project for popularizing machine learning education and research. The environment is a Jupyter notebook that runs entirely in the cloud and requires no local setup.
We provide Colab notebooks for our models so that you can interact with them for free on a hosted Google Cloud instance.
8. E-Z NSynth [Demo]
This Colab notebook lets you upload your own sound files for free and reconstruct them with the NSynth model.
9. MusicVAE [Demo]
MusicVAE learns a latent space of musical scores. This Colab notebook lets you randomly sample from the prior and interpolate between existing sequences using several pre-trained MusicVAE models. You can also compare the results with the baseline models described in the MusicVAE paper, "A Hierarchical Latent Vector Model for Learning Long-Term Structure in Music."
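Interpolating between two sequences means walking between their latent vectors and decoding each intermediate point. The MusicVAE paper uses spherical interpolation (slerp) rather than a straight line, so intermediate points stay in the high-probability region of the prior. A minimal sketch of slerp, with the model's encode/decode steps left out:

```javascript
// Spherical linear interpolation between two latent vectors z0 and z1.
// t = 0 returns z0, t = 1 returns z1; intermediate t values follow the
// arc between them instead of the straight chord.
function dot(a, b) { return a.reduce((s, v, i) => s + v * b[i], 0); }
function norm(a) { return Math.sqrt(dot(a, a)); }

function slerp(z0, z1, t) {
  const cos = Math.min(1, Math.max(-1, dot(z0, z1) / (norm(z0) * norm(z1))));
  const omega = Math.acos(cos);          // angle between the two vectors
  if (omega < 1e-8) {                    // nearly parallel: fall back to lerp
    return z0.map((v, i) => (1 - t) * v + t * z1[i]);
  }
  const s = Math.sin(omega);
  return z0.map((v, i) =>
    (Math.sin((1 - t) * omega) / s) * v +
    (Math.sin(t * omega) / s) * z1[i]);
}
```

In the notebook itself you simply ask the pre-trained model to interpolate; the slerp above is what that request does under the hood, conceptually.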
10. Onsets and Frames [Demo]
Onsets and Frames is an automatic piano transcription model. This Colab notebook demonstrates running the model on user-supplied recordings.
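The model's name comes from its two output heads: an onset detector that says where a note begins, and a frame detector that says whether the note is still sounding. At inference time a note is emitted only where the onset head fires, and it is sustained for as long as the frame head stays active. A simplified, single-pitch sketch of that decoding rule (thresholds and shapes are illustrative, not the model's actual post-processing code):

```javascript
// Decode per-frame onset and frame probabilities for one pitch into notes.
// A note starts only at an onset and ends when the frame head releases.
function decodeNotes(onsetProbs, frameProbs, threshold = 0.5) {
  const notes = [];
  let start = -1;                        // -1 means no note is sounding
  for (let t = 0; t < frameProbs.length; t++) {
    const active = frameProbs[t] >= threshold;
    const onset = onsetProbs[t] >= threshold;
    if (start < 0 && onset && active) {
      start = t;                         // note begins at an onset
    } else if (start >= 0 && !active) {
      notes.push({ start, end: t });     // frame head released: note ends
      start = -1;
    }
  }
  if (start >= 0) notes.push({ start, end: frameProbs.length });
  return notes;
}
```

Requiring an onset prevents the transcriber from re-triggering a note on every frame of a long sustain, which is a large part of why the model transcribes piano well.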
11. Latent Constraints [Demo]
Run the experimental code for "Latent Constraints: Learning to Generate Conditionally from Unconditional Generative Models."
Native applications run on your local machine and usually require installing additional software, but they are sometimes better suited to professional workflows.
Community contributions were created without Google's involvement, using Magenta models and libraries. If you have a demo that you think belongs here, please share it through our discussion group.
An experimental electronic drum machine powered by the TensorFlow.js implementations of Magenta's DrumsRNN and MusicVAE. To use it, define a seed pattern on the left and press the "Generate" button; DrumsRNN will dream up a continuation of your seed pattern. The "Density" slider uses MusicVAE to add or remove hits from the pattern.
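The seed pattern the drum machine sends to DrumsRNN is just a quantized NoteSequence built from the step-sequencer grid. A sketch of that conversion, using the General MIDI percussion pitches; with `@magenta/music` loaded, the continuation itself would be a call along the lines of `rnn.continueSequence(seed, 16, 1.0)` (hedged — exact checkpoint and arguments depend on the setup):

```javascript
// General MIDI percussion pitches for the three illustrative drum lanes.
const DRUMS = { kick: 36, snare: 38, hihat: 42 };

// Turn a step-sequencer grid ({ lane: [0/1, ...] }) into the quantized
// NoteSequence shape that Magenta.js sequence models consume.
function gridToNoteSequence(grid, stepsPerQuarter = 4) {
  const notes = [];
  let totalSteps = 0;
  for (const [name, steps] of Object.entries(grid)) {
    totalSteps = Math.max(totalSteps, steps.length);
    steps.forEach((hit, i) => {
      if (hit) notes.push({
        pitch: DRUMS[name],
        isDrum: true,
        quantizedStartStep: i,
        quantizedEndStep: i + 1,
      });
    });
  }
  return {
    notes,
    totalQuantizedSteps: totalSteps,
    quantizationInfo: { stepsPerQuarter },
  };
}
```
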
Hold a note or chord and a deep neural network will play arpeggio patterns around it.
Play and hold a melody or chord and let a deep neural network continue it for you.
A deep neural network that makes melodies in your browser.
The melodies are generated by ImprovRNN, conditioned on chord progressions generated with Markov chains.
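A Markov-chain chord generator only needs a table of transition probabilities: each new chord is drawn based on the previous one, and ImprovRNN then improvises a melody over the resulting progression. A minimal sketch (the transition table and chord vocabulary here are illustrative, not the demo's actual data):

```javascript
// First-order Markov chain over chord symbols. Each row gives the
// probabilities of the next chord given the current one (rows sum to 1).
const TRANSITIONS = {
  'C':  { 'F': 0.5, 'G': 0.3, 'Am': 0.2 },
  'F':  { 'G': 0.6, 'C': 0.4 },
  'G':  { 'C': 0.7, 'Am': 0.3 },
  'Am': { 'F': 0.6, 'G': 0.4 },
};

// Sample the next chord; `rand` is injectable for deterministic tests.
function nextChord(current, rand = Math.random) {
  let r = rand();
  for (const [chord, p] of Object.entries(TRANSITIONS[current])) {
    if ((r -= p) <= 0) return chord;
  }
  return Object.keys(TRANSITIONS[current])[0];  // guard against rounding
}

// Generate a progression of the given length starting from `start`.
function progression(start, length, rand = Math.random) {
  const chords = [start];
  while (chords.length < length) {
    chords.push(nextChord(chords[chords.length - 1], rand));
  }
  return chords;
}
```

Each chord in the progression would then be passed to ImprovRNN as the conditioning signal for the corresponding bar of melody.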
mSynth is the first-place app from the 2017 Outside Hacks, the official 24-hour music hackathon of the San Francisco Outside Lands festival. The team developed an interactive experience for concert audiences: festival-goers can control Magenta's NSynth in real time by tilting their phones.