This session is for anyone who would like to explore music, visuals, and creative coding for the web. We’ll demonstrate the types of data we can get from digital signal processing, using interactive sketches built with p5.js and the p5.sound library, which builds upon the Web Audio API. We’ll explore various methods to map this data onto meaningful visuals that enhance our experience of music.
## 1. Amplitude

### Hello Amplitude | Source Code
## 2. Frequency: FFT (Fast Fourier Transform)

### FFT Spectrum Drag, Drop ‘n Analyze | Source Code
Scaling the FFT:

### FFT Scale by Neighbors
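One general way to “scale by neighbors” is to smooth each FFT bin by averaging it with the bins around it. Below is a sketch of that idea only, not necessarily the demo’s exact method; the function name and the radius parameter are assumptions.

```javascript
// spectrum: array of 0-255 readings from fft.analyze()
// radius: how many neighboring bins to average on each side (assumed parameter)
function smoothByNeighbors(spectrum, radius) {
  let smoothed = [];
  for (let i = 0; i < spectrum.length; i++) {
    let sum = 0;
    let count = 0;
    for (let j = i - radius; j <= i + radius; j++) {
      if (j >= 0 && j < spectrum.length) {
        sum += spectrum[j];
        count++;
      }
    }
    smoothed[i] = sum / count; // each bin becomes the average of its neighborhood
  }
  return smoothed;
}
```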
## 3. Pitch

Autocorrelation in the time domain to detect the fundamental frequency.
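A simplified sketch of the idea (not the repo’s exact code): correlate a time-domain buffer, e.g. from fft.waveform(), with delayed copies of itself, then read the fundamental frequency off the lag with the strongest repetition.

```javascript
// buffer: array of samples between -1.0 and 1.0 (e.g. from fft.waveform())
// sampleRate: e.g. getAudioContext().sampleRate
function autoCorrelate(buffer, sampleRate) {
  let n = buffer.length;
  let correlations = new Array(n).fill(0);
  // correlate the signal with a copy of itself delayed by each possible lag
  for (let lag = 0; lag < n; lag++) {
    for (let i = 0; i + lag < n; i++) {
      correlations[lag] += buffer[i] * buffer[i + lag];
    }
  }
  // skip past the trivial peak at lag 0 (the signal matches itself perfectly)
  let d = 1;
  while (d < n - 1 && correlations[d] > correlations[d + 1]) d++;
  // the strongest correlation after that dip marks the period, in samples
  let bestLag = d;
  for (let lag = d; lag < n; lag++) {
    if (correlations[lag] > correlations[bestLag]) bestLag = lag;
  }
  return bestLag > 0 ? sampleRate / bestLag : -1; // fundamental frequency in Hz
}
```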
## 4. Musical Timing

Sync music to timestamped lyrics.

### Display Lyrics | Source Code
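A minimal sketch of the timing idea, assuming a hypothetical file path and a hypothetical lyrics array of { time, text } entries with timestamps in seconds; the demo’s own data format may differ.

```javascript
let song;
// hypothetical timestamped lyrics (times in seconds); real text would come
// from a lyrics file (e.g. JSON) alongside the track
let lyrics = [
  { time: 0.0, text: '...' },
  { time: 4.5, text: '...' },
  { time: 9.2, text: '...' }
];

function preload() {
  song = loadSound('assets/song.mp3'); // hypothetical path
}

function setup() {
  createCanvas(400, 200);
  textAlign(CENTER, CENTER);
  song.play();
}

function draw() {
  background(0);
  let t = song.currentTime(); // playback position in seconds
  // show the most recent lyric line whose timestamp has passed
  let current = '';
  for (let line of lyrics) {
    if (line.time <= t) current = line.text;
  }
  fill(255);
  text(current, width / 2, height / 2);
}
```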
Visualizations with the Spotify Audio Analysis API (formerly Echo Nest API):

### Pre-rendered Analysis (via Echo Nest - Beat + Pitch) | Source Code
Participants may use whatever tools they wish, but the demos in this repo use the following libraries:
p5.js is a JavaScript library that starts with the original goal of Processing, to make coding accessible for artists, designers, educators, and beginners, and reinterprets this for today’s web.
* p5js.org
* /learn
* /reference
* github
p5.sound.js is an addon library that brings the Processing approach to the Web Audio API.
* p5.sound documentation
* github
p5.dom.js is an addon library that helps us manipulate the DOM.
* p5.dom documentation
p5.AudioIn - microphone! documentation | source code
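A minimal sketch using the microphone as input, assuming p5.js and p5.sound are loaded on the page (the circle-size mapping is arbitrary, not taken from the demos):

```javascript
let mic;

function setup() {
  createCanvas(400, 400);
  mic = new p5.AudioIn();
  mic.start(); // asks the browser for microphone access
}

function draw() {
  background(0);
  // getLevel() returns the current RMS amplitude between 0.0 and 1.0
  let level = mic.getLevel();
  fill(255);
  ellipse(width / 2, height / 2, map(level, 0, 1, 10, width));
}
```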
p5.SoundFile - load and play .mp3 / .ogg files. documentation | source code
- loadSound() creates a SoundFile using a Web Audio API buffer. Use it during preload(), with a callback, or with drag and drop.
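For example, a minimal sketch assuming a hypothetical file path 'assets/song.mp3':

```javascript
let song;

function preload() {
  // loadSound() during preload() makes sure the buffer is ready before setup() runs
  song = loadSound('assets/song.mp3');
}

function setup() {
  createCanvas(400, 400);
  song.play();
}

// or, load with a callback instead of preload():
// loadSound('assets/song.mp3', (file) => file.play());
```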
p5.PeakDetect - detect beats and/or onsets within a frequency range. documentation | source code
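A minimal sketch of the idea, assuming a sound source is already playing through p5.sound; PeakDetect reads from an FFT that you update every frame, and the frequency range and threshold below are arbitrary choices:

```javascript
let fft, peakDetect;

function setup() {
  createCanvas(400, 400);
  fft = new p5.FFT();
  // look for peaks between 20Hz and 200Hz (roughly the kick-drum range)
  peakDetect = new p5.PeakDetect(20, 200, 0.35);
}

function draw() {
  fft.analyze();          // PeakDetect needs a fresh analysis each frame
  peakDetect.update(fft); // sets peakDetect.isDetected for this frame
  background(peakDetect.isDetected ? 255 : 0); // flash on a detected peak
}
```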
p5.Amplitude - Analyze volume (amplitude). documentation | source code
- .getLevel() returns a Root Mean Square (RMS) amplitude reading between 0.0 and 1.0, usually peaking around 0.5.
- .smooth() averages each reading with previous frames to smooth the output over time.
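A minimal “Hello Amplitude”-style sketch, assuming a SoundFile or other source is playing through p5.sound (the mapping to circle size is arbitrary):

```javascript
let amplitude;

function setup() {
  createCanvas(400, 400);
  amplitude = new p5.Amplitude();
  amplitude.smooth(0.9); // average with previous readings; closer to 1.0 = smoother
}

function draw() {
  background(0);
  let level = amplitude.getLevel();         // RMS value between 0.0 and 1.0
  let size = map(level, 0, 0.5, 10, width); // levels usually peak around 0.5
  fill(255);
  ellipse(width / 2, height / 2, size);
}
```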
p5.FFT - Analyze amplitude over time / frequency. documentation | source code
- .analyze() returns an array of amplitude readings from 0 to 255 in the frequency domain.
- .waveform() returns an array of amplitude readings from -1.0 to 1.0 in the time domain. demo | source
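A minimal sketch combining both, assuming a sound source is playing through p5.sound: the spectrum is drawn in the lower half of the canvas and the waveform in the upper half.

```javascript
let fft;

function setup() {
  createCanvas(512, 400);
  fft = new p5.FFT(0.8, 1024); // smoothing, number of frequency bins
}

function draw() {
  background(0);
  noFill();
  stroke(255);

  // frequency domain: readings from 0-255, one per bin
  let spectrum = fft.analyze();
  beginShape();
  for (let i = 0; i < spectrum.length; i++) {
    vertex(map(i, 0, spectrum.length, 0, width),
           map(spectrum[i], 0, 255, height, height / 2));
  }
  endShape();

  // time domain: readings from -1.0 to 1.0
  let waveform = fft.waveform();
  beginShape();
  for (let i = 0; i < waveform.length; i++) {
    vertex(map(i, 0, waveform.length, 0, width),
           map(waveform[i], -1, 1, 0, height / 2));
  }
  endShape();
}
```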
Music included in the demos/repo:
- Yacht - Summer Song (Instrumental) - See Mystery Lights Instrumentals - Creative Commons BY-NC-SA
- Broke For Free - As Colorful As Ever - Layers - Creative Commons BY-NC
- Alaclair Ensemble - Twit Journalist - This Is America - Creative Commons BY-SA
- Peter Johnston - La ere gymnopedie (Erik Satie) - Best of Breitband Vol1
- Inara George - Q - Sargent Singles Vol 1 - Creative Commons BY-NC-SA
- For more Creative Commons resources, check out the Free Music Archive’s Guide to Online Audio Resources
Notation
* Optical Poem, Oskar Fischinger’s 1938 visualization of Franz Liszt’s “2nd Hungarian Rhapsody”
* Notations21
* Piano Phase (Alex Chen)
* George & Jonathan
* dennis.video, generative video by George ^
* Stephen Malinowski’s Music Animation Machine
* Artikulation (Rainer Wehinger / Gyorgy Ligeti)
* animatednotation.com
* John Whitney
* Mark Fell - Skydancer

Interactive
* computer.jazz (Yotam Mann, Sarah Rothberg)
* Patatap (Jono Brandel)
* jeffro / xtal
* Incredibox

Audio
* Cymatics
* Golan Levin, Zach Lieberman, Jaap Blonk, Joan La Barbara (w/ autocorrelation)
* Oscillator Art (TK Broderick)
* Music Makes You Travel (makio135)
* Ripple
* Ryoji Ikeda

Data Sonification
* Listening to the Data
* Listen to Wikipedia
* Metadata - Echo Nest’s Map of Musical Styles
* Making Music with Tennis Data
* Sonifying the Flocking Algorithm (@b2renger)

Musical Form
* The Shape of Song (Martin Wattenberg)
* Infinite Jukebox (Paul Lamere / Echo Nest)

Lyrics
* Solar (Robert Hodgin)
* Lyrical Particles (Salem Al-Mansoori)