In this tutorial, you will learn how we at ARKx crafted the audio-reactive visuals for Coala Music’s website. We’ll walk through the concepts and techniques used to synchronize audio frequencies and tempo, creating a dynamic visualizer with procedural particle animations.
Getting Started
We will initialize our Three.js scene only after the user interacts with the page; this way, the audio can start playing immediately without being blocked by the browsers' autoplay policies.
export default class App {
  constructor() {
    this.onClickBinder = () => this.init()
    document.addEventListener('click', this.onClickBinder)
  }

  init() {
    document.removeEventListener('click', this.onClickBinder)

    // Basic Three.js scene
    this.renderer = new THREE.WebGLRenderer()
    this.camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 10000)
    this.scene = new THREE.Scene()
  }
}
Analyzing Audio Data
Next, we initialize our Audio and BPM Managers. They are responsible for loading the audio, analyzing it, and synchronizing it with the visual elements.
async createManagers() {
  App.audioManager = new AudioManager()
  await App.audioManager.loadAudioBuffer()

  App.bpmManager = new BPMManager()
  App.bpmManager.addEventListener('beat', () => {
    this.particles.onBPMBeat()
  })
  await App.bpmManager.detectBPM(App.audioManager.audio.buffer)
}
The AudioManager class then loads the audio from a URL (we are using a Spotify preview URL) and analyzes it, breaking the audio signal down into frequency bins in real time.
const audioLoader = new THREE.AudioLoader();
audioLoader.load(this.song.url, buffer => {
  this.audio.setBuffer(buffer);
})
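To actually read the spectrum every frame, the audio is wrapped in an analyser. Here is a minimal sketch of that part, assuming a THREE.AudioAnalyser and the property names shown below (they are illustrative, not necessarily the ones used in the original class):

// Inside AudioManager (property names are illustrative)
this.listener = new THREE.AudioListener()
this.audio = new THREE.Audio(this.listener)

// Once the buffer is set: an fftSize of 2048 gives 1024 frequency bins
this.audioAnalyser = new THREE.AudioAnalyser(this.audio, 2048)
this.bufferLength = this.audioAnalyser.data.length
this.audioContext = this.audio.context

// Called every frame: one amplitude value (0-255) per frequency bin
this.frequencyArray = this.audioAnalyser.getFrequencyData()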
Frequency Data
We have to split the frequency spectrum into low, mid, and high bands and calculate the average amplitude of each one.
To segment the bands, we define start and end frequencies (e.g., the low band starts at the lowFrequency value and ends where the mid band begins, at midFrequency). To turn those frequency bounds (in Hz) into indices of the analyser's data array, we multiply the frequency by the buffer length and divide by the sample rate. We then average the amplitudes of the bins in that range and normalize the result to a 0-1 scale.
this.lowFrequency = 10; // Lower bound of the low band, in Hz
this.frequencyArray = this.audioAnalyser.getFrequencyData();

// Convert the band's frequency bounds (Hz) into indices of the frequency array
const lowFreqRangeStart = Math.floor((this.lowFrequency * this.bufferLength) / this.audioContext.sampleRate)
const lowFreqRangeEnd = Math.floor((this.midFrequency * this.bufferLength) / this.audioContext.sampleRate)

// Average the amplitudes in that range and normalize to 0-1
const lowAvg = this.normalizeValue(this.calculateAverage(this.frequencyArray, lowFreqRangeStart, lowFreqRangeEnd));

// The same applies to the mid and high bands
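The calculateAverage and normalizeValue helpers referenced above aren't shown; a minimal sketch of what they could look like, assuming the analyser returns 8-bit values (0-255):

// Average the amplitudes between two bin indices
calculateAverage(frequencies, start, end) {
  let sum = 0
  for (let i = start; i < end; i++) {
    sum += frequencies[i]
  }
  return sum / (end - start)
}

// Map an 8-bit amplitude (0-255) to the 0-1 range
normalizeValue(value) {
  return value / 255
}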
Detecting Tempo
The amplitude of the frequencies isn't enough to align the music beat with the visual elements. Detecting the BPM (beats per minute) is essential to make the elements react in sync with the pulse of the music. In Coala's project, we feature many songs from the label's artists, and we don't know the tempo of each track in advance. Therefore, we detect the BPM asynchronously using the excellent web-audio-beat-detector module, simply by passing it the audioBuffer.
import { guess } from 'web-audio-beat-detector'

const { bpm } = await guess(audioBuffer);
Dispatching the Signals
After detecting the BPM, we can dispatch the event signal using setInterval.
this.interval = 60000 / bpm; // Convert BPM to milliseconds between beats
this.intervalId = setInterval(() => {
  this.dispatchEvent({ type: 'beat' })
}, this.interval);
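On the other side, the particles react in onBPMBeat(), which we registered as the 'beat' listener earlier. As a rough sketch of the kind of reaction you could trigger there (the mesh reference and the values are illustrative, not the original implementation):

onBPMBeat() {
  // React to roughly half of the beats with a quick rotation pulse, one beat long
  if (Math.random() < 0.5 && this.pointsMesh) {
    gsap.to(this.pointsMesh.rotation, {
      y: this.pointsMesh.rotation.y + Math.PI * 0.25,
      duration: App.bpmManager.interval / 1000,
      ease: 'power2.out',
    })
  }
}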
Procedural Reactive Particles (The fun part 😎)
Now, we're going to create our dynamic particles that will soon be responsive to audio signals. Let's start with two new functions that create basic geometries (Box and Cylinder) with random segments and properties; this approach results in a unique structure each time.
Next, we'll add this geometry to a THREE.Points object with a simple ShaderMaterial.
// Randomized segment counts (example ranges) give each mesh a unique structure
const widthSeg = THREE.MathUtils.randInt(5, 20)
const heightSeg = THREE.MathUtils.randInt(1, 40)
const depthSeg = THREE.MathUtils.randInt(5, 80)
const geometry = new THREE.BoxGeometry(1, 1, 1, widthSeg, heightSeg, depthSeg)

const material = new THREE.ShaderMaterial({
  side: THREE.DoubleSide,
  vertexShader: vertex,
  fragmentShader: fragment,
  transparent: true,
  uniforms: {
    size: { value: 2 },
  },
})

const pointsMesh = new THREE.Points(geometry, material)
Now, we can begin creating our meshes with random attributes at a specified interval.
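A minimal sketch of that creation loop (the createBoxMesh / createCylinderMesh helpers stand in for the geometry code above, and the timing is an arbitrary example):

createMeshesAtInterval() {
  this.meshInterval = setInterval(() => {
    // Hypothetical helpers that build a THREE.Points mesh like in the snippet above
    const mesh = Math.random() > 0.5 ? this.createBoxMesh() : this.createCylinderMesh()

    // Random transform attributes so every instance looks different
    mesh.rotation.set(Math.random() * Math.PI, Math.random() * Math.PI, 0)
    this.scene.add(mesh)
  }, 3000)
}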
Adding noise
We drew inspiration from Akella’s FBO Tutorial and incorporated the curl noise into the vertex shader to create organic, natural-looking movements and add fluid, swirling motions to the particles. I won’t delve deeply into the explanation of Curl Noise and FBO Particles, as Akella did an amazing job in his tutorial. You can check it out to learn more about it.
Animating the particles
To summarize, in the vertex shader we animate the points to achieve the dynamic effects that dictate the particles' behavior and appearance. Starting from newpos, the original position of each point, we create a target position by offsetting the point along its normal and adding curl noise, scaled by the frequency and amplitude uniforms. The point is then interpolated toward the target by a power of the normalized distance d between them, which creates a smooth transition that eases out as the point approaches the target.
vec3 newpos = position;
vec3 target = position + (normal * .1) + curl(newpos.x * frequency, newpos.y * frequency, newpos.z * frequency) * amplitude;
float d = length(newpos - target) / maxDistance;
newpos = mix(position, target, pow(d, 4.));
We also apply a wave motion to newpos.z, adding an extra layer of liveliness to the animation.
newpos.z += sin(time) * (.1 * offsetGain);
Moreover, the size of each point adjusts dynamically based on how close the point is to its target and its depth in the scene, making the animation feel more three-dimensional.
gl_PointSize = size + (pow(d,3.) * offsetSize) * (1./-mvPosition.z);
Adding Colors
In the fragment shader, we mask out the point with a circle shape function and interpolate between the startColor and endColor uniforms according to the point's vDistance, defined in the vertex shader:
vec3 circ = vec3(circle(uv, 1.));
vec3 color = mix(startColor, endColor, vDistance);
gl_FragColor = vec4(color, circ.r * vDistance);
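The circle function itself isn't shown above; a simple version of it (our own sketch, assuming uv comes from gl_PointCoord in the 0-1 range) could be:

float circle(vec2 uv, float radius) {
  // Soft circular mask: 1.0 near the center of the point sprite, fading out at the edge
  float dist = length(uv - 0.5);
  return 1.0 - smoothstep(radius * 0.45, radius * 0.5, dist);
}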
Bringing Audio and Visuals Together
Now, we can use our creativity to assign the audio data and beat to all the properties, both in the vertex and fragment shader uniforms. We can also add some random animations to the scale, position and rotation using GSAP.
update() {
  // Dynamically update amplitude based on the high frequency data from the audio manager
  this.material.uniforms.amplitude.value = 0.8 + THREE.MathUtils.mapLinear(App.audioManager.frequencyData.high, 0, 0.6, -0.1, 0.2)

  // Update offset gain based on the mid frequency data for subtle effect changes
  this.material.uniforms.offsetGain.value = App.audioManager.frequencyData.mid * 0.6

  // Map low frequency data to a range and use it to increment the time uniform
  const t = THREE.MathUtils.mapLinear(App.audioManager.frequencyData.low, 0.6, 1, 0.2, 0.5)
  this.time += THREE.MathUtils.clamp(t, 0.2, 0.5) // Clamp the value to ensure it stays within a desired range

  this.material.uniforms.time.value = this.time
}
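And a small sketch of the GSAP part mentioned above (the mesh reference, durations, and ranges are illustrative):

animateRandomly() {
  // Drift the points mesh toward a new random rotation and scale
  gsap.to(this.pointsMesh.rotation, {
    x: Math.random() * Math.PI * 2,
    z: Math.random() * Math.PI * 2,
    duration: 3,
    ease: 'power3.inOut',
  })
  gsap.to(this.pointsMesh.scale, {
    x: THREE.MathUtils.randFloat(0.8, 1.4),
    y: THREE.MathUtils.randFloat(0.8, 1.4),
    duration: 3,
    ease: 'power3.inOut',
  })
}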
Conclusion
This tutorial has guided you through synchronizing sound with engaging visual particle effects using Three.js.
Hope you enjoyed it! If you have questions, let me know on Twitter.
Credits
- Coala Music Website by ARKx
- FBO Particles by Yuri Artiukh
- Three.js
- GSAP
- web-audio-beat-detector
- The Book of Shaders
- WebGL Noise by Ashima
- Music by Kendrick Lamar – Money Trees from Spotify API