Creating an Animated Displaced Sphere with a Custom Three.js Material

Learn how to create an animated, displaced sphere using custom shaders with Three.js and React Three Fiber.

In this tutorial we are going to explore how to build a custom shader material on top of the built-in Three.js materials, using React Three Fiber.

Overview

We will learn:

  • How to hook into a built-in Three.js material to modify it the way we want
  • How to displace vertices of a mesh to get a stunning shape
  • How to make lighting respect this new shape
  • As a bonus, we’ll learn how to fix the shadows after the geometry displacement

This tutorial requires:

  • Basic knowledge of React, including components, state, and props.
  • Basic understanding of Three.js, including basic concepts of 3D rendering, geometries, and materials.

Ready? Let’s get started!

Step 1: Basic setup

First, we need to set up our project with R3F and create a basic scene with a sphere.

import React, { Suspense } from 'react';
import { OrbitControls } from '@react-three/drei';
import { Canvas } from '@react-three/fiber';

const Experiment = () => {
    return (
        <>
            <mesh>
                <icosahedronGeometry args={[1.3, 200]} />
                <meshPhysicalMaterial
                    roughness={0.56}
                    metalness={0.76}
                    clearcoat={0}
                    ior={2.81}
                    iridescence={0.96}
                />
            </mesh>
            <ambientLight />
            <directionalLight intensity={5} position={[-2, 2, 3.5]} />
        </>
    );
};

const Experience = () => {
    return (
        <div className="canvas-wrapper">
            <Canvas
                camera={{
                    position: [0, 0, 5],
                    fov: 45,
                    near: 0.1,
                    far: 1000,
                }}
                gl={{ alpha: false }}
            >
                <Experiment />
                <OrbitControls />
            </Canvas>
        </div>
    );
};

export default Experience;

In this code:

  • We set up a basic React component Experiment that creates a sphere using icosahedronGeometry and a physical material with some properties.
  • We add ambient and directional lighting to the scene.
  • The Experience component sets up the Canvas with a camera and orbit controls for interaction.

Nothing fancy here, we just created a basic canvas, set up lighting, and added a sphere to the scene. This will serve as the foundation for our further modifications.

Step 2: Refactor to use three-custom-shader-material

MeshPhysicalMaterial is the most advanced built-in material in Three.js. How can we modify the material’s shader without losing its existing functionality?

Traditionally, this is done via the material’s onBeforeCompile callback. It receives the compiled shader object, whose vertex and fragment shaders are plain strings that we can patch with our own code. It looks something like this:

material.onBeforeCompile = (shader) => {
    shader.uniforms = {
        ...shader.uniforms,
        ...uniforms,
    };

    shader.vertexShader = shader.vertexShader
        .replace(
            '#include <common>',
            `
            #include <common>
            // your code
            `,
        )
        .replace(
            '#include <project_vertex>',
            `
            #include <project_vertex>
            // your code
            `,
        );
};

However, this approach quickly becomes inconvenient and hard to read. Instead, we will use the excellent THREE-CustomShaderMaterial library (published on npm as three-custom-shader-material).

import React, { Suspense, useRef } from 'react';
import { OrbitControls } from '@react-three/drei';
import { Canvas } from '@react-three/fiber';
import CustomShaderMaterial from 'three-custom-shader-material';
import { MeshPhysicalMaterial } from 'three';

const Experiment = () => {
    const materialRef = useRef(null);

    return (
        <>
            <mesh>
                <icosahedronGeometry args={[1.3, 200]} />
                <CustomShaderMaterial
                    ref={materialRef}
                    silent
                    baseMaterial={MeshPhysicalMaterial}
                    roughness={0.56}
                    metalness={0.76}
                    clearcoat={0}
                    ior={2.81}
                    iridescence={0.96}
                />
            </mesh>
            <ambientLight />
            <directionalLight intensity={5} position={[-2, 2, 3.5]} />
        </>
    );
};

...

The result is the same, but now we can use the library to rewrite parts of the shader. Create empty vertex.glsl and fragment.glsl files:

// vertex.glsl

void main() {

}

// fragment.glsl

void main() {

}

Note: We do not assign values to gl_Position and gl_FragColor directly. Instead, we assign values to the output variables expected by the library (more on that later in the tutorial). A complete list of these variables can be found in the THREE-CustomShaderMaterial documentation.
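
How you load these .glsl files depends on your bundler; with Vite, for example, vite-plugin-glsl lets you import them as strings, and inlining them as template literals works too. However they are loaded, they are passed to CustomShaderMaterial as plain strings. A minimal sketch of the wiring, assuming the shaders are imported as vertexShader and fragmentShader:

// Experience.jsx (sketch)

import vertexShader from './vertex.glsl'; // assumes a GLSL import plugin such as vite-plugin-glsl
import fragmentShader from './fragment.glsl';

...

<CustomShaderMaterial
    ref={materialRef}
    silent
    baseMaterial={MeshPhysicalMaterial}
    vertexShader={vertexShader}
    fragmentShader={fragmentShader}
    roughness={0.56}
    metalness={0.76}
    clearcoat={0}
    ior={2.81}
    iridescence={0.96}
/>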

Step 3: Displace the Shape in the Vertex Shader Chunk

Our task is to modify the vertex positions based on a pattern, displacing them from the center along the normal. Since the result is ultimately needed in the vertex shader, we compute the pattern there and pass it to the fragment shader via a varying so we can visualize it while we build it up.

What we want to do:

  • animate the pattern so it scrolls upwards indefinitely
  • distort it with a noise function to give it an organic, random look
  • split it into repeating fractions with a smooth modulo
  • push the vertices outwards along their normals

Let’s start with displaying the pattern in the fragment shader:

// vertex.glsl

uniform float uTime;

varying float vPattern;

//	Classic Perlin 3D Noise 
//	by Stefan Gustavson (https://github.com/stegu/webgl-noise)
//
vec4 permute(vec4 x){return mod(((x*34.0)+1.0)*x, 289.0);}
vec4 taylorInvSqrt(vec4 r){return 1.79284291400159 - 0.85373472095314 * r;}
vec3 fade(vec3 t) {return t*t*t*(t*(t*6.0-15.0)+10.0);}

float cnoise(vec3 P){
  vec3 Pi0 = floor(P); // Integer part for indexing
  vec3 Pi1 = Pi0 + vec3(1.0); // Integer part + 1
  Pi0 = mod(Pi0, 289.0);
  Pi1 = mod(Pi1, 289.0);
  vec3 Pf0 = fract(P); // Fractional part for interpolation
  vec3 Pf1 = Pf0 - vec3(1.0); // Fractional part - 1.0
  vec4 ix = vec4(Pi0.x, Pi1.x, Pi0.x, Pi1.x);
  vec4 iy = vec4(Pi0.yy, Pi1.yy);
  vec4 iz0 = Pi0.zzzz;
  vec4 iz1 = Pi1.zzzz;

  vec4 ixy = permute(permute(ix) + iy);
  vec4 ixy0 = permute(ixy + iz0);
  vec4 ixy1 = permute(ixy + iz1);

  vec4 gx0 = ixy0 / 7.0;
  vec4 gy0 = fract(floor(gx0) / 7.0) - 0.5;
  gx0 = fract(gx0);
  vec4 gz0 = vec4(0.5) - abs(gx0) - abs(gy0);
  vec4 sz0 = step(gz0, vec4(0.0));
  gx0 -= sz0 * (step(0.0, gx0) - 0.5);
  gy0 -= sz0 * (step(0.0, gy0) - 0.5);

  vec4 gx1 = ixy1 / 7.0;
  vec4 gy1 = fract(floor(gx1) / 7.0) - 0.5;
  gx1 = fract(gx1);
  vec4 gz1 = vec4(0.5) - abs(gx1) - abs(gy1);
  vec4 sz1 = step(gz1, vec4(0.0));
  gx1 -= sz1 * (step(0.0, gx1) - 0.5);
  gy1 -= sz1 * (step(0.0, gy1) - 0.5);

  vec3 g000 = vec3(gx0.x,gy0.x,gz0.x);
  vec3 g100 = vec3(gx0.y,gy0.y,gz0.y);
  vec3 g010 = vec3(gx0.z,gy0.z,gz0.z);
  vec3 g110 = vec3(gx0.w,gy0.w,gz0.w);
  vec3 g001 = vec3(gx1.x,gy1.x,gz1.x);
  vec3 g101 = vec3(gx1.y,gy1.y,gz1.y);
  vec3 g011 = vec3(gx1.z,gy1.z,gz1.z);
  vec3 g111 = vec3(gx1.w,gy1.w,gz1.w);

  vec4 norm0 = taylorInvSqrt(vec4(dot(g000, g000), dot(g010, g010), dot(g100, g100), dot(g110, g110)));
  g000 *= norm0.x;
  g010 *= norm0.y;
  g100 *= norm0.z;
  g110 *= norm0.w;
  vec4 norm1 = taylorInvSqrt(vec4(dot(g001, g001), dot(g011, g011), dot(g101, g101), dot(g111, g111)));
  g001 *= norm1.x;
  g011 *= norm1.y;
  g101 *= norm1.z;
  g111 *= norm1.w;

  float n000 = dot(g000, Pf0);
  float n100 = dot(g100, vec3(Pf1.x, Pf0.yz));
  float n010 = dot(g010, vec3(Pf0.x, Pf1.y, Pf0.z));
  float n110 = dot(g110, vec3(Pf1.xy, Pf0.z));
  float n001 = dot(g001, vec3(Pf0.xy, Pf1.z));
  float n101 = dot(g101, vec3(Pf1.x, Pf0.y, Pf1.z));
  float n011 = dot(g011, vec3(Pf0.x, Pf1.yz));
  float n111 = dot(g111, Pf1);

  vec3 fade_xyz = fade(Pf0);
  vec4 n_z = mix(vec4(n000, n100, n010, n110), vec4(n001, n101, n011, n111), fade_xyz.z);
  vec2 n_yz = mix(n_z.xy, n_z.zw, fade_xyz.y);
  float n_xyz = mix(n_yz.x, n_yz.y, fade_xyz.x); 
  return 2.2 * n_xyz;
}

// It's like mod() function, but "smooth" (thank you, captain!), with no immediate jump to 0
float smoothMod(float axis, float amp, float rad) {
    float top = cos(PI * (axis / amp)) * sin(PI * (axis / amp));
    float bottom = pow(sin(PI * (axis / amp)), 2.0) + pow(rad, 2.0);
    float at = atan(top / bottom);
    return amp * (1.0 / 2.0) - (1.0 / PI) * at;
}

// We're gonna use this function multiple times
float getDisplacement(vec3 position) {
    // gonna be uniforms later on
    float uFractAmount = 4.;
    float uDisplacementStrength = 0.57;
    float uSpeed = 1.1;
    float uNoiseStrength = 0.3; // noise influence; pick a value to taste (it becomes a uniform too)

    vec3 pos = position;
    pos.y -= uTime * 0.05 * uSpeed; // pattern changes in time, going up
    pos += cnoise(pos * 1.65) * uNoiseStrength; // position distortion with noise

    return smoothMod(pos.y * uFractAmount, 1., 1.5) * uDisplacementStrength;
}

void main() {
    float pattern = getDisplacement(position);
    vPattern = pattern; // pass the result to the fragment shader
}
// fragment.glsl

varying float vPattern;

void main() {
    vec3 color = vec3(vPattern);

    csm_FragColor = vec4(color, 1.); // Using `csm_FragColor` removes all the shading. Use this only for debugging.
}

The lighter the color, the more the vertex position will be displaced at that point.

Next, we need to apply our pattern in the vertex shader. We need a direction in which to shift each position, and the normal vector, which points straight out of the surface, is perfect for this. Using the normal, we will push the vertex position outwards:

// vertex.glsl

...

void main() {
    float pattern = getDisplacement(position);
    vPattern = pattern;
  
    csm_Position += normal * pattern; // move position according to normal
}

Let’s add color:

// Experience.jsx

...

const uniforms = {
    uTime: { value: 0 },
    uColor: { value: new Color('#af00ff') },
};
    
...
// fragment.glsl

varying float vPattern;

uniform vec3 uColor;

void main() {
    vec3 color = vPattern * uColor;

    csm_DiffuseColor = vec4(color, 1.); // Restore shading using `csm_DiffuseColor`
}
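
For anything to move, the uniforms object also has to reach the material and uTime has to be advanced every frame. A minimal sketch of that wiring, using R3F's useFrame hook and the materialRef from earlier (CustomShaderMaterial exposes the passed uniforms on the material instance):

// Experiment component (sketch)

import { useFrame } from '@react-three/fiber';

...

useFrame(({ clock }) => {
    // drive the animation; uTime is read in getDisplacement()
    if (materialRef.current) {
        materialRef.current.uniforms.uTime.value = clock.getElapsedTime();
    }
});

...

<CustomShaderMaterial
    ref={materialRef}
    silent
    baseMaterial={MeshPhysicalMaterial}
    vertexShader={vertexShader}
    fragmentShader={fragmentShader}
    uniforms={uniforms}
    roughness={0.56}
    metalness={0.76}
    clearcoat={0}
    ior={2.81}
    iridescence={0.96}
/>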

It looks pretty nice! But there is an issue. The shading doesn’t seem right; something is off. The problem is that we changed the appearance of the geometry, but the normals remain as if it were a sphere. We need to recalculate them.

Step 4: Fixing the Shading

How do we do that? There’s a so-called “neighbours” technique: for each vertex we take two vectors pointing towards neighbouring points, perpendicular to the normal and to each other. These are called the tangent and the bitangent.

Let’s console.log the geometry.attributes:
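
If you want to run that check yourself, a ref on the mesh is enough (the meshRef below exists only for this inspection and isn't part of the final code):

// Experiment component (sketch)

import React, { useEffect, useRef } from 'react';

...

const meshRef = useRef(null);

useEffect(() => {
    // IcosahedronGeometry only ships position, normal and uv; there is no tangent yet
    console.log(meshRef.current.geometry.attributes);
}, []);

...

<mesh ref={meshRef}>
    <icosahedronGeometry args={[1.3, 200]} />
    {/* material as before */}
</mesh>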

There is no “tangent” attribute by default. The good news is that Three.js can calculate the tangents for us, and we can derive the bitangent from them afterwards. There is a computeTangents() method on every geometry inheriting from BufferGeometry, so let’s use it. Quick note: according to the docs, the computation is only supported for indexed geometries, and only if the position, normal, and uv attributes are defined. We do have all three attributes, but our geometry is not indexed right now, so we need to fix that. Three.js ships a bunch of utilities, one of which is the mergeVertices function from BufferGeometryUtils; it converts a non-indexed geometry into an indexed one. Be careful: merging vertices is computationally intensive and may take some time.

Let’s use it and call “computeTangents()” afterwards:

...
import { MeshPhysicalMaterial, Color, IcosahedronGeometry } from 'three';
import { mergeVertices } from 'three/examples/jsm/utils/BufferGeometryUtils';

...

// Refactor to use regular Three.js code instead of R3F primitive for convenience
const geometry = useMemo(() => {
    const geometry = mergeVertices(new IcosahedronGeometry(1.3, 200));
    geometry.computeTangents();

    return geometry;
}, []);

...

return (
    <>
        <mesh geometry={geometry}>
            <CustomShaderMaterial
                ref={materialRef}
                silent
                baseMaterial={MeshPhysicalMaterial}
                roughness={0.56}
                metalness={0.76}
                clearcoat={0}
                ior={2.81}
                iridescence={0.96}
            />
        </mesh>
        ...
    </>
);

Log the geometry attributes now:
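
Now that the geometry is created imperatively, the same check is a one-liner, for example right after computeTangents() inside the useMemo:

console.log(geometry.attributes); // position, normal, uv, plus the new vec4 "tangent" attribute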

Great, we have the tangent! Now we are ready to calculate new normals.

// vertex.glsl

attribute vec4 tangent; // extract tangent

uniform float uTime;

varying float vPattern;

...

void main() {
    float pattern = getDisplacement(position); // base displacement pattern
    vPattern = pattern; // pass the result to the fragment shader
    
    vec3 biTangent = cross(normal, tangent.xyz);
    float shift = 0.01; // approximate distance to the neighbour
    vec3 posA = position + tangent.xyz * shift; // position of neighbour A
    vec3 posB = position + biTangent * shift; // position of neighbour B

    csm_Position += normal * pattern;
    posA += normal * getDisplacement(posA); // applying displacement to positionA
    posB += normal * getDisplacement(posB); // applying displacement to positionB
		
    // To get the direction from one point to another,
    // we subtract the origin from the destination, then normalize the result.
    // Make sure to use `csm_Position` (the displaced position) instead of plain `position`.

    vec3 toA = normalize(posA - csm_Position);
    vec3 toB = normalize(posB - csm_Position);

    csm_Normal = normalize(cross(toA, toB)); // recalculated normal
}

And we are done! You can get the final result on GitHub.

Bonus: Fixing the Shadows

After enabling shadows, you’ll notice that the shadow cast by the mesh doesn’t respect the geometry displacement: it’s still the shadow of the plain sphere. That’s because shadows are rendered with a separate depth material that knows nothing about our vertex shader.

We need a custom depth material that applies the same displacement. Let’s fix that:

<mesh geometry={geometry}>
    <CustomShaderMaterial
        ref={materialRef}
        silent
        baseMaterial={MeshPhysicalMaterial}
        vertexShader={vertexShader}
        fragmentShader={fragmentShader}
        uniforms={uniforms}
        roughness={0.56}
        metalness={0.76}
        clearcoat={0}
        ior={2.81}
        iridescence={0.96}
    />
    {/* Custom depth material, reusing vertex shader and uniforms */}
    <CustomShaderMaterial
        ref={depthMaterialRef}
        baseMaterial={MeshDepthMaterial}
        vertexShader={vertexShader}
        uniforms={uniforms}
        silent
        depthPacking={RGBADepthPacking}
        attach="customDepthMaterial"
    />
</mesh>
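
A couple of details the snippet above relies on but doesn't show: the extra import and ref for the depth material, and the fact that shadows have to be enabled on the canvas, the light, and the mesh. A rough sketch of those pieces (the ground plane is only an assumption so the shadow has something to fall on):

// Experience.jsx (sketch)

import { MeshPhysicalMaterial, MeshDepthMaterial, RGBADepthPacking } from 'three';

...

const depthMaterialRef = useRef(null);

useFrame(({ clock }) => {
    // keep both materials animated; updating both refs is the safe option
    const elapsedTime = clock.getElapsedTime();
    if (materialRef.current) materialRef.current.uniforms.uTime.value = elapsedTime;
    if (depthMaterialRef.current) depthMaterialRef.current.uniforms.uTime.value = elapsedTime;
});

...

<Canvas shadows camera={{ position: [0, 0, 5], fov: 45, near: 0.1, far: 1000 }} gl={{ alpha: false }}>
    <Experiment />
    <OrbitControls />
</Canvas>

// and inside Experiment:
<mesh geometry={geometry} castShadow>
    {/* both materials from above */}
</mesh>
<mesh rotation={[-Math.PI / 2, 0, 0]} position={[0, -1.9, 0]} receiveShadow>
    <planeGeometry args={[10, 10]} />
    <meshStandardMaterial />
</mesh>
<directionalLight castShadow intensity={5} position={[-2, 2, 3.5]} />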

That’s it. I added controls for you to have fun with this demo. Enjoy!
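
The controls code isn't included above. If you want to rebuild something similar, one option is leva, promoting the placeholder constants from getDisplacement() into real uniforms; the uniform names below follow the shader comments, while the ranges and steps are just a guess:

// Experiment component (sketch)

import { useControls } from 'leva';

...

const { speed, fractAmount, displacementStrength } = useControls({
    speed: { value: 1.1, min: 0, max: 5, step: 0.01 },
    fractAmount: { value: 4, min: 1, max: 10, step: 1 },
    displacementStrength: { value: 0.57, min: 0, max: 2, step: 0.01 },
});

useFrame(() => {
    const mat = materialRef.current;
    if (!mat) return;
    // these only take effect once uSpeed, uFractAmount and uDisplacementStrength
    // are declared as uniforms in vertex.glsl and added to the uniforms object
    mat.uniforms.uSpeed.value = speed;
    mat.uniforms.uFractAmount.value = fractAmount;
    mat.uniforms.uDisplacementStrength.value = displacementStrength;
});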

Pavel Mazhuga

Creative frontend developer. Passionate about WebGL and related stuff, stunning motion and animations.
