When rendering a 3D object you’ll always have to assign it a material to make it visible and to give it the desired appearance, whether that’s in a 3D software package or in real-time with WebGL.
Many types of materials can be mimicked with the out-of-the-box shader programs that libraries like Three.js provide, but in this tutorial I will show you how to make objects appear glass-like in three steps using—you guessed it—Three.js.
Step 1: Setup and Front Side Refraction
For this demo I’ll be using a diamond geometry, but you can follow along with a simple box or any other geometry.
Let’s set up our project. We’ll need a renderer, a scene, a perspective camera and our geometry. In order to render our geometry we will need to assign it a material. Creating this material will be the main focus of this tutorial, so go ahead and create a new ShaderMaterial with a basic vertex and fragment shader.
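A minimal sketch of what that could look like (the uniform names here are my own; we’ll fill in their values as we go, and backfaceMap only becomes relevant in step 3):
this.refractionMaterial = new THREE.ShaderMaterial({
  vertexShader,   // the GLSL vertex shader source we'll write below
  fragmentShader, // the GLSL fragment shader source we'll write below
  uniforms: {
    envMap: { value: null },      // background texture, assigned once we have a render target
    backfaceMap: { value: null }, // back face normals, used in step 3
    resolution: { value: new THREE.Vector2(width, height) },
    ior: { value: 1.5 }           // index of refraction, explained below
  }
});
this.mesh = new THREE.Mesh(geometry, this.refractionMaterial);
this.scene.add(this.mesh);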
Contrary to what you’d expect, our material will not be transparent, in fact we will sample and distort anything that’s behind our diamond. To do that we will need to render our scene (without the diamond) to a texture. I’m simply rendering a full screen plane with an orthographic camera, but this could just as well be a scene full of other objects. The easiest way to split the background geometry from the diamond in Three.js is to use Layers.
this.orthoCamera = new THREE.OrthographicCamera(width / -2, width / 2, height / 2, height / -2, 1, 1000);
// assign the camera to layer 1 (layer 0 is default)
this.orthoCamera.layers.set(1);

const tex = await loadTexture('texture.jpg');
this.quad = new THREE.Mesh(new THREE.PlaneBufferGeometry(), new THREE.MeshBasicMaterial({ map: tex }));
this.quad.scale.set(width, height, 1);
// also move the plane to layer 1
this.quad.layers.set(1);
this.scene.add(this.quad);
Our render loop will look like this:
this.envFbo = new THREE.WebGLRenderTarget(width, height);
this.renderer.autoClear = false;

render() {
  requestAnimationFrame(this.render);
  this.renderer.clear();

  // render background to fbo
  this.renderer.setRenderTarget(this.envFbo);
  this.renderer.render(this.scene, this.orthoCamera);

  // render background to screen
  this.renderer.setRenderTarget(null);
  this.renderer.render(this.scene, this.orthoCamera);
  this.renderer.clearDepth();

  // render geometry to screen
  this.renderer.render(this.scene, this.camera);
}
Alright, time for a little bit of theory now. Transparent materials like glass are visible because they bend light. Light travels slower in glass than it does in air, so when a light wave hits a glass object at an angle, this change in speed causes the wave to change direction. This change in the direction of a wave is what we call refraction.
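To put a number on it: refraction follows Snell’s law, n1 · sin(θ1) = n2 · sin(θ2). For a ray going from air (n1 = 1.0) into glass (n2 = 1.5) at a 30° angle, sin(θ2) = sin(30°) / 1.5 ≈ 0.33, so the ray continues at roughly 19.5°, bending towards the surface normal.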
To replicate this in code we will need to know the angle between our eye vector and the surface (normal) vector of our diamond in world space. Let’s update our vertex shader to calculate these vectors.
varying vec3 eyeVector;
varying vec3 worldNormal;

void main() {
  vec4 worldPosition = modelMatrix * vec4(position, 1.0);
  eyeVector = normalize(worldPosition.xyz - cameraPosition);
  // transform the normal to world space (assuming uniform scaling)
  worldNormal = normalize(modelMatrix * vec4(normal, 0.0)).xyz;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
In our fragment shader we can now use eyeVector and worldNormal as the first two parameters of GLSL’s built-in refract function. The third parameter is the ratio of indices of refraction, meaning the index of refraction (IOR) of our fast medium—air—divided by the IOR of our slow medium—glass. In this case that will be 1.0/1.5, but you can tweak this value to achieve your desired result. For example, the IOR of water is 1.33 and diamond has an IOR of 2.42.
uniform sampler2D envMap;
uniform vec2 resolution;
uniform float ior; // IOR of the slow medium, e.g. 1.5 for glass

varying vec3 worldNormal;
varying vec3 eyeVector;

void main() {
  // get screen coordinates
  vec2 uv = gl_FragCoord.xy / resolution;
  vec3 normal = worldNormal;

  // calculate refraction and add to the screen coordinates
  vec3 refracted = refract(eyeVector, normal, 1.0 / ior);
  uv += refracted.xy;

  // sample the background texture
  vec4 tex = texture2D(envMap, uv);
  vec4 color = tex; // note: 'output' is a reserved word in GLSL
  gl_FragColor = vec4(color.rgb, 1.0);
}
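One thing not to forget on the JavaScript side: the shader samples whatever is bound to envMap, so the background render target’s texture has to be assigned to that uniform. Assuming the names used earlier, a one-liner like this (run once after creating the render target) does the trick:
this.refractionMaterial.uniforms.envMap.value = this.envFbo.texture;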
Nice! We successfully wrote a refraction shader. But our diamond is hardly visible… That is partly because we’ve only handled one visual property of glass. Not all light will pass through the material to be refracted; in fact, part of it will be reflected. Let’s see how we can implement that!
Step 2: Reflection and the Fresnel equation
For the sake of simplicity, in this tutorial we are not going to calculate proper reflections but just use a white color for our reflected light. Now, how do we know when to reflect and when to refract? In theory this depends on the refractive index of the material: when the angle between the incident vector and the surface normal is greater than the critical angle, the light wave will be reflected.
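For glass-to-air, for instance, the critical angle is arcsin(1.0/1.5) ≈ 41.8°; any light hitting the inner surface at a steeper angle than that is reflected internally.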
In our fragment shader we will use the Fresnel equation to calculate the ratio between reflected and refracted rays. Unfortunately, GLSL does not have this equation built in either, but you can just copy it from here:
float Fresnel(vec3 eyeVector, vec3 worldNormal) {
  // an empirical approximation: the more the surface faces away
  // from the camera, the more reflective it becomes
  return pow(1.0 + dot(eyeVector, worldNormal), 3.0);
}
We can now simply mix the refracted texture color with our white reflection color based on the Fresnel ratio we just calculated.
uniform sampler2D envMap;
uniform vec2 resolution;
uniform float ior; // IOR of the slow medium, e.g. 1.5 for glass

varying vec3 worldNormal;
varying vec3 eyeVector;

float Fresnel(vec3 eyeVector, vec3 worldNormal) {
  return pow(1.0 + dot(eyeVector, worldNormal), 3.0);
}

void main() {
  // get screen coordinates
  vec2 uv = gl_FragCoord.xy / resolution;
  vec3 normal = worldNormal;

  // calculate refraction and add to the screen coordinates
  vec3 refracted = refract(eyeVector, normal, 1.0 / ior);
  uv += refracted.xy;

  // sample the background texture
  vec4 tex = texture2D(envMap, uv);
  vec4 color = tex;

  // calculate the Fresnel ratio
  float f = Fresnel(eyeVector, normal);

  // mix the refraction color and reflection color
  color.rgb = mix(color.rgb, vec3(1.0), f);
  gl_FragColor = vec4(color.rgb, 1.0);
}
That’s already looking a lot better, but there’s still something off about it… Ah right, we can’t see the other side of a transparent object. Let’s fix that!
Step 3: Multiside refraction
With what we’ve learned so far about reflection and refraction, we can understand that light can bounce back and forth a couple of times inside the object before exiting it.
To achieve a physically correct result we will have to trace each ray, but unfortunately this computation is way too heavy to render in real-time. So instead, I will show you a simple approximation to at least visualize the back faces of our diamond.
We’ll need the world normals of our geometry’s front and back faces in one fragment shader. Since we cannot render both sides at the same time we’ll need to render the back face normals to a texture first.
Let’s make a new ShaderMaterial like we did in step 1, but this time we will render the world normals to gl_FragColor.
varying vec3 worldNormal;

void main() {
  gl_FragColor = vec4(worldNormal, 1.0);
}
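One detail this snippet doesn’t show: to actually render the inside of our geometry, the back face material needs its side property set to THREE.BackSide. A sketch of that setup, assuming we reuse the world normal vertex shader from step 1 (backfaceFragmentShader is just my name for the shader above):
this.backfaceMaterial = new THREE.ShaderMaterial({
  vertexShader,                           // same vertex shader that computes worldNormal
  fragmentShader: backfaceFragmentShader, // the fragment shader above
  side: THREE.BackSide                    // render only the inside faces of the mesh
});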
Next we’ll update our render loop to include the back face pass.
this.backfaceFbo = new THREE.WebGLRenderTarget(width, height);
...

render() {
  requestAnimationFrame(this.render);
  this.renderer.clear();

  // render background to fbo
  this.renderer.setRenderTarget(this.envFbo);
  this.renderer.render(this.scene, this.orthoCamera);

  // render diamond back faces to fbo
  this.mesh.material = this.backfaceMaterial;
  this.renderer.setRenderTarget(this.backfaceFbo);
  this.renderer.clearDepth();
  this.renderer.render(this.scene, this.camera);

  // render background to screen
  this.renderer.setRenderTarget(null);
  this.renderer.render(this.scene, this.orthoCamera);
  this.renderer.clearDepth();

  // render diamond with refraction material to screen
  this.mesh.material = this.refractionMaterial;
  this.renderer.render(this.scene, this.camera);
}
Now we sample the back face normal texture in our refraction material.
vec3 backfaceNormal = texture2D(backfaceMap, uv).rgb;
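For this to work, backfaceMap needs to be declared as a uniform sampler2D at the top of the refraction fragment shader, and the back face render target’s texture has to be bound to it on the JavaScript side, along the same lines as the envMap uniform:
this.refractionMaterial.uniforms.backfaceMap.value = this.backfaceFbo.texture;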
And finally we combine the front and back face normals.
float a = 0.33;
// subtract, because the back face normals point away from the camera
vec3 normal = worldNormal * (1.0 - a) - backfaceNormal * a;
In this equation, a is simply a scalar value indicating how much of the back face’s normal should be applied.
We did it! We can see all sides of our diamond, purely thanks to the refractions and reflections we have applied to its material.
Limitations
As I already explained, it is not quite possible to render physically correct transparent materials in real-time with this method. Another problem occurs when rendering multiple glass objects in front of each other: since we only sample the environment once, we won’t be able to see through a chain of objects. And lastly, a screen space refraction like the one I demoed here won’t work very well near the edges of the canvas, since rays may refract to coordinates outside of its boundaries and we didn’t capture that data when rendering the background scene to the render target.
Of course, there are ways to overcome these limitations, but they might not all be great solutions for your real-time rendering in WebGL.
I hope you enjoyed following along with this demo and you have learned something from it. I’m curious to see what you can do with it! Let me know on Twitter. Also don’t hesitate to ask me anything!