Original article: https://tympanus.net/codrops/2019/10/29/real-time-multiside-refraction-in-three-steps/
Author: Jesper Vos
In this tutorial, you will learn how to use Three.js to make objects look like glass in three steps.
When rendering 3D objects, whether in offline 3D software or in real time with WebGL, you always have to assign materials to them to make them visible and give them the appearance you want. Libraries like Three.js ship with off-the-shelf materials that mimic many types of surfaces, but in this tutorial I'll show you how to make objects look like glass in three steps.
Step 1: Setup and Front Side Refraction
For this demo I'll use a diamond geometry, but you can follow along with a simple box or any other geometry.
Let's set up our project. We need a renderer, a scene, a perspective camera and our geometry. In order to render our geometry, we need to assign a material to it. Creating this material will be the main focus of this tutorial, so go ahead and create a new ShaderMaterial with basic vertex and fragment shaders.
Contrary to what you might expect, our material will not be transparent. In fact, we will sample and distort whatever is behind the diamond. To do that, we need to render everything in the scene except the diamond to a texture. I'm simply rendering a full screen plane with an orthographic camera, but this could just as well be a scene full of other objects. The easiest way to separate the background geometry from the diamond in Three.js is to use layers.
```js
this.orthoCamera = new THREE.OrthographicCamera(
  width / -2, width / 2, height / 2, height / -2, 1, 1000
);
// assign the camera to layer 1 (layer 0 is default)
this.orthoCamera.layers.set(1);

const tex = await loadTexture('texture.jpg');
this.quad = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(),
  new THREE.MeshBasicMaterial({ map: tex })
);
this.quad.scale.set(width, height, 1);
// also move the plane to layer 1
this.quad.layers.set(1);
this.scene.add(this.quad);
```
Our render loop looks like this:

```js
this.envFbo = new THREE.WebGLRenderTarget(width, height);
this.renderer.autoClear = false;

render() {
  requestAnimationFrame( this.render );
  this.renderer.clear();

  // render background to fbo
  this.renderer.setRenderTarget(this.envFbo);
  this.renderer.render( this.scene, this.orthoCamera );

  // render background to screen
  this.renderer.setRenderTarget(null);
  this.renderer.render( this.scene, this.orthoCamera );
  this.renderer.clearDepth();

  // render geometry to screen
  this.renderer.render( this.scene, this.camera );
};
```
Alright, time for a little bit of theory. Transparent materials such as glass are visible because they bend light. Light travels more slowly through glass than through air, so when a light wave hits a glass object at an angle, this change in speed causes the wave to change direction. This change in the direction of a wave is what we call refraction.
To replicate this in code, we need to know the angle between our eye vector and the surface (normal) vector of the diamond in world space. Let's update the vertex shader to calculate these vectors.
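To get an intuition for what the shader will do later, here is a plain JavaScript sketch of GLSL's built-in `refract` function as defined by the GLSL specification. This is my own illustration, not part of the original tutorial:

```javascript
// Plain JavaScript sketch of GLSL's built-in refract(I, N, eta),
// following the GLSL specification. I is the incident direction,
// N the surface normal (both normalized), eta the ratio of
// refractive indices (e.g. air/glass = 1.0 / 1.5).
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

function refract(I, N, eta) {
  const k = 1.0 - eta * eta * (1.0 - dot(N, I) * dot(N, I));
  if (k < 0.0) return [0, 0, 0]; // total internal reflection
  const f = eta * dot(N, I) + Math.sqrt(k);
  return I.map((v, i) => eta * v - f * N[i]);
}

// A ray hitting the surface head-on passes through undeflected
// (approximately [0, -1, 0]); rays at an angle get bent.
console.log(refract([0, -1, 0], [0, 1, 0], 1.0 / 1.5));
```

The larger the angle between the incident ray and the normal, the stronger the bend, which is exactly the offset we'll apply to our screen coordinates in the fragment shader.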
```glsl
varying vec3 eyeVector;
varying vec3 worldNormal;

void main() {
  vec4 worldPosition = modelMatrix * vec4(position, 1.0);
  eyeVector = normalize(worldPosition.xyz - cameraPosition);
  worldNormal = normalize(modelViewMatrix * vec4(normal, 0.0)).xyz;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```
In the fragment shader we can now use eyeVector and worldNormal as the first two parameters of GLSL's built-in refract function. The third parameter is the ratio of indices of refraction, meaning the index of refraction (IOR) of our fast medium (air) divided by the IOR of our slow medium (glass). In this case that will be 1.0 / 1.5, but you can tweak this value to achieve your desired result. For example, the IOR of water is 1.33 and the IOR of diamond is 2.42.

```glsl
uniform sampler2D envMap;
uniform vec2 resolution;

varying vec3 worldNormal;
varying vec3 eyeVector;

// index of refraction of the slow medium; 1.5 for glass
const float ior = 1.5;

void main() {
  // get screen coordinates
  vec2 uv = gl_FragCoord.xy / resolution;

  vec3 normal = worldNormal;
  // calculate refraction and add to the screen coordinates
  vec3 refracted = refract(eyeVector, normal, 1.0 / ior);
  uv += refracted.xy;

  // sample the background texture
  vec4 tex = texture2D(envMap, uv);

  vec4 color = tex;
  gl_FragColor = vec4(color.rgb, 1.0);
}
```
Splendid! We successfully wrote a refraction shader. But our diamond is barely visible... That is partly because we've only dealt with one visual property of glass. Not all light passes through the material to be refracted; in fact, part of it is reflected. Let's see how we can implement that!
Step 2: Reflection and Fresnel equation
For simplicity's sake, in this tutorial we are not going to calculate proper reflections but just use a white color for the reflected light. Now, how do we know when to reflect and when to refract? In theory this depends on the refractive index of the material: when the angle between the incident vector and the surface normal is greater than the critical angle, the light wave is reflected.
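As a quick sanity check (my own illustration, not from the original article), the critical angle follows directly from Snell's law: sin(θc) = n2 / n1, defined when light passes from the slower medium (n1) into the faster one (n2):

```javascript
// Critical angle for total internal reflection when light travels
// from a slow medium (n1) into a fast medium (n2). From Snell's law:
// sin(thetaC) = n2 / n1, only defined when n1 > n2.
function criticalAngleDegrees(n1, n2) {
  return (Math.asin(n2 / n1) * 180) / Math.PI;
}

// Glass (IOR 1.5) to air: roughly 41.8 degrees.
console.log(criticalAngleDegrees(1.5, 1.0).toFixed(1));
// Diamond (IOR 2.42) to air: roughly 24.4 degrees, which is why
// diamonds trap and bounce so much light internally.
console.log(criticalAngleDegrees(2.42, 1.0).toFixed(1));
```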
In the fragment shader, we'll use the Fresnel equation to calculate the ratio between reflected and refracted light. Unfortunately, GLSL doesn't have this equation built in, but you can just copy it from here:

```glsl
float Fresnel(vec3 eyeVector, vec3 worldNormal) {
  return pow(1.0 + dot(eyeVector, worldNormal), 3.0);
}
```
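To get a feel for what this approximation does, here is the same function mirrored in plain JavaScript (a sketch of my own): it returns 0 when you look straight at a surface and approaches 1 at grazing angles, so the edges of the diamond will reflect more than its center:

```javascript
// JavaScript mirror of the shader's Fresnel approximation.
// eyeVector points from the camera toward the surface and worldNormal
// points outward, so their dot product is -1 head-on and ~0 at grazing.
function fresnel(eyeVector, worldNormal) {
  const d = eyeVector[0] * worldNormal[0]
          + eyeVector[1] * worldNormal[1]
          + eyeVector[2] * worldNormal[2];
  return Math.pow(1.0 + d, 3.0);
}

console.log(fresnel([0, 0, -1], [0, 0, 1])); // head-on: 0 (all refraction)
console.log(fresnel([1, 0, 0], [0, 0, 1]));  // grazing: 1 (all reflection)
```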
Now we can simply mix the refraction texture color with the white reflection color based on the Fresnel ratio we just calculated.

```glsl
uniform sampler2D envMap;
uniform vec2 resolution;

varying vec3 worldNormal;
varying vec3 eyeVector;

// index of refraction of the slow medium; 1.5 for glass
const float ior = 1.5;

float Fresnel(vec3 eyeVector, vec3 worldNormal) {
  return pow(1.0 + dot(eyeVector, worldNormal), 3.0);
}

void main() {
  // get screen coordinates
  vec2 uv = gl_FragCoord.xy / resolution;

  vec3 normal = worldNormal;
  // calculate refraction and add to the screen coordinates
  vec3 refracted = refract(eyeVector, normal, 1.0 / ior);
  uv += refracted.xy;

  // sample the background texture
  vec4 tex = texture2D(envMap, uv);

  vec4 color = tex;

  // calculate the Fresnel ratio
  float f = Fresnel(eyeVector, normal);

  // mix the refraction color and reflection color
  color.rgb = mix(color.rgb, vec3(1.0), f);

  gl_FragColor = vec4(color.rgb, 1.0);
}
```
That's already looking a lot better, but something is still missing... Right, we can't see the other side of the transparent object. Let's fix that!
Step 3: Multiside Refraction
With what we've learned so far about reflection and refraction, we can understand that light can bounce back and forth a couple of times inside an object before exiting it.
To achieve a physically correct result we would have to trace each ray, but unfortunately that computation is way too heavy to render in real time. So instead I will show you a simple approximation that at least gives an impression of the back side of our diamond.
We need the world normals of both the front and back faces of our geometry in one fragment shader. Since we cannot render both sides at the same time, we'll need to render the back face normals to a texture first.
Let's make a new ShaderMaterial like we did in step 1, but this time we will render the world normal to gl_FragColor.

```glsl
varying vec3 worldNormal;

void main() {
  gl_FragColor = vec4(worldNormal, 1.0);
}
```
Next we'll update our render loop to include the back face pass.

```js
this.backfaceFbo = new THREE.WebGLRenderTarget(width, height);
...

render() {
  requestAnimationFrame( this.render );
  this.renderer.clear();

  // render background to fbo
  this.renderer.setRenderTarget(this.envFbo);
  this.renderer.render( this.scene, this.orthoCamera );

  // render diamond back faces to fbo
  this.mesh.material = this.backfaceMaterial;
  this.renderer.setRenderTarget(this.backfaceFbo);
  this.renderer.clearDepth();
  this.renderer.render( this.scene, this.camera );

  // render background to screen
  this.renderer.setRenderTarget(null);
  this.renderer.render( this.scene, this.orthoCamera );
  this.renderer.clearDepth();

  // render diamond with refraction material to screen
  this.mesh.material = this.refractionMaterial;
  this.renderer.render( this.scene, this.camera );
};
```
Now we sample the back face normal texture in our refraction material.

```glsl
vec3 backfaceNormal = texture2D(backfaceMap, uv).rgb;
```
Finally, we combine the front and back normals.
```glsl
float a = 0.33;
vec3 normal = worldNormal * (1.0 - a) - backfaceNormal * a;
```

In this equation, a simply determines how much of the back face normal should be applied.
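The blend above can be sketched outside the shader as well (my own JavaScript illustration): with a = 0.33 the front normal dominates, while the back face normal, which points the opposite way, nudges the refraction direction:

```javascript
// Blend the front face normal with the back face normal, as in the
// shader: normal = front * (1 - a) - back * a. The minus sign flips
// the back face normal so it contributes in a consistent direction.
function blendNormals(front, back, a) {
  return front.map((v, i) => v * (1.0 - a) - back[i] * a);
}

// Front face pointing toward the camera, back face pointing away:
// the flipped back normal reinforces the front one (result ~[0, 0, 1]).
console.log(blendNormals([0, 0, 1], [0, 0, -1], 0.33));
```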
We did it! We can see all sides of the diamond, all thanks to the refraction and reflection happening in its material.
Limitations
As explained earlier, it's not possible to render physically correct transparent materials in real time with this method. Another problem occurs when rendering multiple glass objects in front of each other: since we only sample the environment once, we cannot see through a series of objects. And lastly, the screen-space refraction demonstrated here doesn't work very well near the edges of the canvas, because rays may refract to positions outside of its bounds, and we didn't capture that data when rendering the background scene to the render target.
Of course, there are ways to overcome these limitations, but not all of them may be feasible solutions for real-time rendering in WebGL.
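One cheap mitigation for the canvas edge problem, sketched here as my own suggestion rather than part of the original tutorial, is to clamp the refracted screen coordinates before sampling. This avoids reading undefined data outside the background texture, at the cost of smearing near the borders:

```javascript
// Clamp refracted screen-space UVs into [0, 1] before sampling the
// background texture, mirroring what clamp(uv, 0.0, 1.0) would do in
// the fragment shader. This trades accuracy at the canvas edges for
// the absence of undefined samples.
function clampUV(uv) {
  return uv.map((v) => Math.min(Math.max(v, 0.0), 1.0));
}

console.log(clampUV([1.2, -0.1])); // pushed back inside: [1, 0]
console.log(clampUV([0.5, 0.5]));  // already valid: unchanged
```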