ref
https://threejs-journey.com/
This note is for personal review; please support the original course made by Bruno Simon, who is definitely a brilliant tutor in Three.js and WebGL.
Part 1 Environment map
Loading the textures
There are multiple environment map textures located in the /static/environmentMaps/ folder. The 0/, 1/ and 2/ folders contain environment maps taken from the HDRI section of https://polyhaven.com, and they have been converted to cube textures using HDRI to CubeMap. We are going to use the first one, in the 0/ folder.
Because these textures are composed of 6 images (like the faces of a cube), we have to use a CubeTextureLoader.
Now we can load the textures. The order is positive x, negative x, positive y, negative y, positive z, and negative z.
Add this code after creating the scene:
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js'

/**
 * Loaders
 */
const gltfLoader = new GLTFLoader()
const cubeTextureLoader = new THREE.CubeTextureLoader() // needed to load the 6 faces

/**
 * Models
 */
gltfLoader.load(
    '/models/FlightHelmet/glTF/FlightHelmet.gltf',
    (gltf) =>
    {
        scene.add(gltf.scene)
    }
)

/**
 * Environment map
 */
// LDR cube texture
const environmentMap = cubeTextureLoader.load([
    '/environmentMaps/0/px.png',
    '/environmentMaps/0/nx.png',
    '/environmentMaps/0/py.png',
    '/environmentMaps/0/ny.png',
    '/environmentMaps/0/pz.png',
    '/environmentMaps/0/nz.png'
])

scene.background = environmentMap
Loading and using the HDRI
import { RGBELoader } from 'three/examples/jsm/loaders/RGBELoader.js'
/*** Loaders*/
// ...
const rgbeLoader = new RGBELoader()
rgbeLoader.load('/environmentMaps/0/2k.hdr', (environmentMap) =>
{
    environmentMap.mapping = THREE.EquirectangularReflectionMapping

    scene.background = environmentMap
    scene.environment = environmentMap
})
AI generated environment map using BlockadeLabs
Let’s continue our AI journey to generate environment maps with Skybox Lab by BlockadeLabs.
I have to warn you, what you see on your screen might be different from what I came up with when I was recording this lesson. The tool is changing very fast and features are being added regularly.
Open Skybox Lab. Since it’s a website, this one should work on every device.
Out of the box, you already get a nice-looking environment map. You can drag to move the camera and use the mouse wheel to zoom in and out.
Implementing
As you can see, generated environment maps are equirectangular LDR images. I’m hoping to see HDR in the future, but LDR will do the trick for now.
Comment out the HDR (EXR) equirectangular loading:
// // HDR (EXR) equirectangular
// exrLoader.load('/environmentMaps/nvidiaCanvas-4k.exr', (environmentMap) =>
// {
//     environmentMap.mapping = THREE.EquirectangularReflectionMapping
//     scene.background = environmentMap
// scene.environment = environmentMap
// })
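We can then load the generated LDR equirectangular image like a regular texture and use it for both the background and the lighting. A minimal sketch, assuming a recent Three.js version (colorSpace API); the file name is hypothetical, use your own export:

// LDR equirectangular
const textureLoader = new THREE.TextureLoader()
const environmentMap = textureLoader.load('/environmentMaps/blockadesLabsSkybox.jpg') // hypothetical path
environmentMap.mapping = THREE.EquirectangularReflectionMapping
environmentMap.colorSpace = THREE.SRGBColorSpace // LDR images are usually sRGB-encoded

scene.background = environmentMap
scene.environment = environmentMap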
Part 2 Shaders
What is a shader
A shader is a program written in GLSL that is sent to the GPU.
Shaders are used to position each vertex of a geometry and to colorize each visible pixel of that geometry.
The term "pixel" isn't accurate because each point in the render doesn't necessarily match a pixel of the screen; this is why we prefer the term "fragment", so don't be surprised if you see both terms.
Then we send a lot of data to the shader such as the vertices coordinates, the mesh transformation, information about the camera and its field of view, parameters like the color, the textures, the lights, the fog, etc. The GPU then processes all of this data following the shader instructions, and our geometry appears in the render.
There are two types of shaders, and we need both of them.
Vertex shader
The vertex shader's purpose is to position the vertices of the geometry.
The idea is to send the vertices' positions, the mesh transformations (like its position, rotation, and scale), and the camera information (like its position, rotation, and field of view).
Then, the GPU will follow the instructions in the vertex shader to process all of this information in order to project the vertices onto a 2D space that will become our render, in other words, our canvas.
When using a vertex shader, its code will be applied to every vertex of the geometry. Some data, like the vertex position, will change between vertices. This type of data, the one that changes between vertices, is called an attribute. Other data doesn't need to change between vertices, like the position of the mesh.
Yes, the location of the mesh will impact all the vertices, but in the same way.
This type of data, the one that doesn't change between vertices, is called a uniform.
We will get back to attributes and uniforms later.
The vertex shader happens first. Once the vertices are placed, the GPU knows what pixels of the geometry are visible and can proceed to the fragment shader.
Fragment shader
The fragment shader's purpose is to color each visible fragment of the geometry.
The same fragment shader will be used for every visible fragment of the geometry. We can send data to it, like a color, by using uniforms (just like with the vertex shader), or we can send data from the vertex shader to the fragment shader.
We call this type of data, the one that goes from the vertex shader to the fragment shader, a varying.
- The most straightforward instruction in a fragment shader is to color all the fragments with the same color. We get the equivalent of the MeshBasicMaterial, if we had set only the color property.
- Or we can send more data to the shader, for instance a light position. We can then color the fragments according to how much the face is facing the light source. We would get the MeshPhongMaterial equivalent, if we had one light in the scene.
Summary
- The vertex shader positions the vertices on the render.
- The fragment shader colors each visible fragment (or pixel) of that geometry.
- The fragment shader is executed after the vertex shader.
- Data that changes between vertices (like their position) is called an attribute and can only be used in the vertex shader.
- Data that doesn't change between vertices (like the mesh position or a color) is called a uniform and can be used in both the vertex shader and the fragment shader.
- We can send data from the vertex shader to the fragment shader using a varying.
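To make these terms concrete, here is a minimal, simplified sketch (not the exact shaders we will write below; the projection and view matrices are omitted on purpose):

// Vertex shader (simplified sketch)
attribute vec3 position; // attribute: changes for each vertex

uniform mat4 modelMatrix; // uniform: the same for every vertex

varying float vHeight; // varying: sent to the fragment shader

void main()
{
    vHeight = position.y; // forward per-vertex data to the fragment shader
    gl_Position = modelMatrix * vec4(position, 1.0);
}

// Fragment shader (simplified sketch)
precision mediump float;

varying float vHeight; // interpolated between the vertices

void main()
{
    gl_FragColor = vec4(vec3(vHeight), 1.0); // grayscale based on the height
}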
Why write our own shaders
Three.js materials try to cover as many situations as possible, but they have limitations. If we want to break those limits, we have to write our own shaders.
It can also be for performance reasons. Materials like MeshStandardMaterial are very elaborate and involve a lot of code and calculations. If we write our own shader, we can keep the features and calculations to the minimum. We have more control over the performance.
Writing our own shader is also an excellent way to add post-processing to our render, but we will see this in a dedicated lesson.
Once you master the shaders, they'll become a must in all your projects.
Create our first shaders with RawShaderMaterial
To create our first shader, we need to create a particular material. This material can be a ShaderMaterial or a RawShaderMaterial. The difference between these two is that the ShaderMaterial will have some code automatically added to the shader codes while the RawShaderMaterial, as its name suggests, will have nothing.
We will start with the RawShaderMaterial to better understand what's happening.
As we said earlier, we need to provide both the vertex and the fragment shader. You can do this with the vertexShader and fragmentShader properties:
const material = new THREE.RawShaderMaterial({
    vertexShader: '',
    fragmentShader: ''
})
The problem with that technique is that single quotes can contain only one line (double quotes too). Our shaders, as simple as they are at the start, will be too long to be written on one line.
A reliable solution is to use backquotes, also called backticks, acute accents or left quotes. Most modern browsers support them. This technique is called template literals, and we can use line breaks in it.
The key or shortcut to write this character depends on your keyboard. Here's a thread on the subject to help you: https://superuser.com/questions/254076/how-do-i-type-the-tick-and-backtick-characters-on-windows/879277
Once you've found the key, replace your single quotes with backquotes:
const material = new THREE.RawShaderMaterial({
    vertexShader: `
        uniform mat4 projectionMatrix;
        uniform mat4 viewMatrix;
        uniform mat4 modelMatrix;

        attribute vec3 position;

        void main()
        {
            gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
        }
    `,
    fragmentShader: `
        precision mediump float;

        void main()
        {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
        }
    `
})
You should get a red plane. Congratulations, you might not yet understand what's written here, but it's your first shader and a good start for an incredible journey.
Separate the shaders in different files
The backquotes are an excellent solution for small amounts of code, and we will use them in future lessons with shaders, but we are missing syntax coloration. Once we have multiple shaders with a lot of code in them, our script will become unbearable. Having a good and comfortable setup is essential.
Shader files
We are going to move the code into separate files. First, move the vertex shader code and the fragment shader code respectively into /src/shaders/test/vertex.glsl and /src/shaders/test/fragment.glsl.
Even if we will have only one shader in our project, it's a healthy habit to separate and organize our code as best as possible. Substantial projects can have dozens of custom shaders.
Unless your code editor already supports GLSL, the syntax coloration probably doesn't work for those two new files. To add syntax coloration, if you are using VSCode, go to your plugins, search for shader, and install the Shader languages support for VS Code plugin. If you are using another code editor, look for compatible plugins and keep an eye on the popularity and the reviews.
Once installed, you should have nice syntax coloration on the .glsl files. If not, try restarting your code editor.
Syntax coloration is cool, but having a linter is even better. A linter will validate your code and find potential errors while you are writing it. It can be really useful to avoid basic mistakes without having to test the result in the browser.
We won't use one in the following lessons because installing it can be hard, but if you want to give it a try, I recommend watching this video from the Lewis Lepton YouTube channel: https://www.youtube.com/watch?v=NQ-g6v7GtoI
Note that the linter will also produce errors on incomplete shaders, which is a problem because we are going to write partial shaders a little later in the lesson. It's up to you, but you can give it a try.
Import
The good news is that there are already solutions to add support for shaders in Vite:
- vite-plugin-glsl
- vite-plugin-glslify
This gets handy in 3 situations (see the sketch after this list):
- When we have huge shaders and want to split them into smaller files
- When we have the same chunk of code in multiple shaders and want to be able to change it from one file
- When we want to use external shader chunks made by other developers
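For instance, vite-plugin-glsl resolves #include directives. A minimal sketch, with a hypothetical wave.glsl chunk file:

// ./src/shaders/includes/wave.glsl (hypothetical chunk)
float wave(float x, float frequency)
{
    return sin(x * frequency) * 0.1;
}

// ./src/shaders/test/vertex.glsl
#include ../includes/wave.glsl

void main()
{
    // ... wave(...) can now be called as if it was written in this file
}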
Both vite-plugin-glsl and vite-plugin-glslify can do that, but with different syntaxes. GLSLIFY is kind of the standard these days, but Bruno Simon finds vite-plugin-glsl easier to use, which is why we are going for it over vite-plugin-glslify. Also note that the plugin is well maintained.
Still, if at some point you need to use vite-plugin-glslify, the process is exactly the same and you should be able to do it on your own.
To install vite-plugin-glsl, close the server in the terminal and run: npm install vite-plugin-glsl
Then go to the vite.config.js file and, at the top, import glsl from vite-plugin-glsl, then add it to the plugins array. Back in the script, import the two shader files.
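Here is a minimal sketch of both changes, assuming the default Vite configuration of the course (your vite.config.js may contain more options):

// vite.config.js
import glsl from 'vite-plugin-glsl'

export default {
    // ... the rest of the existing configuration
    plugins:
    [
        glsl()
    ]
}

// script.js
import testVertexShader from './shaders/test/vertex.glsl'
import testFragmentShader from './shaders/test/fragment.glsl'

If you log testVertexShader and testFragmentShader, you'll get the shader codes as plain strings. We can now use them in the material: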
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader
})
Properties
Most of the common properties we've covered with other materials are still available for the RawShaderMaterial, such as:
- wireframe
- side
- transparent
- flatShading
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    wireframe: true
})
But properties like map, alphaMap, opacity, color, etc. won't work anymore because we need to write these features in the shaders ourselves.
GLSL
The language used to code the shaders is called GLSL and stands for OpenGL Shading Language. It's close to the C language. Let's learn the basics of its syntax.
Logging
There is no console and, thus, no way to log values. That is due to the code being executed for every vertex and every fragment. It would make no sense to log one value.
Indentation
The indentation is not essential. You can indent as you like.
Semicolon
The semicolon is required to end any instruction. Forgetting even one semicolon will probably result in a compilation error, and the whole material won't work.
Variables
It's a typed language, meaning that we must specify a variable's type, and we cannot assign any other type to that variable.
// float (we must provide decimals)
float fooBar = 0.123;
float foo = - 0.123;
float bar = 1.0;

// operations between floats
float a = 1.0;
float b = 2.0;
float c = a / b;

// int
int a = 1;
int b = 2;
int c = a * b;

// we can't mix float and int directly, but we can convert on the fly
float a = 1.0;
int b = 2;
float c = a * float(b);

// bool
bool foo = true;
bool bar = false;

// to store values like 2D coordinates with x and y properties, we can use a vec2
vec2 foo = vec2(1.0, 2.0);

// an empty vec2() will result in an error
// we can change these properties after creating the vec2
vec2 foo = vec2(0.0);
foo.x = 1.0;
foo.y = 2.0;

// operations on a vec2 apply to both properties
vec2 foo = vec2(1.0, 2.0);
foo *= 2.0;

// vec3 is just like vec2, but with a third property named z
// it's very convenient when one needs 3D coordinates
vec3 foo = vec3(0.0);
vec3 bar = vec3(1.0, 2.0, 3.0);
bar.z = 4.0;

// while we can use x, y and z, we can also work with r, g and b
// this is just syntax sugar and the result is exactly the same
// it's very effective when we use a vec3 to store colors
vec3 purpleColor = vec3(0.0);
purpleColor.r = 0.5;
purpleColor.b = 1.0;

// a vec3 can be created from a vec2
vec2 foo = vec2(1.0, 2.0);
vec3 bar = vec3(foo, 3.0);

// we can also take a part of a vec3 to create a vec2
vec3 foo = vec3(1.0, 2.0, 3.0);
vec2 bar = foo.xy;

// this is called a swizzle and we can also use the properties in a different order
vec3 foo = vec3(1.0, 2.0, 3.0);
vec2 bar = foo.yx;

// finally, vec4 works like its two predecessors, but with a fourth value named w or a
// (w because there is no letter after z in the alphabet, a for "alpha")
vec4 foo = vec4(1.0, 2.0, 3.0, 4.0);
vec4 bar = vec4(foo.zw, vec2(5.0, 6.0));
Functions
Just like in most programming languages, we can create and use functions.
float add(float a, float b)
{
    return a + b;
}
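GLSL also provides many built-in functions that we will use all the time. A small, non-exhaustive sample (standard GLSL, nothing specific to Three.js):

float a = sin(1.0);              // sine
float b = pow(2.0, 3.0);         // 2 to the power of 3
float c = min(1.0, 2.0);         // smallest of the two values
float d = max(1.0, 2.0);         // biggest of the two values
float e = clamp(1.5, 0.0, 1.0);  // constrains the value between two limits
float f = mod(3.5, 1.0);         // modulo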
Shaderific documentation
Shaderific for OpenGL
Shaderific is an iOS application that lets you play with GLSL. The application is not something to care about, but the documentation isn't too bad.
Khronos Group OpenGL reference pages
OpenGL 4.x Reference Pages
This documentation deals with OpenGL, but most of the standard functions you'll see will be compatible with WebGL. Let's not forget that WebGL is just a JavaScript API to access OpenGL.
Book of shaders documentation
The Book of Shaders
The Book of Shaders mainly focuses on fragment shaders and has nothing to do with Three.js, but it is a great resource to learn from, and it has its own glossary.
Understanding the vertex shader
Keep in mind that the vertex shader's purpose is to position each vertex of the geometry in the render's 2D space. In other words, the vertex shader will convert the 3D vertices' coordinates to our 2D canvas coordinates.
Main function
void main()
{
}
This function will be called automatically.
gl_Position
This variable will contain the position of the vertex on the screen. The goal of the instructions in the main function is to set this variable properly.
void main()
{
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
    gl_Position.x += 0.5;
    gl_Position.y += 0.5;
}
The plane should move toward the top right corner. But be careful; we didn't truly move the plane in 3D space as if we were playing with its position in Three.js. We moved the projected plane in a 2D space.
Think of it like a drawing you did on a paper. In this drawing, you have respected the perspective with vanishing points. Then, you move the whole picture to the top right corner of your desk. The perspective didn't change inside the drawing.
You're probably wondering why we need 4 values for gl_Position if its final goal is to position vertices in a 2D space. It's actually because the coordinates are not precisely in 2D space; they are in what we call clip space, which needs 4 dimensions.
A clip space is a space that goes in all 3 directions (x, y, and z) in a range from -1 to +1. It's like positioning everything in a 3D box. Anything out of this range will be "clipped" and disappear. The fourth value (w) is responsible for the perspective.
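We don't handle this step ourselves, but as a rough intuition, the GPU converts clip space coordinates to normalized device coordinates by dividing by w (the "perspective divide"). A small worked illustration:

// done automatically by the GPU after the vertex shader, roughly:
// ndc = gl_Position.xyz / gl_Position.w

// e.g. a vertex at clip coordinates (2.0, 0.0, 0.0, 2.0)
// ends up at (1.0, 0.0, 0.0), the right edge of the render;
// the bigger w is, the more the vertex is squeezed toward the center,
// which is what creates the perspective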
Position attributes
Attributes are the only type of variable that changes between vertices.
The same vertex shader will be applied to each vertex, and the position attribute will contain the x, y, and z coordinates of that specific vertex.
gl_Position = /* ... */ vec4(position, 1.0);
Matrices uniforms
There are 3 matrices in our code, and because their values are the same for all the vertices of the geometry, we retrieve them by using uniforms.
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
Each matrix will do a part of the transformation:
- The modelMatrix will apply all transformations relative to the Mesh. If we scale, rotate or move the Mesh, these transformations will be contained in the modelMatrix and applied to the position.
- The viewMatrix will apply transformations relative to the camera. If we rotate the camera to the left, the vertices should be on the right. If we move the camera in the direction of the Mesh, the vertices should get bigger, etc.
- The projectionMatrix will finally transform our coordinates into the final clip space coordinates.
If you want to find out more about those matrices and coordinates, here's a good article: LearnOpenGL - Coordinate Systems.
To apply a matrix, we multiply it. If we want to apply a mat4 to a variable, this variable has to be a vec4. We can also multiply matrices with other matrices:
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;

attribute vec3 position;

void main()
{
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    vec4 viewPosition = viewMatrix * modelPosition;
    vec4 projectedPosition = projectionMatrix * viewPosition;

    gl_Position = projectedPosition;

    // same result in one line:
    // gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
}
Or we can do cooler things, such as making our plane wave:
void main()
{
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    modelPosition.z += sin(modelPosition.x * 10.0) * 0.1;

    // ...
}
We changed the z coordinate by using the x coordinate through a sin(...) function. Good luck getting this result with Three.js built-in materials.
Understanding the fragment shader
Main function
void main()
{
}
Precision
We also have an instruction at the top of the fragment shader code:
precision mediump float;
This instruction lets us decide how precise a float can be. There are different possible values:
- highp
- mediump
- lowp

highp can have a performance hit and might not even work on some devices. lowp can create bugs due to the lack of precision. We ordinarily use mediump.
We could also have set the precision for the vertex shader, but it's not required.
This part is automatically handled when we are using ShaderMaterial instead of RawShaderMaterial.
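As a quick sketch of what that means (we keep RawShaderMaterial in this lesson): with ShaderMaterial, the declarations of projectionMatrix, viewMatrix, modelMatrix, the position and uv attributes, and the precision instruction are prepended automatically and can be removed from the .glsl files:

// a sketch: same shader files, but with the automatic declarations removed
const material = new THREE.ShaderMaterial({
    vertexShader: testVertexShader,    // no more uniform/attribute declarations needed
    fragmentShader: testFragmentShader // no more precision instruction needed
})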
gl_FragColor
It's a vec4 with the first three values being the red, green, and blue channels (r, g, b) and the fourth value being the alpha (a):
gl_FragColor = vec4(0.5, 0.0, 1.0, 1.0);
If we want to set an alpha below 1.0, we also need to set the transparent property to true in the RawShaderMaterial:
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    transparent: true
})
Attributes
Attributes are values that change between each vertex. We already have one attribute named position that contains a vec3 of the coordinates of each vertex.
Let's get back to the JavaScript and create a Float32Array of the right size right after creating the geometry. To know how many vertices we have in the geometry, we can use the already existing position attribute:
const count = geometry.attributes.position.count
const randoms = new Float32Array(count)

for(let i = 0; i < count; i++)
{
    randoms[i] = Math.random()
}

geometry.setAttribute('aRandom', new THREE.BufferAttribute(randoms, 1))
The first parameter of setAttribute(...) is the name of the attribute. That is the name we will use in the shader. We can choose any name, but it's good practice to prefix it with a for "attribute".
The first parameter of BufferAttribute is the data array and the second parameter is how many values compose one attribute. If we were to send a position, we would use 3 because positions are composed of 3 values (x, y and z). But here, it's just 1 random value per vertex so we use 1.
// ...
attribute float aRandom;

void main()
{
    // ...
    modelPosition.z += aRandom * 0.1;

    // ...
}
Now you get a plane composed of random spikes.
Varyings
We now want to color the fragments also with the attribute aRandom.
Unfortunately, we cannot use attributes directly in the fragment shader.
Fortunately, there is a way of sending data from the vertex shader to the fragment shader called varyings.
In the vertex shader, we need to declare the varying before the main function. We will call our varying vRandom:
varying float vRandom;

void main()
{
    // ...

    vRandom = aRandom;
}
Finally, we get the varying value in the fragment shader with the same declaration, and we use it as we want in the main function:
precision mediump float;

varying float vRandom;

void main()
{
    gl_FragColor = vec4(0.5, vRandom, 1.0, 1.0);
}
Uniforms
Uniforms are a way to send data from the JavaScript to the shader.
That can be valuable if we want to use the same shader but with different parameters, and it's also the occasion to have parameters that can change during the experience.
We can use uniforms with both vertex and fragment shaders, and the data will be the same for every vertex and every fragment. We already have uniforms in our code with projectionMatrix, viewMatrix and modelMatrix, but we didn't create these; Three.js did.
To add uniforms to our material, use the uniforms property. We are going to make our plane wave, and we want to control the wave frequency:
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    uniforms:
    {
        frequency: { value: 10 }
    }
})
Here, the name of the uniform we chose is frequency. While it's not mandatory, it's considered good practice to prefix uniform names with the letter u to distinguish "uniforms" from other data.
Change the name of the uniform to uFrequency:
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    uniforms:
    {
        uFrequency: { value: 10 }
    }
})
We can now retrieve the value in our shader code and use it in our main function:
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
uniform float uFrequency;

attribute vec3 position;

void main()
{
    // ...
    modelPosition.z += sin(modelPosition.x * uFrequency) * 0.1;

    // ...
}
The result is the same, but we can now control the frequency from the JavaScript.
Let's change our frequency to a vec2 to control waves horizontally and vertically. On the JavaScript side, we simply use a Three.js Vector2:
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    uniforms:
    {
        uFrequency: { value: new THREE.Vector2(10, 5) }
    }
})
In our shader, we change the float to a vec2, and we apply the displacement on the z axis by using the y axis too:
// ...
uniform vec2 uFrequency;

// ...

void main()
{
    // ...
    modelPosition.z += sin(modelPosition.x * uFrequency.x) * 0.1;
    modelPosition.z += sin(modelPosition.y * uFrequency.y) * 0.1;

    // ...
}
Take your time on this one. It's easy to make mistakes.
Because those values are now controlled in the JavaScript, we can add them to our lil-gui:
gui.add(material.uniforms.uFrequency.value, 'x').min(0).max(20).step(0.01).name('frequencyX')
gui.add(material.uniforms.uFrequency.value, 'y').min(0).max(20).step(0.01).name('frequencyY')
Let's add a new uniform to animate our plane like a flag in the wind. We send a time value to the shader by using a uniform, and we use this value inside the sin(...) functions. First, update the material to add the uniform:
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    uniforms:
    {
        uFrequency: { value: new THREE.Vector2(10, 5) },
        uTime: { value: 0 }
    }
})
Then, update this uniform in the tick function. To do so, use the getElapsedTime() function from the Clock to know how much time has passed:
const tick = () =>
{
    const elapsedTime = clock.getElapsedTime()

    // Update material
    material.uniforms.uTime.value = elapsedTime

    // ...
}
Finally, we get the uniform value in our vertex shader, and we use it in the two sin(...) functions:
// ...
uniform float uTime;

// ...

void main()
{
    // ...
    modelPosition.z += sin(modelPosition.x * uFrequency.x + uTime) * 0.1;
    modelPosition.z += sin(modelPosition.y * uFrequency.y + uTime) * 0.1;

    // ...
}
Uniforms are also available in the fragment shader. Let's add a new uniform to control the color.
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    uniforms:
    {
        uFrequency: { value: new THREE.Vector2(10, 5) },
        uTime: { value: 0 },
        uColor: { value: new THREE.Color('orange') }
    }
})
Then, in our fragment shader, we retrieve the value, and we use it inside our gl_FragColor:
precision mediump float;

uniform vec3 uColor;

void main()
{
    gl_FragColor = vec4(uColor, 1.0);
}
Textures
First, load the texture like we did in previous lessons (make sure a THREE.TextureLoader is instantiated in the Loaders section):

const textureLoader = new THREE.TextureLoader()
const flagTexture = textureLoader.load('/textures/flag-french.jpg')
Then we can send the texture as a uniform.
const material = new THREE.RawShaderMaterial({
    // ...
    uniforms:
    {
        // ...
        uTexture: { value: flagTexture }
    }
})
While it's tempting to send it to the fragment shader immediately, we have a problem.
To take fragment colors from a texture and apply them in the fragment shader, we must use the texture2D(...) function. The first parameter of texture2D(...) is the texture (easy, it's our uTexture), but the second parameter consists of the coordinates of where to pick the color on that texture, and we don't have these coordinates yet.
That information should sound familiar. We are looking for coordinates that should help us project a texture on a geometry. We are talking about UV coordinates.
The PlaneGeometry automatically generates these coordinates, and we can see that if we log geometry.attributes.uv:
console.log(geometry.attributes.uv)
Because it's an attribute, we can retrieve it in the vertex shader:
attribute vec2 uv;
To send data from the vertex shader to the fragment shader, we need to create a varying. We are going to call that varying vUv and update its value in the main function:
// ...
attribute vec2 uv;

varying vec2 vUv;

void main()
{
    // ...

    vUv = uv;
}
We can now retrieve the varying vUv in the fragment shader, retrieve the uniform uTexture, and eventually get the fragment color with texture2D(...):
precision mediump float;

uniform vec3 uColor;
uniform sampler2D uTexture;

varying vec2 vUv;

void main()
{
    vec4 textureColor = texture2D(uTexture, vUv);
    gl_FragColor = textureColor;
}
Color variations
First, in the vertex shader, we are going to store the wind elevation in a variable:
void main()
{
    // ...

    float elevation = sin(modelPosition.x * uFrequency.x - uTime) * 0.1;
    elevation += sin(modelPosition.y * uFrequency.y - uTime) * 0.1;

    modelPosition.z += elevation;

    // ...
}
Then, we send the elevation to the fragment shader by using a varying:
// ...
varying float vElevation;

void main()
{
    // ...

    vElevation = elevation;
}
Finally, we retrieve the varying vElevation in our fragment shader, and use it to change the r, g, and b properties of our textureColor:
// ...
varying float vElevation;

void main()
{
    vec4 textureColor = texture2D(uTexture, vUv);
    textureColor.rgb *= vElevation * 2.0 + 0.5;

    gl_FragColor = textureColor;
}
Go further
Once we understand the basics of the shaders, it's all about practice. Your first shaders will take countless hours, but you'll learn techniques that you'll frequently use.
In the next lessons, we will practice those techniques and even learn how to draw shapes in shaders, but if you want to go further, here are some links:
- The Book of Shaders: https://thebookofshaders.com/
- ShaderToy: https://www.shadertoy.com/
- The Art of Code YouTube channel: https://www.youtube.com/channel/UCcAlTqd9zID6aNX3TzwxJXg
- Lewis Lepton YouTube channel: https://www.youtube.com/channel/UC8Wzk_R1GoPkPqLo-obU_kQ