🚂

Three.js

 
Showcases:
Here's a preview of what you can achieve with Three.js:
 
Runs in web browsers, Node.js, Electron.js, or React Native.
 
WebGL is a JavaScript API, or programmable interface, for drawing interactive 2D and 3D graphics in web pages. WebGL connects your web browser to your device’s graphics card, providing you with far more graphics processing power than is available on a traditional web page.
 
 
 
JavaScript
import {
  BoxBufferGeometry,
  Color,
  Mesh,
  MeshBasicMaterial,
  PerspectiveCamera,
  Scene,
  WebGLRenderer,
} from 'three';

// Get a reference to the container element that will hold our scene
const container = document.querySelector('#scene-container');

// create a Scene
const scene = new Scene();

// Set the background color
scene.background = new Color('skyblue');

/*******
 ******* Create a camera
 ****** a "projection" converts the 3D scene into a flat 2D image
 ****** PerspectiveCamera is the 3D equivalent of a camera in the real world
 ****** the other option is OrthographicCamera
*******/

/* Viewing frustum
* fov, or field of view: how wide the camera’s view is, in degrees.
* aspect, or aspect ratio: the ratio of the scene’s width to its height.
* near, or near clipping plane: anything closer to the camera than this will be invisible.
* far, or far clipping plane: anything further away from the camera than this will be invisible.
*/
const fov = 35; // AKA Field of View
const aspect = container.clientWidth / container.clientHeight;
const near = 0.1; // the near clipping plane
const far = 100; // the far clipping plane

const camera = new PerspectiveCamera(fov, aspect, near, far);

// every object is initially created at ( 0, 0, 0 )
// move the camera back so we can view the scene
// (individual axes can also be set, e.g. camera.position.x = 1)
camera.position.set(0, 0, 10);

// create a geometry
// the geometry defines the shape of the mesh
// width, height, depth
const geometry = new BoxBufferGeometry(2, 2, 2);

// create a default (white) Basic material
// define the surface
const material = new MeshBasicMaterial();

/****
****create a Mesh containing the geometry and material
****/
const cube = new Mesh(geometry, material);

// add the mesh to the scene
scene.add(cube);
// or, scene.remove(cube)


/****
**** create the renderer
**** WebGLRenderer uses WebGL 2, or falls back to WebGL 1
****/
const renderer = new WebGLRenderer();

// next, set the renderer to the same size as our container element
renderer.setSize(container.clientWidth, container.clientHeight);

// finally, set the pixel ratio so that our scene will look good on HiDPI displays
renderer.setPixelRatio(window.devicePixelRatio);

// add the automatically created <canvas> element to the page
container.append(renderer.domElement);

// render, or 'create a still image', of the scene
renderer.render(scene, camera);
 
The four parameters we pass into the PerspectiveCamera constructor each create one aspect of the frustum:
  1. The field of view defines the angle at which the frustum expands. A small field of view will create a narrow frustum, and a wide field of view will create a wide frustum.
  2. The aspect ratio matches the frustum to the scene container element. When we set this to the container’s width divided by its height, we ensure the rectangular base of the frustum can be expanded to fit perfectly into the container. If we get this value wrong, the scene will look stretched and blurred.
  3. The near clipping plane defines the small end of the frustum (the point closest to the camera).
  4. The far clipping plane defines the large end of the frustum (the point furthest from the camera).
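The frustum’s size at any distance follows from simple trigonometry on the field of view. A sketch in plain JavaScript (the function names here are illustrative helpers, not part of the three.js API):

```javascript
// Visible frustum height and width at a given distance from a
// PerspectiveCamera: the frustum expands at half the fov angle
// above and below the camera's view direction.
function visibleHeightAtDistance(fovDegrees, distance) {
  const fovRadians = (fovDegrees * Math.PI) / 180;
  return 2 * Math.tan(fovRadians / 2) * distance;
}

function visibleWidthAtDistance(fovDegrees, aspect, distance) {
  return visibleHeightAtDistance(fovDegrees, distance) * aspect;
}

// With fov = 90, the visible height at distance d is 2 * d:
console.log(visibleHeightAtDistance(90, 10)); // ≈ 20
```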

You (Usually) Need a Light to See

If we used nearly any other material type than MeshBasicMaterial right now, we wouldn’t be able to see anything, since the scene is in total darkness. As in the real world, we usually need light to see things in our scene; MeshBasicMaterial is an exception to that rule.
This is a common point of confusion for newcomers to three.js, so if you can’t see anything, make sure you have added some lights to your scene, or temporarily switch all materials to a MeshBasicMaterial. We’ll add some lights to our scene in 1.4: Physically Based Rendering and Lighting.
 
By hiding the implementation behind a simple interface, you make your app foolproof and simple to use. It does what it’s supposed to do, and nothing else. By hiding the implementation, we are enforcing good coding style on the people using our code. The more of the implementation you make accessible, the more likely it is to be used for complicated, half-baked “fixes” that you have to deal with later.
 

Physically Based Rendering and Lighting

Physically based rendering (PBR) has become the industry-standard method of rendering both real-time and cinematic 3D scenes. As the name suggests, this rendering technique uses real-world physics to calculate the way surfaces react to light, taking the guesswork out of setting up materials and lighting in your scenes.
 
Concept
  • Physically based rendering: calculating, in a physically correct manner, how light interacts with surfaces
  • Physically correct lighting: calculating how light fades with distance from a light source (attenuation) using real-world physics equations.
    • renderer.physicallyCorrectLights = true;
 
Lighting concept:
real:
  1. Direct lighting: light rays that come directly from the bulb and hit an object.
  2. Indirect lighting: light rays that have bounced off the walls and other objects in the room before hitting an object, changing color and losing intensity with each bounce.
three.js:
  1. Direct lights, which simulate direct lighting.
      • DirectionalLight => Sunlight
      • PointLight => Light Bulbs
      • RectAreaLight => Strip lighting or bright windows
      • SpotLight => Spotlights
  2. Ambient lights, which are a cheap and somewhat believable way of faking indirect lighting. (Only an approximation, since computing true indirect lighting in real time is not computationally feasible.)

DirectionalLight

By default, a DirectionalLight is placed at (0, 0, 0) and targets (0, 0, 0).

World Space

Cartesian coordinate systems

 
 
 
  • The positive X-axis points to the right of your screen.
  • The positive Y-axis points up, towards the top of your screen.
  • The positive Z-axis points out of the screen towards you.

Local Space

The top-level scene defines world space, and every other object defines its own local space.
JavaScript
// creating the scene creates the world space coordinate system
const scene = new Scene();

// mesh A has its own local coordinate system
const meshA = new Mesh();

// mesh B also has its own local coordinate system
const meshB = new Mesh();
 
 
To fully describe an object’s position, we need to store three pieces of information:
  1. The object’s position on the X-axis, which we call x.
  2. The object’s position on the Y-axis, which we call y.
  3. The object’s position on the Z-axis, which we call z.
We can write these three positions as an ordered list of numbers: (x, y, z).

Positions are stored in the Vector3 Class

Since .scale and .position are both stored in a Vector3, scaling an object works much the same way as translating it.
JavaScript
// when we create a mesh ...
import { Vector3 } from 'three';

const vector = new Vector3(1, 2, 3);

// ... internally, three.js creates a Vector3 for us:
mesh.position = vector;

vector.x; // 1
vector.y; // 2
vector.z; // 3

vector.x = 5;

vector.x; // 5

vector.set(7, 7, 7);

vector.x; // 7
vector.y; // 7
vector.z; // 7

Group

Groups occupy a position in the scene graph and can have children, but are themselves invisible
 
 
The difference between groups and scene objects
  • groups are purely organizational
  • groups exist purely to help you manipulate other scene objects
  • scene objects, like meshes, lights, cameras, and so on, have some other purpose in addition to occupying a place in the scene graph.
 
JavaScript
import { Group } from 'three';

const group = new Group(); // has .add() and .remove() methods
 

Scale

JavaScript
// when we create a mesh...
const mesh = new Mesh();

// ... internally, three.js creates a Vector3 for us:
mesh.scale = new Vector3(1, 1, 1);

Rotate

Representing Rotations: the Euler class
JavaScript
const mesh = new Mesh();

// ... internally, three.js creates an Euler for us:
mesh.rotation = new Euler();


// rotating changes an object's local axes, so for a predictable
// compound rotation we may need to change the rotation order
mesh.rotation.reorder('YXZ');

// or use a quaternion, which doesn't have the order problem
mesh.quaternion

 
Rotation order
By default, three.js will perform rotations around the X-axis, then around the Y-axis, and finally around the Z-axis, in an object’s local space. We can change this using the Euler.order property. The default order is called ‘XYZ’, but ‘YZX’, ‘ZXY’, ‘XZY’, ‘YXZ’ and ‘ZYX’ are also possible.
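The order matters because rotations do not commute. A plain-JavaScript demonstration (no three.js; rotX and rotY are hand-written helpers applying the standard axis-rotation formulas):

```javascript
// Rotate a point [x, y, z] about the X- or Y-axis by angle a (radians).
const rotX = ([x, y, z], a) =>
  [x, y * Math.cos(a) - z * Math.sin(a), y * Math.sin(a) + z * Math.cos(a)];
const rotY = ([x, y, z], a) =>
  [x * Math.cos(a) + z * Math.sin(a), y, -x * Math.sin(a) + z * Math.cos(a)];

const halfPi = Math.PI / 2;
const p = [0, 0, 1];

// 90° about X, then 90° about Y:
const xThenY = rotY(rotX(p, halfPi), halfPi); // ≈ [0, -1, 0]

// same angles, opposite order, different result:
const yThenX = rotX(rotY(p, halfPi), halfPi); // ≈ [1, 0, 0]
```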
 
Camera.lookAt will automatically rotate the camera so that it faces the provided target.

The Unit of Rotation is Radians

JavaScript
import { MathUtils } from 'three';

const rads = MathUtils.degToRad(90); // 1.57079... = π/2
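Under the hood this is just the standard degrees-to-radians conversion; a plain-JavaScript equivalent:

```javascript
// degrees -> radians: one full turn is 360° or 2π radians,
// so multiply by π/180.
const degToRad = (degrees) => degrees * (Math.PI / 180);

console.log(degToRad(90));  // ≈ π/2 (1.5707...)
console.log(degToRad(180)); // ≈ π (3.1415...)
```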

Anti-Aliasing

 
JavaScript
function createRenderer() {
  const renderer = new WebGLRenderer({ antialias: true });

  renderer.physicallyCorrectLights = true;

  return renderer;
}
We want to listen for the resize event, which fires whenever the browser’s window size changes. Rotating a mobile device from landscape to portrait, dragging a window between monitors on a multi-monitor setup, and resizing the browser by dragging the edges of the window with a mouse all cause the resize event to fire, which means the code we add here will handle all of these scenarios.
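A sketch of such a handler, assuming the container, camera, and renderer created in the earlier snippets (the event wiring is shown as a comment since it only applies in the browser):

```javascript
// On resize, keep the camera's frustum and the renderer's drawing
// buffer matched to the container's new shape.
function onResize(container, camera, renderer) {
  camera.aspect = container.clientWidth / container.clientHeight;
  camera.updateProjectionMatrix(); // required after changing camera parameters
  renderer.setSize(container.clientWidth, container.clientHeight);
}

// In the browser, register it once:
// window.addEventListener('resize', () => onResize(container, camera, renderer));
```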
 
A basic game loop might consist of these four tasks:
  1. Get user input
  2. Calculate physics
  3. Update animations
  4. Render a frame
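The four steps above can be sketched as a loop over stub functions (the names are illustrative placeholders, not a three.js API):

```javascript
// Placeholder implementations of the four game-loop tasks.
function getUserInput(state) { /* read input devices */ }
function calculatePhysics(state, delta) { state.position += state.velocity * delta; }
function updateAnimations(state, delta) { /* advance animation mixers */ }
function renderFrame(state) { state.framesRendered += 1; }

// One iteration of the loop:
function tick(state, delta) {
  getUserInput(state);
  calculatePhysics(state, delta);
  updateAnimations(state, delta);
  renderFrame(state);
}

// Simulate 60 frames at a fixed timestep of 1/60 s (≈ 16.7 ms):
const state = { position: 0, velocity: 2, framesRendered: 0 };
for (let i = 0; i < 60; i++) tick(state, 1 / 60);
console.log(state.framesRendered); // 60
```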

Animation

Creating the Loop with .setAnimationLoop
JavaScript
import { WebGLRenderer } from 'three';
const renderer = new WebGLRenderer();
// start the loop
renderer.setAnimationLoop(() => {
  renderer.render(scene, camera);
});

Clock

Clock.getDelta to measure how long the previous frame took.
JavaScript
import { Clock } from 'three';
const clock = new Clock();
const delta = clock.getDelta();
Call it in the tick() method (which will be called in the setAnimationLoop), and we get how long the step between the previous render and the current one took.
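The idea behind Clock.getDelta can be sketched with a simplified re-implementation (illustration only, not the real three.js class; the injectable time source is an assumption added here so the sketch is testable):

```javascript
// A minimal delta-time clock: getDelta returns the elapsed time
// in seconds since the previous call.
class SimpleClock {
  constructor(now = () => Date.now()) {
    this.now = now;         // injectable time source (milliseconds)
    this.last = this.now();
  }
  getDelta() {
    const current = this.now();
    const delta = (current - this.last) / 1000; // ms -> seconds
    this.last = current;
    return delta;
  }
}

// With a fake time source advancing 16 ms per call:
let t = 0;
const clock = new SimpleClock(() => (t += 16));
console.log(clock.getDelta()); // 0.016
```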

Texture

Texture mapping means taking an image and stretching it over the surface of a 3D object.
We refer to an image used in this manner as a texture, and we can use textures to represent material properties like color, roughness, and opacity.
Texture Class:
The Texture class is a wrapper around an HTML image element with some extra settings related to being used as a texture instead of a normal image.
Methods:
  • Projective texture mapping (like a film projector)
    • creating shadows
  • UV mapping
    • the process of assigning 2D points in the texture to 3D points in the geometry.
    • (u,v)⟶(x,y,z)
 
Types of Texture
three.js supports many other types of textures that are not simple 2D images, such as video textures, 3D textures, canvas textures, compressed textures, cube textures, equirectangular textures, and more.
Download
freepbr.com, https://quixel.com/ (choose the Unreal Engine version for use with three.js)
Simple use:
JavaScript
import { MeshStandardMaterial, TextureLoader } from 'three';

function createMaterial() {
  // create a texture loader.
  const textureLoader = new TextureLoader();

  // load a texture
  const texture = textureLoader.load(
    '/assets/textures/uv-test-bw.png',
  );

  // create a "standard" material using
  // the texture we just loaded as a color map
  const material = new MeshStandardMaterial({
    map: texture,
  });

  return material;
}

Lighting Techniques

Multiple Direct Lights - add more direct lights shining from all directions
No Lights at All! — use other mesh materials such as MeshBasicMaterial
Image-Based Lighting (IBL) - involves pre-calculating lighting information and storing it in textures
environment mapping (also known as reflection mapping)
 
Options in threejs
AmbientLight - adds a constant amount of light from every direction to every object in the scene (drawback: doesn’t show the objects’ depth)
JavaScript
const ambientLight = new AmbientLight('white', 2); //color , intensity
HemisphereLight - Light from a HemisphereLight fades between a sky color at the top of the scene and a ground color at the bottom of the scene.
JavaScript
const hemisphereLight = new HemisphereLight(
  'white', // bright sky color
  'darkslategrey', // dim ground color
  5, // intensity
);

Load glTF format

Types:
  • Standard .gltf files are uncompressed and may come with an extra .bin data file.
  • Binary .glb files include all data in one single file.
 
Data Returned
  • gltfData.animations is an array of animation clips. Here, there’s a flying animation. We’ll make use of this in the next chapter.
  • gltfData.assets contains metadata showing this glTF file was created using the Blender exporter.
  • gltfData.cameras is an array of cameras.
  • gltfData.parser contains technical details about the GLTFLoader.
  • gltfData.scene is a Group containing any meshes from the file. This is where we’ll find the parrot model.
  • gltfData.scenes: The glTF format supports storing multiple scenes in a single file. In practice, this feature is rarely used.
  • gltfData.userData may contain additional non-standard data.
There are three elements involved in creating animations: keyframes, KeyframeTrack, and AnimationClip.

Other shapes

SphereBufferGeometry

useful methods

.clone allows you to create an identical copy of an object (only the default properties of the object are cloned).
💡
The clone gets its own transform, so adjusting a mesh’s position, rotation, or scale is not shared. However, the geometry and material are not cloned; they are shared. For example, if we change the material’s color, both the clone and the original will change, since both reference the same new Mesh(geometry, material) inputs. To avoid this, give the clone its own material: clonedMesh.material = new MeshStandardMaterial({ color: 'indigo' });
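The sharing behavior is plain reference semantics, which can be illustrated without three.js (these stand-in objects are hypothetical, not the real Mesh class):

```javascript
// Cloning copies the material *reference*, not the material itself.
const material = { color: 'white' };
const mesh = { position: { x: 0 }, material };

// a simplified stand-in for Mesh.clone:
const clonedMesh = { position: { x: 0 }, material: mesh.material };

clonedMesh.material.color = 'indigo';
console.log(mesh.material.color); // 'indigo' -- the original changed too!

// giving the clone its own material breaks the link:
clonedMesh.material = { color: 'red' };
console.log(mesh.material.color); // still 'indigo'
```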

Plugins

OrbitControls

a camera controls plugin which allows you to orbit, pan, and zoom the camera using touch, mouse, or keyboard.

GLTFLoader

To load glTF files

Playgrounds