Pixi.js Renderer, Ticker & Stage

In the last post I showed you how to get up and running quickly in Pixi.js. In this episode I want to spend some time diving a bit deeper into the basics of Pixi.js. To do this we have to look a bit at the history of Pixi.js to understand where it's been and where it's heading.


Pixi.js Getting Started

To download the latest version of Pixi.js you can check the releases tab of the GitHub repository here.

Because of browser security restrictions, loading local assets directly will result in CORS errors. We'll need to get around this in Pixi.js because you can't do much without loading textures. The easiest way to avoid this is to run your code on a server. If you have Python installed, you can run a local server by opening a terminal, navigating to your project directory, and running python -m SimpleHTTPServer 8000 (or python3 -m http.server 8000 if you're on Python 3). This serves your project at localhost on the port defined: localhost:8000. Loading that page should now run the index.html file in your project directory without any issues.

The quickest way to get started with Pixi.js is with a helper class called PIXI.Application. This class combines and abstracts a bunch of common utilities used when making content with Pixi.js.

const app = new PIXI.Application({
  view: canvas, // an existing <canvas> element
  width: window.innerWidth,
  height: window.innerHeight
});

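Since this post covers the renderer, ticker, and stage, here's a minimal sketch of how PIXI.Application exposes all three. The sprite path is a placeholder for any image you serve locally, and Sprite.from is the newer API (older versions use Sprite.fromImage):

```javascript
// PIXI.Application wires up a renderer, a root container (the stage),
// and a shared ticker that drives the update loop.
const app = new PIXI.Application({
  width: window.innerWidth,
  height: window.innerHeight
});
document.body.appendChild(app.view); // app.view is the <canvas> element

// The stage is the root of the scene graph; anything added to it gets drawn.
const sprite = PIXI.Sprite.from('assets/bunny.png'); // placeholder path
sprite.x = 100;
sprite.y = 100;
app.stage.addChild(sprite);

// The ticker calls this function every frame, passing a frame delta.
app.ticker.add((delta) => {
  sprite.rotation += 0.05 * delta;
});
```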
Pixi.js Introduction

Pixi.js is a 2D graphics rendering framework for the web. It was created by Mat Groves to normalize the canvas and WebGL APIs into a simple scene graph, originally modeled loosely on the Flash API. Its primary focus has always been on performance, so it is really fast. Because of this it is primarily a WebGL rendering surface with a canvas 2D fallback.

To get Pixi.js and read more about it you can visit its website, pixijs.com. There you'll find download links, examples, and other resources for learning, as well as a link to the GitHub repository to keep up to date with its development.

Check out the Bunny Mark performance test to get an idea of how fast Pixi is.

You can find the code used for the video intro below.

See the Pen Pixi.js Intro by CJ Gammon (@cjgammon) on CodePen.


Three.js Models Update

In my post on Loading Models in Three.js I was using version 79. Three.js is now at version 100 and some changes have been made to how models are loaded. In this video I describe how to export glTF models from Blender and import them into Three.js. You can read the documentation on loading models into three.js here.
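As a sketch of the import side, GLTFLoader ships with the three.js examples rather than the core library; in newer builds it lives under examples/jsm. The model path here is a placeholder for whatever you export from Blender:

```javascript
import * as THREE from 'three';
// GLTFLoader is part of the three.js examples, not the core build.
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const loader = new GLTFLoader();

// 'models/ship.glb' is a placeholder path for your exported glTF file.
loader.load(
  'models/ship.glb',
  (gltf) => {
    // gltf.scene is a group containing the imported meshes.
    scene.add(gltf.scene);
  },
  undefined, // progress callback (unused here)
  (error) => {
    console.error('Failed to load model:', error);
  }
);
```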


Three.js Loading Models

In my post on Geometry in Three.js I discussed the different types of primitives we have available. Primitives are a powerful tool but creating 3D objects with only primitives and code can only get you so far. For full control over your 3D models it is best to use a 3D modeling application and then import your models into Three.js.


Three.js Post Processing

Post-Processing is the addition of image effects or filters to your entire scene. This can change the feel of your scene and simulate interesting visual effects. Some examples are applying a sepia tone, or adding static to the scene, giving it the feel of older television sets. To achieve this in Three.js we utilize shaders. The process involves creating an EffectComposer and then chaining together effects by adding passes to it. Passes are how we define the sequence of rendering and effects in the composer. There are different kinds of passes that achieve different results. I should also note that the classes and files used, such as EffectComposer and the built-in passes, are not technically part of Three.js, but can be found in the examples included with the library.
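The composer-and-passes flow can be sketched like this. It assumes a renderer, scene, and camera are already set up as in the earlier posts, and uses the SepiaShader that ships with the three.js examples:

```javascript
import * as THREE from 'three';
// The post-processing classes live in the three.js examples, not the core.
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import { SepiaShader } from 'three/examples/jsm/shaders/SepiaShader.js';

// Assumes renderer, scene, and camera already exist.
const composer = new EffectComposer(renderer);

// The first pass renders the scene normally into the composer's buffer.
composer.addPass(new RenderPass(scene, camera));

// Later passes apply effects to that buffer in order; here, a sepia tone.
composer.addPass(new ShaderPass(SepiaShader));

// In the animation loop, call composer.render() instead of renderer.render().
function animate() {
  requestAnimationFrame(animate);
  composer.render();
}
animate();
```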


Three.js Lights and Cameras

Lights can really make the difference between a seemingly flat scene and a visual masterpiece. Think of any photo-realistic painting or photograph, then imagine it with poor lighting: the impact is just not the same. Cameras change the way we view our scenes altogether; think of the different types of lenses photographers use and how they influence the perspective and depth of a photo. Note that not all materials respond to lights: MeshLambertMaterial, MeshPhongMaterial, and MeshStandardMaterial do.
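A minimal sketch putting both together: a perspective camera acting as the lens, and two lights shading a material that responds to them:

```javascript
import * as THREE from 'three';

const scene = new THREE.Scene();

// A perspective camera mimics a real lens: field of view, aspect ratio,
// and near/far clipping planes.
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1000
);
camera.position.z = 5;

// MeshPhongMaterial responds to lights; MeshBasicMaterial would ignore them.
const sphere = new THREE.Mesh(
  new THREE.SphereGeometry(1, 32, 32),
  new THREE.MeshPhongMaterial({ color: 0x2194ce })
);
scene.add(sphere);

// A dim ambient light fills in shadows; a point light gives the sphere
// its highlight and shading.
scene.add(new THREE.AmbientLight(0x404040));
const point = new THREE.PointLight(0xffffff, 1);
point.position.set(5, 5, 5);
scene.add(point);
```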


Three.js Custom Materials with ShaderMaterial

Three.js comes with many materials built in. All these materials drawn in WebGL utilize shaders. Shaders are small programs that run on the GPU written in GLSL. We can create our own custom materials in Three.js by writing our own shaders and passing them into a ShaderMaterial, which we can then use in our scene.
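A small sketch of the idea: the GLSL source is passed in as strings, and uniforms carry data from JavaScript to the GPU. (position, projectionMatrix, and modelViewMatrix are supplied to the vertex shader by three.js automatically.)

```javascript
import * as THREE from 'three';

// A custom material: the vertex shader positions each vertex,
// the fragment shader colors each pixel.
const material = new THREE.ShaderMaterial({
  uniforms: {
    uColor: { value: new THREE.Color(0xff3366) } // data shared with the GPU
  },
  vertexShader: `
    void main() {
      // projectionMatrix and modelViewMatrix are injected by three.js.
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform vec3 uColor;
    void main() {
      gl_FragColor = vec4(uColor, 1.0); // flat solid color
    }
  `
});

// The custom material is used like any built-in one.
const mesh = new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), material);
```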


Three.js Materials

Materials determine how the surface of our geometry is drawn in Three.js. If the Geometry is our skeleton, defining the shape, then the Material is our skin. There are a variety of different types of materials in Three.js all of which have different properties, like responding to lights, mapping textures, and adjusting opacity.
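To make the skeleton-and-skin idea concrete, here is the same geometry dressed in two different materials, one that ignores lights and one that responds to them:

```javascript
import * as THREE from 'three';

const geometry = new THREE.SphereGeometry(1, 32, 32);

// MeshBasicMaterial ignores lights entirely: flat, unshaded color.
const basic = new THREE.MeshBasicMaterial({ color: 0x00ff00 });

// MeshStandardMaterial responds to lights and supports opacity adjustments.
const standard = new THREE.MeshStandardMaterial({
  color: 0x00ff00,
  roughness: 0.5,    // how matte vs. shiny the surface looks
  metalness: 0.1,
  transparent: true, // required for opacity below 1 to take effect
  opacity: 0.8
});

// Same skeleton, different skin: the geometry is shared, the material differs.
const flatSphere = new THREE.Mesh(geometry, basic);
const shadedSphere = new THREE.Mesh(geometry, standard);
```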


Three.js Geometry

Geometry defines the shape of the objects we draw in Three.js. Geometry is made up of a collection of vertices and, often, faces, each of which combines three vertices into a triangle. You can create your own custom geometry by defining these vertices and faces yourself, but Three.js also has a variety of common shapes built in for you to access and configure.
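Both approaches can be sketched like this. The custom triangle uses the THREE.Geometry and Face3 API from the versions these posts cover (it was later replaced by BufferGeometry):

```javascript
import * as THREE from 'three';

// A built-in shape: three.js generates the vertices and faces for you.
const box = new THREE.BoxGeometry(1, 1, 1);

// Custom geometry: define the vertices yourself...
const triangle = new THREE.Geometry();
triangle.vertices.push(
  new THREE.Vector3( 0,  1, 0),
  new THREE.Vector3(-1, -1, 0),
  new THREE.Vector3( 1, -1, 0)
);

// ...then combine three of them into a face by index.
triangle.faces.push(new THREE.Face3(0, 1, 2));
```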