Posted in November 2016
Codevember 2016
These are the projects created for Codevember 2016, a challenge for developers: one creative sketch a day during the whole month of November.
Initially I wanted to mix very different browser technologies: CSS, SVG, WebGL, and not limit myself to visual effects. In the end I went with WebGL and three.js, creating a basic boilerplate and starting every project from scratch. Of course, as I went along the toolbox grew considerably, with things like ShaderTexture, Odeo and a lot of shaders and helper methods.
I've tried to make all sites work on as many platforms as possible, but the time constraints don't allow for a lot of beta testing and bug hunting. Wobbly Earth, for instance, doesn't work on my Nexus 5, but works everywhere else I've tried. I have reports that Lines on a Sphere doesn't run well on MacBook Airs. If you find any issue and want to help debug, tweet me.
So this is all I've been able to churn out, between work, two kids, broken eyeglasses...
Just don't judge the code too strictly :)
Tip: in all the effects with music (10, 16 and 30) you can open the console and run odeo.playSoundCloud(soundcloud_url) to change the track.
Extra tip: in all the effects with FBOs, shadows, etc. you can probably open the console and run helper.show(true) to see the FBOHelper.
Source code for all experiments (including discarded ones) is here:
I had the "brilliant" idea to do each Codevember day and then post all of them on Twitter at the end of the month, in one single long thread:
Day 1 - Procedurally textured torus
A.k.a. "The decadent golden-plated knot torus"
The original idea was to procedurally create a wood texture; it ended up being the base for building ShaderTexture, a THREE.Texture that helps create procedural textures using fragment shaders.
ShaderTexture is used to create a diffuse texture (using TruchetFlip's implementation of Wang Tiles) and a roughness texture (using the diffuse texture to modulate some noise); the diffuse texture is then used again as a height map and turned into a normal map (using a shader based on NormalMap Online).
The maps are then used as diffuse, roughness, metalness, bump and normal maps of a THREE.MeshStandardMaterial.
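Here's the gist of what ShaderTexture wraps, sketched with a plain render target rather than its actual API (which lives in the repo): draw a full-screen quad with a fragment shader into an FBO and hand the resulting texture to the material.

```js
// The pattern ShaderTexture wraps: render a fragment shader into a
// render target, then use rt.texture on a material. (Plain three.js,
// not the actual ShaderTexture API; 2016-era render() signature.
// `renderer` is your THREE.WebGLRenderer.)
var rt = new THREE.WebGLRenderTarget(1024, 1024);
var rtScene = new THREE.Scene();
var rtCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
rtScene.add(new THREE.Mesh(
  new THREE.PlaneBufferGeometry(2, 2),
  new THREE.ShaderMaterial({
    vertexShader: 'void main() { gl_Position = vec4( position, 1.0 ); }',
    fragmentShader: [
      'void main() {',
      '  // any procedural pattern goes here (Wang tiles, noise...)',
      '  vec2 uv = gl_FragCoord.xy / 1024.0;',
      '  gl_FragColor = vec4( vec3( step( 0.5, fract( uv.x * 8.0 ) ) ), 1.0 );',
      '}'
    ].join('\n')
  })
));
renderer.render(rtScene, rtCamera, rt);

var material = new THREE.MeshStandardMaterial({
  map: rt.texture,          // diffuse
  roughnessMap: rt.texture  // or a second target with its own shader
});
```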
Day 2 - Dotty Earth
All is full of dots
An example of building a sphere with dots, making sure there's a uniform density of dots across the whole surface. In naive distributions, there are way more points towards the poles than necessary.
The algorithm accepts different point sizes and different spacings between points.
All points are GL_POINTS, using the distance to the center and discard to make them look like dots.
There are two buffers, one for vertices and another for colors. Whether there's a dot or not is decided by looking up a mask image loaded in a canvas. The color of the dot is read from a color texture (earth and clouds), each loaded in its own canvas. The position of the dot is displaced in the normal direction based on a lookup of a heightmap texture.
All images are equirectangular panoramas, so they can be mapped onto a sphere. Spherical UV coordinates from the sphere surface need to be converted into 2D lookup coordinates for the equirectangular panoramas.
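The conversion looks roughly like this (illustrative helper; the real code reads the pixels from the canvases):

```js
// Sketch: map a unit direction on the sphere to pixel coordinates in an
// equirectangular panorama of size w x h, for canvas/getImageData lookups.
function equirectLookup(dir, w, h) {
  var u = 0.5 + Math.atan2(dir.z, dir.x) / (2 * Math.PI); // longitude
  var v = 0.5 - Math.asin(dir.y) / Math.PI;               // latitude
  return { x: Math.floor(u * (w - 1)), y: Math.floor(v * (h - 1)) };
}
```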
All geometry is drawn in a single draw call.
This experiment uses DeviceOrientation events if on mobile!
Day 3 - Strange Attractors
Strangely attractive strange attractors
Rendering strange attractors.
Several points are created by evaluating a strange attractor equation.
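For instance, with the classic Lorenz system standing in (the actual experiment supports several attractors, with their parameters stored in the URL):

```js
// Sketch: build a point trail by integrating an attractor with Euler steps.
function lorenzPoints(n) {
  var sigma = 10, rho = 28, beta = 8 / 3, dt = 0.005;
  var points = [];
  var x = 0.1, y = 0, z = 0;
  for (var i = 0; i < n; i++) {
    var dx = sigma * (y - x);
    var dy = x * (rho - z) - y;
    var dz = x * y - beta * z;
    x += dx * dt; y += dy * dt; z += dz * dt;
    points.push(new THREE.Vector3(x, y, z));
  }
  return points;
}
```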
From those points a line geometry is created, then a MeshLine geometry, and particles (billboarded triangles) are spawned along the path. The MeshLine width and the particles' size and spread are based on the distance along the path.
Finally everything is rendered to a framebuffer (3 draw calls, one for each geometry), then some post-processing is applied. First, a slight RGB shift. Then a tilt-shift shader performs a directional blur pass on the vertical axis, based on the vertical distance to the center. Finally, a last pass performs FXAA, adds vignette and noise, and performs gamma correction.
The fade in and fade out animation is done with a framerate-independent easing function that modifies the draw ranges of each mesh.
This experiment stores the attractor parameters in the URL so it can be shared!
Day 4 - Trencadís
A surface shader to make a plane look like glazed porcelain.
Using the ShaderTexture from the Procedurally Textured Torus and some voronoi functions by iq from Shadertoy, several maps are generated: diffuse, roughness and metalness; height and AO; and normal.
The process is very similar to the Procedurally Textured Torus. The shader that creates the base for the texture combines basic voronoi to simulate the different shards, modulated with iq's voronoise to replicate some of the waviness of traditional tiles. Everything is blurred a bit with a Gaussian blur, and then a normal map is calculated from the height map. The normal map is then blurred a bit more.
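The height-to-normal step is the usual finite-differences trick; a sketch of that shader (the texel and strength uniforms are assumptions):

```glsl
// Sketch of the height-to-normal step: central differences on the
// height map (same idea as NormalMap Online).
uniform sampler2D heightMap;
uniform vec2 texel;     // 1.0 / texture resolution
uniform float strength;
varying vec2 vUv;

void main() {
  float l = texture2D( heightMap, vUv - vec2( texel.x, 0. ) ).r;
  float r = texture2D( heightMap, vUv + vec2( texel.x, 0. ) ).r;
  float b = texture2D( heightMap, vUv - vec2( 0., texel.y ) ).r;
  float t = texture2D( heightMap, vUv + vec2( 0., texel.y ) ).r;
  vec3 n = normalize( vec3( l - r, b - t, 2. / strength ) );
  gl_FragColor = vec4( n * .5 + .5, 1. ); // pack into [0,1]
}
```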
The scene has two lights and a plane, using the procedural maps on a THREE.MeshStandardMaterial, plus a panorama converted to a cubemap to use as environment map.
Day 5 - Particle field
Oldschool plasma moving particles with motion blur
Creating a particle system with motion blur as a post-processing effect.
The particle positions are stored in an RGBA texture (xyz and previous z). The texture is updated on the CPU with a plasma function (several linear and circular gradients added together), keeping x and y in their gridded positions and moving the z axis.
The particle geometries are merged into a single geometry (no instancing). There's a shader for the diffuse data and another for the motion data; both will be used in the motion blur pass. Both vertex shaders take the particle position and move the sphere geometry to the correct location. The color fragment shader outputs a color based on the z position. The motion vertex shader calculates the direction vector and elongates the sphere in that direction (comparing the dot product of the vertex normal with the direction); its fragment shader then outputs the screen-space velocity of the particle.
The last pass gathers both buffers and performs a blur, averaging 20 samples from the color buffer in the direction of the data in the motion buffer.
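A sketch of that gather pass:

```glsl
// Blur the color buffer along the velocity stored in the motion
// buffer, averaging 20 samples.
uniform sampler2D colorBuffer;
uniform sampler2D motionBuffer;
varying vec2 vUv;

void main() {
  vec2 velocity = texture2D( motionBuffer, vUv ).xy;
  vec4 sum = vec4( 0. );
  for ( int i = 0; i < 20; i++ ) {
    float t = float( i ) / 19.;
    sum += texture2D( colorBuffer, vUv - velocity * t );
  }
  gl_FragColor = sum / 20.;
}
```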
Type helper.show(true) in the console to see the framebuffers.
Day 6 - Wobbly Earth
The day the Earth didn't stand still
Adding a custom vertex shader distortion while keeping all the features of THREE.MeshStandardMaterial (PBR, shadows, etc).
Using MeshCustomMaterial to build a material with custom vertex shader and the default physical fragment shader.
The distortion is several 3D Perlin noises with different scales, added up, that move the vertex along the normal. The tangent and binormal are calculated in the vertex shader with partial derivatives of the noise function, and the new normal is calculated from their cross product.
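A sketch of the vertex shader idea, with a cheap value noise standing in for the real Perlin implementation, and finite differences standing in for the analytic partial derivatives:

```glsl
// Displace along the normal with noise, then rebuild the normal from
// two neighbouring displaced points.
varying vec3 vNormal;

float hash( vec3 p ) {
  return fract( sin( dot( p, vec3( 12.9898, 78.233, 37.719 ) ) ) * 43758.5453 );
}

float noise3( vec3 p ) { // trilinear value noise (stand-in)
  vec3 i = floor( p );
  vec3 f = fract( p );
  f = f * f * ( 3. - 2. * f );
  return mix(
    mix( mix( hash( i ), hash( i + vec3( 1., 0., 0. ) ), f.x ),
         mix( hash( i + vec3( 0., 1., 0. ) ), hash( i + vec3( 1., 1., 0. ) ), f.x ), f.y ),
    mix( mix( hash( i + vec3( 0., 0., 1. ) ), hash( i + vec3( 1., 0., 1. ) ), f.x ),
         mix( hash( i + vec3( 0., 1., 1. ) ), hash( i + vec3( 1., 1., 1. ) ), f.x ), f.y ),
    f.z );
}

float height( vec3 p ) { // several scales, added up
  return .2 * noise3( p * 2. ) + .1 * noise3( p * 5. );
}

void main() {
  vec3 t = normalize( cross( normal, vec3( .3, 1., .2 ) ) ); // rough tangent
  vec3 b = normalize( cross( normal, t ) );                  // binormal
  float e = .01;
  vec3 p  = position + normal * height( position );
  vec3 pt = position + t * e; pt += normal * height( pt );
  vec3 pb = position + b * e; pb += normal * height( pb );
  vNormal = normalMatrix * normalize( cross( pt - p, pb - p ) );
  gl_Position = projectionMatrix * modelViewMatrix * vec4( p, 1. );
}
```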
Also using a ShaderMaterial to build a custom depth material so the shadows can reproduce the object's distortion.
The rest is all the MeshStandardMaterial rendering features.
Day 7 - (Very) Poor Man's Global Illumination
Computing Global Illumination (GI) by rendering the scene from every vertex... four times!
GI is calculated per vertex, so the mesh is tessellated to have enough detail for the shadows and bounces to be seen.
At each vertex, a camera is placed looking along the normal, and the scene is rendered into a framebuffer with a very wide FOV. The framebuffer is 32x32, and it's then averaged to get an approximation of the incoming light at that vertex. That color is then set as the vertex color.
The process is repeated, now with the scene lit by the direct light (baked into the vertex colors). We get a second bounce and update the vertex colors. We repeat that process 4 times, and get an interesting (albeit slow) result.
To not block the main thread, the work is batched, calculating a few vertices every frame.
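One bounce for one vertex boils down to something like this (illustrative names, 2016-era render() signature):

```js
// Render the scene from the vertex along its normal into a 32x32
// target, average the pixels, use the result as the vertex color.
var rt = new THREE.WebGLRenderTarget(32, 32);
var probe = new THREE.PerspectiveCamera(120, 1, 0.01, 100); // very wide FOV
var pixels = new Uint8Array(32 * 32 * 4);

function bakeVertex(position, normal) {
  probe.position.copy(position).addScaledVector(normal, 0.01);
  probe.lookAt(position.clone().add(normal));
  renderer.render(scene, probe, rt);
  renderer.readRenderTargetPixels(rt, 0, 0, 32, 32, pixels);
  var r = 0, g = 0, b = 0;
  for (var i = 0; i < pixels.length; i += 4) {
    r += pixels[i]; g += pixels[i + 1]; b += pixels[i + 2];
  }
  var n = 32 * 32 * 255;
  return new THREE.Color(r / n, g / n, b / n); // goes into the color attribute
}
```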
Warning: this is a very slow and inefficient way to calculate Global Illumination (GI) in the browser!
Day 8 - Metaballs
Shiny striped metaballs
Using CPU marching cubes to create metaballs and, similar to Wobbly Earth, a custom MeshStandardMaterial, this time also modifying the fragment shader to add a striped procedural texture to the metaballs and perturb the normal to add some bump across the alternating lines.
The fragment shader computes a color based on the y position of the fragment, and calculates the discontinuities between lines, perturbing the normal accordingly.
The rest is all marching cubes and MeshStandardMaterial.
Day 9 - Street View PBR
Hacking image-based lighting (IBL) into a material
Modifying a MeshStandardMaterial trying to add Image Based Lighting (IBL).
The idea was to modify the lighting equation to add diffuse and specular terms extracted from an equirectangular panorama fetched from Google Street View. It's still too complicated to modify the shader at that level.
Day 10 - Odeo
Music visualisation to demo an audio manager
Building a media player class that can be easily integrated, supporting media playback, SoundCloud streaming and getUserMedia streams, using the Web Audio API.
It also uses AnalyserNode to perform an FFT, and can upload the spectrum to a GPU texture.
Can work together with kick detection.
SoundCloud streaming is done via their API, connecting the stream source to an HTMLAudioElement and creating a MediaElementSource from that.
The microphone can be used via navigator.getUserMedia().
I didn't have time to implement media playback; it would be done with a MediaElementSource, too.
All nodes are connected to an AnalyserNode and to the AudioContext destination.
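The Web Audio plumbing, sketched without Odeo's actual API:

```js
// source -> AnalyserNode -> destination, with the FFT spectrum
// uploaded to a GPU texture every frame.
var ctx = new AudioContext();
var audio = new Audio('track.mp3'); // or a SoundCloud stream URL
var source = ctx.createMediaElementSource(audio);
var analyser = ctx.createAnalyser();
source.connect(analyser);
analyser.connect(ctx.destination);

var data = new Uint8Array(analyser.frequencyBinCount);
var spectrum = new THREE.DataTexture(
  data, analyser.frequencyBinCount, 1, THREE.LuminanceFormat
);

function update() {
  analyser.getByteFrequencyData(data); // fills `data` with the spectrum
  spectrum.needsUpdate = true;         // re-uploads the texture to the GPU
}
```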
The visualisation is done in a single draw call of particle-triangles, with the color from a gradient texture. Positions are updated on the CPU, from the FFT data.
Day 11 - Blocky Water
Oldschool water with lots of AO
Lots of cubes running on a GPGPU water simulation.
The cubes code is recycled from The Polygon Shredder, removing code that is not needed for this effect.
There are three buffers for the water simulation, to add more stability. The positions are stored in a GPU texture; the shader samples values around each point, then averages and diffuses them.
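The kernel is more or less the classic height-field formulation (not necessarily the exact shader used here):

```glsl
// Sample the neighbours from the previous frame, combine with the
// frame before that, and damp the result.
uniform sampler2D current;  // heights at frame t
uniform sampler2D previous; // heights at frame t-1
uniform vec2 texel;         // 1.0 / simulation resolution
varying vec2 vUv;

void main() {
  float sum =
    texture2D( current, vUv + vec2( texel.x, 0. ) ).r +
    texture2D( current, vUv - vec2( texel.x, 0. ) ).r +
    texture2D( current, vUv + vec2( 0., texel.y ) ).r +
    texture2D( current, vUv - vec2( 0., texel.y ) ).r;
  float next = sum * .5 - texture2D( previous, vUv ).r;
  gl_FragColor = vec4( next * .99, 0., 0., 1. ); // damping
}
```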
The positions are read by the cubes shader, which places half-cube geometries, elongated along the z axis according to the water buffer value.
The cubes are rendered with a shader that exports positions and normals in view space (very much like a deferred renderer, but without multiple render targets).
Those buffers are used by the SSAO shader to darken the creases in the depth.
A combine pass composes color and AO, and a final pass performs FXAA, tonemapping and gamma correction.
Day 12 - AO Blocks
Take a walk on the occluded side
More experiments with SSAO, just ambient occlusion and no lighting.
The landscape is built by merging cube geometries, distributed along a perlin noise function, orientated randomly along the normal, with variable scales.
The camera follows the same noise function.
All the cubes are rendered to a color buffer, a normal buffer and a position buffer. The SSAO shader creates the occlusion buffer. A combine shader darkens the color buffer with the ao value, and a final shader performs FXAA and other cosmetic changes.
There's a commented out DOF pass, but it was too much to run smoothly.
This experiment uses DeviceOrientation events if on mobile!
Day 13 - Sphere Impostor
Don't use many triangles when you can use just one!
Experiment creating sphere impostors out of particles, rendered with MeshStandardMaterial.
The big sphere in the center is an actual sphere geometry. The rest are billboarded triangles.
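The impostor trick in its simplest form, with a plain Lambert term instead of the MeshStandardMaterial integration the experiment attempts (vOffset is an assumed varying spanning the billboard):

```glsl
// Discard fragments outside the disc and rebuild the sphere normal
// from the 2D billboard offset.
varying vec2 vOffset; // in [-1, 1] across the billboard

void main() {
  float r2 = dot( vOffset, vOffset );
  if ( r2 > 1. ) discard;                         // outside the silhouette
  vec3 normal = vec3( vOffset, sqrt( 1. - r2 ) ); // view-space normal
  float diffuse = max( dot( normal, normalize( vec3( 1. ) ) ), 0. );
  gl_FragColor = vec4( vec3( diffuse ), 1. );
}
```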
I got the lighting right, but I couldn't get the shadows to work properly. This needs more time to investigate!
Day 14 - Clock flag
See time fly
A flag showing the time
The flag geometry is animated in CPU using constrained particles and Verlet integration, from the cloth simulation example.
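The core of that simulation is just a few lines; a sketch with illustrative names (particles store position and previous position as THREE.Vector3):

```js
// Verlet step: velocity is implicit in the difference between the
// current and previous positions.
function integrate(p, dt, acceleration) {
  var velocity = p.position.clone().sub(p.previous);
  p.previous.copy(p.position);
  p.position.add(velocity).addScaledVector(acceleration, dt * dt);
}

// One distance constraint: pull both particles back to rest distance.
function satisfy(a, b, restDistance) {
  var delta = b.position.clone().sub(a.position);
  var correction = delta.multiplyScalar(1 - restDistance / delta.length());
  a.position.addScaledVector(correction, 0.5); // split between both ends
  b.position.addScaledVector(correction, -0.5);
}
```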
The texture is drawn in 2D canvas and uploaded to the shader.
There are different changes of wind force every second, every minute and every hour.
Day 15 - Crystals
Generative rocks, or minerals, Marie
Generated crystals, with a ton of postprocessing!
The crystals start from a base geometry that is distorted (more along the z axis), displaced around a common center and merged into a final geometry whose vertices are then all slightly jiggled.
There are two lights with random HSL colors that cast shadows.
The whole scene is rendered to a color buffer, depth goes to a depth buffer, and both buffers are mixed in a pass that performs bokeh blur according to the distance.
A final pass performs FXAA, vignetting, color grading and noise.
Day 16 - Torus
Music visualisation simulating a phosphor oscilloscope
Odeo.js exports the frequency spectrum data to a texture. It's then added into a tall buffer and a shader moves each row downwards, drawing the new frequency data at the top.
That texture is then used to distort a series of lines. The lines are distorted following a sinusoidal shape whose frequency varies with the sound. The actual frequency values in the buffer displace the lines along the normal direction (the normal vector for the sinusoidal shape is calculated in the shader).
Everything is drawn to a color buffer, then a glow pass is performed (using different levels of mipmaps to avoid blurring), color and glow are combined, a tilt-shift effect is applied, and a final pass performs FXAA and adds vignette, interlacing and an oscilloscope grid.
Day 17 - Conway
A GPGPU Conway's game of life, with PBR, shadows and AO.
The GPGPU Conway version uses ping-pong textures to run the simulation.
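The simulation kernel reads the previous state from the ping-pong texture, counts live neighbours and applies the rules; a sketch:

```glsl
uniform sampler2D state;
uniform vec2 texel; // 1.0 / grid resolution
varying vec2 vUv;

void main() {
  float alive = texture2D( state, vUv ).r;
  float n = 0.;
  for ( int y = -1; y <= 1; y++ ) {
    for ( int x = -1; x <= 1; x++ ) {
      if ( x == 0 && y == 0 ) continue;
      n += texture2D( state, vUv + vec2( float( x ), float( y ) ) * texel ).r;
    }
  }
  float next = ( alive > .5 )
    ? ( ( n > 1.5 && n < 3.5 ) ? 1. : 0. )  // survives with 2 or 3
    : ( ( n > 2.5 && n < 3.5 ) ? 1. : 0. ); // born with exactly 3
  gl_FragColor = vec4( vec3( next ), 1. );
}
```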
The current state texture is used to determine the state of each cell when drawing. Each cell is a box geometry merged into a bigger one.
The cell object is rendered with a custom version of MeshStandardMaterial, adding a lookup of the nearby cells to determine the degree of ambient occlusion. The fragment shader then combines the AO term with the rest of the BRDF in the physical material.
Day 18 - Starfield
It's a very small universe
Another version of oldschool cracktro starfields.
The stars are icosahedron geometries, all merged together in their final positions. Colors are per-vertex.
The direction of movement is calculated using the previous frame's transformation matrices and the current frame's. The star geometry is then elongated along the motion direction (like in Particle Field).
Everything is rendered to a color buffer, glow is applied via a shader (using the mipmap technique) and finally FXAA and a vignette are applied.
Day 19 - DOF Tunnel
Long tunnel with lots of postprocessing
Experimenting with different combinations of shader passes to get a reasonably nice Depth-of-Field (DOF). It's a bit too soft, but it does work.
The path is a THREE.Curves.TorusKnot, used to move the camera and to generate a tunnel with THREE.TubeGeometry.
The tunnel is rendered to three different framebuffers: diffuse color, glow textures only, and the depth buffer.
Then glow and diffuse are combined in a pass that uses the bias parameter of texture2D to create a soft glow out of the glow texture. The glow texture is smaller than the diffuse texture, but there's no blur involved. The glow strength is used to add a lens dirt texture on top.
Then the combined diffuse+glow texture gets a poisson-disc based blur, modulated by the depth buffer, to create the DOF effect.
A final pass performs FXAA and adds a bit of noise and vignetting.
This experiment uses DeviceOrientation events if on mobile!
Day 20 - Lines on a sphere
Lines and streaks moving on the surface of a sphere
Turning a particle system into an equirectangular panorama texture.
The basis is a regular particle system (THREE.Points), moving with curl noise.
The particles are then processed by a vertex shader that turns the cartesian coordinates into cylindrical coordinates, so they can be rendered as an equirectangular panorama.
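A sketch of that projection, written here as a longitude/latitude mapping straight to clip space:

```glsl
// Vertex shader: project the particle position into an equirectangular
// layout instead of using the regular camera matrices.
void main() {
  vec3 p = normalize( position );
  float lon = atan( p.z, p.x );  // [-PI, PI]
  float lat = asin( p.y );       // [-PI/2, PI/2]
  gl_Position = vec4( lon / 3.14159265, lat / 1.57079633, 0., 1. );
  gl_PointSize = 2.;
}
```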
That buffer is then used as a texture for a sphere.
A fading shader is applied to the panorama buffer before rendering the particles to add some streaks.
This experiment uses DeviceOrientation events if on mobile!
Day 21 - Numerical renderer
Rendering with numbers (very digital!)
Like those ASCII renderers, but with fewer characters: just 10.
A simple scene with lights, shadows and MeshStandardMaterial is rendered to a framebuffer.
That framebuffer is then processed to add some glow, and a scaled-down version feeds a shader that replaces every 32x32 block of pixels with a part of a texture containing the numbers 0 to 9, according to the brightness of the block. The number is additionally shaded by that intensity, so there are more gradations. The texture with the numbers has a slight glow pre-applied to it.
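A sketch of the digit-replacement shader (10-glyph strip, 32px blocks; uniform names are assumptions):

```glsl
uniform sampler2D scene;   // scaled-down scene, one texel per block
uniform sampler2D digits;  // strip texture with glyphs 0..9
uniform vec2 blocks;       // number of 32x32 blocks in x and y
varying vec2 vUv;

void main() {
  vec2 cell  = floor( vUv * blocks ) / blocks; // which block we're in
  vec2 local = fract( vUv * blocks );          // position inside the block
  float brightness = texture2D( scene, cell ).r;
  float digit = floor( brightness * 9.999 );   // pick glyph 0..9
  vec2 atlasUV = vec2( ( digit + local.x ) / 10., local.y );
  gl_FragColor = texture2D( digits, atlasUV ) * brightness; // shade by intensity
}
```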
Day 22 - Line thickness
Creating an engraving effect with shaders
Simulating the effect of linocut, or wood engraving.
The image is uploaded to a texture, and a fragment shader generates a few curvy lines based on a sine, then modulates the contrast of those lines with the intensity of the uploaded image, effectively changing the width of the lines.
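A minimal sketch of the idea:

```glsl
// Sine-based curvy lines whose black/white threshold is driven by the
// image intensity, so bright areas get thinner lines.
uniform sampler2D image;
varying vec2 vUv;

void main() {
  float intensity = texture2D( image, vUv ).r;
  float lines = sin( vUv.y * 300. + sin( vUv.x * 20. ) * 3. ); // in [-1,1]
  // bright pixels lower the threshold: more of the stripe turns white
  float ink = step( mix( 1., -1., intensity ), lines );
  gl_FragColor = vec4( vec3( ink ), 1. );
}
```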
Day 23 - Emitter
Weird thing that started as a particle emitter and ended up like Matrix sentinels
Toying with spring-like strands, using Verlet integration and constraints to animate the strings.
Cannon.js is used for the physics simulation. Each sphere is linked with a Cannon.js body, and a set of points is randomly chosen from the surface of the sphere. Lines are attached to those points. When the simulation runs and modifies the sphere positions, the changes are translated to the random points, which are used as anchors for the beginning of each string.
The rendering is pretty straightforward, with just MeshStandardMaterial and shadows.
Day 24 - AO Spheres
Boids and soft shadows and analytical ambient occlusion
A bunch of spheres that try to stay together (but not too together), with soft shadows and ambient occlusion.
This one is the result of testing a combination of techniques.
The sphere-sphere ambient occlusion is found analytically, following Iñigo Quílez's Sphere Ambient Occlusion.
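The core of it is a tiny analytic function; a simplified version of iq's formula:

```glsl
// How much a sphere (center sph.xyz, radius sph.w) occludes the point
// `pos` with normal `nor`. Accumulate over all spheres and clamp.
float sphereOcclusion( vec3 pos, vec3 nor, vec4 sph ) {
  vec3 di = sph.xyz - pos;
  float l = length( di );
  float nl = dot( nor, di / l );
  float h = l / sph.w;
  return max( nl, 0. ) / ( h * h ); // solid-angle approximation
}
```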
The soft shadows are part of the raymarching toolbox at ShaderToy and IQ's Modeling with Distance Functions.
The rendering combines solid color, a lambertian diffuse term, AO, soft shadows and rim lighting.
The motion is an ad-hoc steering behaviour that doesn't really work that well.
The spheres use a custom LOD system, based on BufferGeometry and a custom version of three.js.
I changed the background from white to black because the lighting is already fake, so let's make it look different :)
Day 25 - Lines
More lines and feedback effect
The first version of this effect distorted the line geometries on the CPU, using ImprovedNoise. I moved the code to a shader.
The lines are distorted by calculating the 3D value of a Perlin noise and deriving the normal from the gradient at that point of the field. The vertices are pushed along that normal, creating inflated distortions.
The lines are rendered into a framebuffer that has a version of itself drawn on it, slightly upscaled and faded, to create the feedback effect.
A final pass applies FXAA, vignetting, noise and gamma correction.
Day 26 - Volumetric rendering
March your ray through a cloud
This experiment uses the technique of rendering the view positions of a shell (a cube, or a sphere) into a render target: one pass with the back faces (interior of the shell) and another with the front faces (sides facing the camera).
With that information, a shader can calculate, for each point on the screen, a ray that crosses the volume, and use that ray to step through a 3D field (or a 3D texture).
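A sketch of that marching loop, with 64 steps and a stand-in density function:

```glsl
// Reconstruct the ray from the two position buffers and accumulate
// density through the volume.
uniform sampler2D frontPositions; // front faces of the shell
uniform sampler2D backPositions;  // back faces of the shell
varying vec2 vUv;

// stand-in for the perlin-based turbulence field
float density( vec3 p ) {
  return .5 + .5 * sin( p.x * 4. ) * sin( p.y * 4. ) * sin( p.z * 4. );
}

void main() {
  vec3 start = texture2D( frontPositions, vUv ).xyz;
  vec3 end   = texture2D( backPositions, vUv ).xyz;
  vec3 delta = ( end - start ) / 64.;
  vec3 p = start;
  float acc = 0.;
  for ( int i = 0; i < 64; i++ ) {
    acc += density( p ) / 64.;
    p += delta;
  }
  gl_FragColor = vec4( vec3( 1. ), acc );
}
```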
In this case it's a turbulent field made of several scales of Perlin noise.
This experiment uses DeviceOrientation events if on mobile!
Day 27 - Medusa
Random jellyfish-like objects
The idea was to create tentacles following a path.
The tentacles are created following a perlin noise function. The points created with this function are used with a TubeGeometry to create each tentacle. This version uses a modified three.js to add a radius function, so the tube can be tapered.
The cap is done with a Lathe geometry, and the shape is created with a THREE.ConstantSpline.
All geometries are merged into one.
The shading is done in different parts. One pass exports the diffuse color in the red channel, eye-space depth in the green channel and rim lighting in the blue channel; all that goes to the base framebuffer. With a MeshStandardMaterial, the same scene is rendered with shadows to a shadow buffer. Both textures are then combined in a shader to create a blue-tinted image with different properties.
The same buffer is downscaled and processed with a poisson-disc-based blur, and another shader takes the combined and the blurred FBOs and blends them according to the pixel's distance from the camera. It finally also adds some vignette. There's no need for FXAA in this one.
Day 28 - Greeble AO
Rotating greeble, with SSAO
A greebled torus, with AO. Chevron one encoded!
This SSAO is based on the Nutty Software SSAO demo.
The segments are independent geometries, built from bent box geometries. I tried cylinders using the theta parameters, but the resulting meshes are not capped.
The main shader creates a color buffer, an eye-space normal buffer and an eye-space position buffer. The SSAO shader takes normal and position and creates an occlusion buffer. A shader combines the color and the ao, and a final shader performs FXAA, vignetting and noise.
Day 29 - Bands
Twisty pieces of sliced floor
Recreating an animated gif.
The bands are created as plane geometries, and they're moved on the CPU. The rendering is done with MeshStandardMaterial and a few lights.
I added a light from the bottom to simulate a secondary bounce.
Day 30 - Triangle tunnel
A WebGL remake of GMUNK's PYRADICAL
The geometry is 3 planes, manually oriented.
The texture is built in a canvas, using Odeo.js and Kick to drive the graphics, with a couple of nice functions to modulate the values and add some variety. The canvas operations have some opacity and are composited with the lighter mode, so they saturate slowly in and out.
The texture is updated to the GPU, and the classic sequence of building FBOs is done: color, depth, rgb shift, blur, glow with flares, combine.
Feel free to change the updateLEDs function!
This experiment uses DeviceOrientation events if on mobile!
Want to see more?
See also Codevember 2017, Codevember 2018, Codevember 2021 and Genuary 2022. You might be also interested in Digital Inktober 2018, Digital Inktober 2019, Digital Inktober 2020 and Digital Inktober 2021.