Building re.flow Part 4: WebGL visuals

Jason Marsh

This is the fourth part of a four-part set of posts about re.flow. If you missed part 1, go here.

Hear and see the end result here: http://dolby.flow.gl.

This post is about programming the WebGL visuals.

The connections between all the parts of the program are shown in this diagram:

For the visuals, I’m creating WebGL objects using ThreeJS, and GLAM provides some handy utilities for tracking the scene graph, handling interaction with Pickers, easy keyframe animation, and more.

Animation

Let’s start with the animation before we get to the textures. When I create an AudioClip, the animation is parameterized and included with it.

[code language="JavaScript"]
new AudioClip({
  name: "spaceshwoosh1", file: "spaceshwoosh1", track: "spaceshwoosh", label: "Space Shwoosh 1",
  animation: [{
    duration: 18, // duration in musical measures; each step is 4 measures
    object: "parent",
    target: "rotation",
    keyBeats: [1, 2.75, 4.5, 7.5, 9, 18], // keys should be in beats
    values: [ // values should be in radians around center
      degreesToYradians(56, 1),
      degreesToYradians(37, 1),
      degreesToYradians(71, 1),
      degreesToYradians(-171, 1),
      degreesToYradians(-73, 1),
      degreesToYradians(56, 1)
    ]
  }]
}),
[/code]

I cheat to do the animation: instead of building a series of spline curves through space (I tried this too, but it was too hard to synchronize with the audio panning), I just add an extra invisible object to each 3D object and spin the pair around their mutual center point. That way I can move the objects in a plane around the Y axis in front of the user with nothing other than the degrees, which happens to be exactly what Audition shows in the interface!

As a good lazy programmer, I wanted to make it as easy as possible to synchronize the audio panning with the animations. So I wrote the math to translate the numbers from Audition directly into the keyframe animator, hence the degreesToYradians function:

[code language="JavaScript"]
/*
This function is specific to taking the values directly out of Adobe Audition's 5.1 track panning
and converting them into a horizontal plane around the user.
It inverts the values, and subtracts 180 degrees, in order to match those values.

The animator needs to be sweeping across the circle, not just animating between points.
Otherwise anything that is moving relatively fast will not work right.
*/
function degreesToYradians(degrees, radius) {
  var y = (Math.PI * -degrees / 180) * radius; // radians = Math.PI * degrees / 180
  return { y: y };
}
[/code]
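
So, for example, degreesToYradians(56, 1) returns { y: -0.977 }: the 56° pan I typed into Audition becomes a rotation of roughly -0.98 radians around the Y axis.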

And the same is true for the keyframe timing: I create a keyBeats structure so I can just type in the measure numbers that match the keyframes from Audition.
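
As a rough illustration of how keyBeats could be turned into keyframe times, here is a minimal sketch; the beatsToSeconds helper and the 120 BPM tempo are assumptions for the example, not the actual re.flow code, which takes its timing from the AudioClip setup.

[code language="JavaScript"]
// Sketch only: convert keyBeats (musical beats) into keyframe times in seconds,
// assuming a fixed tempo.
function beatsToSeconds(keyBeats, bpm) {
  var secondsPerBeat = 60 / bpm;
  return keyBeats.map(function (beat) {
    return beat * secondsPerBeat;
  });
}

// e.g. beatsToSeconds([1, 2.75, 4.5, 7.5, 9, 18], 120)
//   -> [0.5, 1.375, 2.25, 3.75, 4.5, 9] seconds
[/code]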

Here’s the code for creating the geometry plus the invisible partner to create the offset for the rotation:

[code language="JavaScript"]
function addTexturedEtherPlane(trackName, texture, position, rotation, spinRadius) {
  var parentObj = new glam.Object; // we use a parent object to create an offset
  var obj = new glam.Object;
  parentObj.addChild(obj);

  var visual = addTexturedEtherPlaneVisual(trackName, texture);
  var picker = addPicker(trackName);
  obj.addComponent(visual);
  obj.addComponent(picker);
  spinRadius = spinRadius ? spinRadius : 1;

  // sets the position of the pair in space
  position = position ? position : { x: 0, y: 0, z: 0 };
  parentObj.transform.position.set(position.x, position.y, position.z + spinRadius);
  obj.transform.position.set(0, 0, -spinRadius * 2);

  if (rotation) {
    obj.transform.rotation.set(rotation.x, rotation.y, rotation.z);
  }

  // connects this visual object up to the AudioVisualizer object so that when the
  // track starts and stops they stay synced, and so the amplitude changes can be
  // sent to drive the visualizer
  var audioVisualizer = findAudioVisualizer(trackName);
  audioVisualizer.trackObj = obj;

  app.addObject(parentObj);
}
[/code]
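
A call to this function, with illustrative values rather than the ones actually used in the scene, might look like:

[code language="JavaScript"]
// Illustrative values only; the real scene sets these per track.
addTexturedEtherPlane(
  "spaceshwoosh",         // track name, matching the AudioClip above
  "images/tahoe.jpg",     // hypothetical texture path
  { x: 0, y: 1, z: -3 },  // base position of the parent/child pair
  null,                   // no extra rotation
  2                       // spin radius around the mutual center point
);
[/code]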

On to the dynamic textures!

 

GLSL Textures

I had never done GLSL before this project. But, no fear, this is the age of Google. Specifically, the awesome ShaderToy. I hunted all around for a shader that I thought would give me an interesting rippling effect that wasn’t too computationally expensive. I found Ether, by nimitz, on ShaderToy, and added the audio amplitude as an input to brighten the colors as the volume increased.

I wanted the textures to be ‘beautiful’ as defined by my own subjective criteria, which is like obscenity: I’ll know it when I see it. I did want to use natural textures, hinting at hypnotic, complex, repetitive forms such as campfires and ocean waves. So I started with these original images I photographed and photoshopped:

[Source images: tahoe, solanabeach, sentinel, stackedGrasses]

I combined the Ether shader with the animating texture maps based on code examples from the Shader Fireball by stemkoski.

Without much explanation, here is the fragment shader for the nerdiest readers:

[code language="cpp"]
uniform sampler2D baseTexture;
uniform float baseSpeed;
uniform float alpha;
uniform float time;
uniform float amplitude; // audio volume as an input into the math
uniform vec3 color;

varying vec2 vUv;
varying vec3 vPosition;

float map(vec3 p) {
  vec3 q = p * 2.0 + time * 1.0;
  return length(p + vec3(sin(time * 0.7))) * log(length(p) + 1.0) + sin(q.x + sin(q.z + sin(q.y))) * 0.5 - 1.;
}

void main() {
  // texture
  vec2 uvTimeShift = vUv + vec2(-0.7, 1.5) * time * baseSpeed;
  vec4 noiseGeneratorTimeShift = texture2D(baseTexture, uvTimeShift);

  // ether
  float resolutionY = 1.; // the smaller the value, the more detailed the image
  vec2 p = vec2(vPosition.xy) / resolutionY - vec2(0.5, 0.2);

  vec3 color = vec3(amplitude) * 2.0;
  float crispness = 4.5; // values between 0.5 and 5.5 work nicely
  for (int i = 0; i <= 2; i++) { // iteration counts between 2 and 10 work nicely
    vec3 p = vec3(0, 0, 5.0) + normalize(vec3(p, -1.0)) * crispness / 2.0;
    float rz = map(p);
    float f = clamp((rz - map(p + 0.1)) * 0.5, -0.1, 1.0);
    vec3 colorRange = vec3(noiseGeneratorTimeShift.r, noiseGeneratorTimeShift.g, noiseGeneratorTimeShift.b) * 0.8 + vec3(5.0, 4.5, 0.5) * f;
    float amplitudeClamped = clamp(amplitude, 0.1, 20.0) * 2.0;
    color = color * colorRange + (1.0 - smoothstep(0.0, 2.5, rz)) * 0.7 * colorRange * amplitudeClamped;
    crispness += rz; // min(rz, 1.0); // filters out some complexity
  }
  float calculatedAlpha = (color.b + color.g + color.r > 0.02) ? alpha : 0.1;
  gl_FragColor = vec4(color, calculatedAlpha);
}
[/code]

And here is the ThreeJS code to tie an AudioVisualizer track to the shader material:

[code language="JavaScript"]
function addTexturedEtherPlaneVisual(trackName, texture) {
  var audioVisualizer = findAudioVisualizer(trackName);
  audioVisualizer.setShaders({
    vertexScript: "texturedEtherVertexShader", fragmentScript: "texturedEtherFragmentShader"
  });

  var waterTexture = THREE.ImageUtils.loadTexture(texture);
  waterTexture.wrapS = waterTexture.wrapT = THREE.RepeatWrapping;

  audioVisualizer.shader.uniforms = new ShaderUniforms({
    color: { type: "c", value: new THREE.Color(0xff0000) },
    baseTexture: { type: "t", value: waterTexture },
    baseSpeed: { type: "f", value: 0.1 },
    noiseTexture: { type: "t", value: noiseTexture },
    noiseScale: { type: "f", value: 0.2 },
    alpha: { type: "f", value: 0.8 },
    time: { type: "f", value: 1.0 },
    amplitude: { type: "f", value: 1 }
  });

  var planeShaderMaterial = new THREE.ShaderMaterial({
    fragmentShader: audioVisualizer.shader.fragmentScript,
    vertexShader: audioVisualizer.shader.vertexScript,
    uniforms: audioVisualizer.shader.uniforms.params,
    side: THREE.DoubleSide,
    transparent: true,
    depthWrite: true
  });

  var geometry = new THREE.PlaneGeometry(2, 2);

  var visual = new glam.Visual({
    geometry: geometry,
    material: planeShaderMaterial
  });

  return visual;
}
[/code]
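
For the shader to keep rippling and responding to the music, the time and amplitude uniforms have to be pushed in every frame. Here is a minimal sketch of what that per-frame update could look like; the getAmplitude helper and the update hook are assumptions for illustration, not the actual AudioVisualizer API.

[code language="JavaScript"]
// Sketch only: push the clock and the current track volume into the shader each frame.
// getAmplitude() is a hypothetical helper; in re.flow the AudioVisualizer derives this
// from the Web Audio analyser data.
var clock = new THREE.Clock();

function updateEtherUniforms(audioVisualizer) {
  var uniforms = audioVisualizer.shader.uniforms.params;
  uniforms.time.value += clock.getDelta();                   // drives the rippling texture shift
  uniforms.amplitude.value = getAmplitude(audioVisualizer);  // brightens the colors with volume
}
[/code]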

Particles

Finally, while I like the clean black background, I wanted to create a sense of an even larger space, so I added particles in the distance. Some are fixed particles randomly dispersed, and some are based on the “fireflies” particle engine created by Lee Stemkoski: https://stemkoski.github.io/Three.js/Particle-Engine.html. I tweaked it a bit, including a more textural texture map, but they are almost straight from his open-source code.
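
For the fixed, randomly dispersed particles, the idea is simply a large cloud of dim points scattered on a distant shell around the scene. A minimal sketch of that idea (not the actual re.flow code; newer Three.js uses BufferGeometry and calls the class THREE.Points, while older releases named it THREE.PointCloud):

[code language="JavaScript"]
// Sketch only: scatter a few hundred dim points far away to suggest a larger space.
function addDistantParticles(count, radius) {
  var geometry = new THREE.Geometry();
  for (var i = 0; i < count; i++) {
    // random direction, pushed out to roughly the given radius
    var v = new THREE.Vector3(
      Math.random() - 0.5,
      Math.random() - 0.5,
      Math.random() - 0.5
    ).normalize().multiplyScalar(radius * (0.8 + 0.4 * Math.random()));
    geometry.vertices.push(v);
  }
  var material = new THREE.PointsMaterial({ color: 0x888888, size: 0.5 });
  return new THREE.Points(geometry, material); // THREE.PointCloud in older Three.js
}
[/code]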

 

Pickers

ThreeJS doesn’t have a great built-in selection mechanism for 3D objects, and picking an object in 3D space is not entirely trivial. GLAM solves this for me with a few lines of code:

[code language=”JavaScript”]
var picker = new glam.Picker;
picker.addEventListener("mouseup", selectFunction);
picker.addEventListener("touchend", selectFunction);
obj.addComponent(picker);
[/code]
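
For illustration, a selectFunction might be a small closure per object so the callback knows which track it belongs to; toggleTrack here is a hypothetical helper, not the actual re.flow handler:

[code language="JavaScript"]
// Hypothetical sketch: give each object's picker a callback closed over its track name.
function makeSelectFunction(trackName) {
  return function selectFunction() {
    toggleTrack(trackName); // hypothetical helper: start or stop the clip for this object
  };
}

picker.addEventListener("mouseup", makeSelectFunction("spaceshwoosh"));
picker.addEventListener("touchend", makeSelectFunction("spaceshwoosh"));
[/code]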

That’s it!

re.flow for Dolby Laboratories, 3D object view

So, a quick review: I wrote a piece of music, found some images, and through Google searches and a simple matter of programming, created “art”. You can do it too.

 

Huge thanks to Andrew Vaughan at Dolby Laboratories, to Tony Parisi for GLAM, and to Dustin Butler for getting the servers set up and the DynamoDB database running on Amazon.

 

Follow me @jmarshworks on Twitter, and stay tuned for the Virtual Reality version.