Russell Bicknell

I'm a computer science student at The University of Texas at Austin, with a particular interest in client-side web development. This page collects a few projects and demos that I think are worth making publicly visible. Most were written and tested in Chrome Canary, since it's generally the most up to date on experimental specs, but they should also work in newer versions of Safari and other WebKit-based browsers.
WebAudio Synthesizer
How to use it:
Push the "MainSynth.start();" button.
"ZSXDCVGBHNJM" is a low octave, "Q2W3ER5T6Y7U" is one octave up; this looks complicated, but it's laid out like a normal keyboard as two rows. (Or, as in Native Instruments' Reaktor.)
Sorry, there's no interface for changing the applied effects as of now. You can poke around in the source and change them there if you really want to. By default, several effects (chording, sample-resolution downscaling, an ADSR envelope, and delay) are applied to a sine wave generator.

This demo is my second attempt at making a JavaScript synthesizer. The first was implemented using Mozilla's early audio API, but I stopped working on it when their ideas were deprecated in favor of the WebAudio API. I should also mention that I've since started a third attempt, specifically because the majority of this code is redundant with what the API already implements. However, it works fairly well; here's how.

The AudioContext object (with the appropriate vendor prefix) is how you access everything you need to make sound in JavaScript. The AudioContext has a destination node; output from other nodes routed to it is pushed to your speakers. In the case of this demo, a special type of node, a JavaScriptAudioNode, is hooked up to that destination. This node accepts some amount of input samples, modifies them, and assigns them to the output sample array (which is passed in along with the input).

Effectively, my code implements a subset of the ideas used in the WebAudio API: nodes can be chained to other nodes, and their final output is pushed to the output of the JavaScriptAudioNode, which your machine plays. You can route the output of any node to any other accepting node. For each 'frame', or requested length of samples, the Module class automatically caches the results of computation for each node and scales a node's output to fit the input range of the node requesting the samples. Because of this, nodes can be written without considering that they might be called multiple times to produce the same set of samples, and the range of samples a node produces doesn't have to explicitly match the input range of the node consuming them. The intent of this re-ranging is that any given node can seamlessly accept output from any other node.
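The per-frame caching and re-ranging described above can be sketched in plain JavaScript. The names here (Module, getFrame, rescale) are illustrative, not the demo's actual API:

```javascript
// Minimal sketch of the node model: each node caches its output per frame,
// and any node's output can be rescaled to fit another node's input range.
class Module {
  constructor() {
    this.frameId = -1; // id of the last frame we computed
    this.cache = null; // cached samples for that frame
  }
  // Subclasses fill in generate(length) to produce an array of samples.
  generate(length) { throw new Error("not implemented"); }
  // Returns samples for a frame, computing them at most once per frame id,
  // so a node asked twice for the same frame does no redundant work.
  getFrame(frameId, length) {
    if (frameId !== this.frameId) {
      this.cache = this.generate(length);
      this.frameId = frameId;
    }
    return this.cache;
  }
}

// Linearly map samples from [srcMin, srcMax] into [dstMin, dstMax] so that
// any node's output range can feed any other node's input range.
function rescale(samples, srcMin, srcMax, dstMin, dstMax) {
  const scale = (dstMax - dstMin) / (srcMax - srcMin);
  return samples.map(s => dstMin + (s - srcMin) * scale);
}

// A simple generator node producing a sine waveform in [-1, 1].
class SineGenerator extends Module {
  constructor(freq, sampleRate) {
    super();
    this.freq = freq;
    this.sampleRate = sampleRate;
    this.phase = 0;
  }
  generate(length) {
    const out = new Array(length);
    for (let i = 0; i < length; i++) {
      out[i] = Math.sin(this.phase);
      this.phase += (2 * Math.PI * this.freq) / this.sampleRate;
    }
    return out;
  }
}
```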
For example, although the intent of 'generators' is to produce basic waveforms to be shaped into output, because all inputs and outputs are arrays of Numbers, a generator can also be used to dynamically modify a property of another node. (Or at least that's the intent; most of the example nodes I've implemented read the value from a property set on the object, but they could be changed to accept it as an input sample array relatively easily.) For example, a sine generator could be linked to both the waveform input and the delay-time input of a delay node. This would cause the delay time to change at the same rate as the sine wave, with the sine output scaled to the delay-time input's range.
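That routing idea can be sketched as a delay whose per-sample delay time is driven by another node's output. This is a hypothetical illustration, not the demo's actual delay node:

```javascript
// Sketch of a delay line whose delay time is modulated per sample by
// another node's output. delayTimes holds one delay length (in samples)
// for each output sample; history accumulates past input samples.
// Names and structure are illustrative.
function variableDelay(input, delayTimes, history) {
  const out = new Array(input.length);
  for (let i = 0; i < input.length; i++) {
    history.push(input[i]);
    const d = Math.round(delayTimes[i]);
    const idx = history.length - 1 - d;
    out[i] = idx >= 0 ? history[idx] : 0; // silence until the buffer fills
  }
  return out;
}
```

Feeding a rescaled sine wave in as `delayTimes` would sweep the delay length up and down at the sine's frequency.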

As I mentioned before, this code is somewhat redundant given the functionality the WebAudio API already provides, mostly because I didn't have a complete handle on the API before I began writing it. Although it works, I've since abandoned it in favor of writing a wrapper around the WebAudio API that behaves similarly to this code but uses (and provides access to) the native implementation where possible, and is friendlier about hooking into code written strictly against the WebAudio API.
Space-filling layout library / mock iTunes interface
This demo started with the idea of building a web interface to your iTunes library so that it would be accessible anywhere. I began with the UI because I figured that would be the most difficult part, and I never really got past little issues in the code I wrote to selectively render rows of the songs table so it could handle enormous lists. Before getting stuck there, though, I ended up writing a library that makes scalable layout very simple. (I now realize this is effectively a JavaScript implementation of CSS flex layout.) The library lets you specify a container element whose children are sized according to an attribute set on each of them, in the form of either absolute pixels or "[integer]n". The library takes care of scaling elements down so that their margin, border, and padding don't produce weird results under either sizing style. An absolute pixel value makes a child take up exactly that much space within the container. With the 'n' form, the element is scaled to take up the remaining space with weight n relative to all other 'n'-weighted elements in the container. For example, if a container is 100 pixels wide and holds two elements, one set to '1n' and the other to '2n', the first will be 34px wide and the second 66px wide. When, as in this example, the sum of the weights doesn't divide evenly into the available space, the remaining pixels are distributed among the elements. You can see this library working in this example page.
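The sizing rule described above can be sketched as a small function. This is an illustration of the algorithm, not the library's actual API; children are represented here as `{px: N}` or `{n: W}` objects:

```javascript
// Distribute totalPx among children sized either by absolute pixels
// ({px: N}) or by weight ({n: W}). Weighted children split the space left
// over after fixed children, proportionally to their weights; pixels lost
// to flooring are then handed out one per weighted child, in order.
function distribute(children, totalPx) {
  const fixed = children.reduce((s, c) => s + (c.px || 0), 0);
  const weightSum = children.reduce((s, c) => s + (c.n || 0), 0);
  const free = totalPx - fixed;
  const sizes = children.map(c =>
    c.px !== undefined ? c.px : Math.floor((free * c.n) / weightSum)
  );
  // Spread the leftover pixels across the weighted children.
  let leftover = totalPx - sizes.reduce((s, v) => s + v, 0);
  for (let i = 0; i < sizes.length && leftover > 0; i++) {
    if (children[i].n !== undefined) { sizes[i]++; leftover--; }
  }
  return sizes;
}
```

With a 100px container and children '1n' and '2n', the floored sizes are 33 and 66; the one leftover pixel goes to the first child, giving the 34px/66px split from the example.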
WebGL globe
I've been looking into WebGL lately, and this is what I've gotten to after a bit of messing around with it. I'm not really sure where this demo is going; this is its state as of 2012-10-03. Click and drag to rotate.

It's not perspective 3D as of now, just the default orthographic setup looking down -z. Also, the points haven't been interpolated to produce small segments, and you can see artifacts of this at certain places on the globe. For example, the border between the United States and Canada is a reasonably long straight line; if you move it to the edge of the circle the globe is displayed in, you can see that the line isn't curved along a spherical surface but goes directly from one endpoint to the other. I'm not sure there's any way to get around this other than adding additional points on sufficiently long lines so that they appear curved when translated to spherical coordinates. Maybe there's some way to make variables passed from vertex shaders to fragment shaders interpolate non-linearly.

Most of the work to get this going was in extracting points from this SVG. The points in the SVG were converted to latitude and longitude normalized from 0 to 1, with 90N 180W being (0, 0) and 90S 180E being (1, 1), using this file. The map-building page has the relevant portion of the SVG inserted as a node in the body of the page, so it's ignored in rendering but still accessible through the DOM. The page then scans through that chunk of SVG and builds a map from countries to lists of paths, where these paths are land masses controlled by the country. This object is dumped back out into the document as text after being run through window.JSON.stringify; I took that output and stuffed it into a separate file, MAP.js.
These points are fed into a vertex shader that maps them onto a sphere of radius 1 about the origin; this is a reasonable transformation anyway, but mainly it means I don't have to do any translation or scaling to get the result to fall into the visible bounds of the WebGL drawing context. A few of these points are notably wrong: Antarctica is clearly not the shape shown. It looks like I wrote the parser as if the southernmost bounded area in the SVG were 90S, when in reality it appears to be more like 80-85S; this might be true for the northern boundary as well. Russia's northeastern corner is also out of whack. I still haven't figured out what that is, but my guess is it's something in the SVG spec that this map uses that I didn't notice and didn't account for.
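The (u, v) → unit-sphere mapping described above can be sketched in JavaScript. The axis conventions here are an assumption; the demo's shader may orient the sphere differently:

```javascript
// Map a normalized map coordinate (u, v), where (0, 0) is 90N 180W and
// (1, 1) is 90S 180E, onto a sphere of radius 1 about the origin.
// The choice of which world axis is "up" here is illustrative.
function toSphere(u, v) {
  const lon = u * 2 * Math.PI - Math.PI;  // -PI (180W) .. PI (180E)
  const lat = Math.PI / 2 - v * Math.PI;  //  PI/2 (90N) .. -PI/2 (90S)
  return {
    x: Math.cos(lat) * Math.sin(lon),
    y: Math.sin(lat), // +y is north in this sketch
    z: Math.cos(lat) * Math.cos(lon),
  };
}
```

This also makes the straight-segment artifact concrete: the shader maps only the endpoints of each SVG segment through this function, so a long border stays a chord between two points on the sphere rather than following the surface.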

(example result)
Ray Tracer
This demo is a ray tracer I wrote in my spare time while taking the graphics course at UT. It has a fairly simple model for 3D: it only draws triangles, each of which can have a color set per vertex. Although you can specify an alpha channel for these colors, you can't set a refractive index for a triangle. The built-in setup includes two triangles lying on intersecting planes, each with partial transparency set on one vertex, to demonstrate that the ray tracer correctly blends colors and transmits rays through the triangles. The given setup renders 4 rays per pixel on the 320x320 canvas; this, along with the transmitted rays, results in a total of 613791 rays rendered for the scene. Because this demo isn't multithreaded or native, it takes my 3GHz Core 2 Duo machine approximately a minute to render the scene. Be patient, and click "Wait" when your browser whines at you.
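The core operation a triangle-only ray tracer performs per ray is the ray/triangle intersection test. One standard way to do this is the Möller-Trumbore algorithm, sketched below; this is not necessarily the method this demo uses:

```javascript
// Möller-Trumbore ray/triangle intersection. Vectors are [x, y, z] arrays.
function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
function cross(a, b) {
  return [a[1] * b[2] - a[2] * b[1],
          a[2] * b[0] - a[0] * b[2],
          a[0] * b[1] - a[1] * b[0]];
}

// Returns the distance t along the ray (orig + t * dir) to the hit point,
// or null if the ray misses the triangle (v0, v1, v2).
function intersect(orig, dir, v0, v1, v2) {
  const e1 = sub(v1, v0), e2 = sub(v2, v0);
  const p = cross(dir, e2);
  const det = dot(e1, p);
  if (Math.abs(det) < 1e-9) return null; // ray parallel to triangle plane
  const inv = 1 / det;
  const tvec = sub(orig, v0);
  const u = dot(tvec, p) * inv;          // first barycentric coordinate
  if (u < 0 || u > 1) return null;
  const q = cross(tvec, e1);
  const v = dot(dir, q) * inv;           // second barycentric coordinate
  if (v < 0 || u + v > 1) return null;
  const t = dot(e2, q) * inv;
  return t > 1e-9 ? t : null;            // ignore hits behind the origin
}
```

The barycentric coordinates (u, v) this produces are also exactly what's needed to blend the per-vertex colors at the hit point.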
Vector 'Map'
The idea behind this demo was to make a map (as in streets, not the data structure) that would be continuously scalable, in the sense that it could be drawn at an arbitrary scale and render the largest appropriate amount of detail for that scale (i.e. render every street that fits reasonably). This way the map never appears to 'decide' that a certain viewport is small enough to render a certain level of detail, with those streets magically popping in once you reach that scale. At this point, the map is effectively just a set of lines that can be drawn at any scale; it skips rendering lines that would be less than one pixel long, since they wouldn't make much sense to draw. Drag to pan, scroll to zoom (about the crosshair, not your mouse). If your OS / browser does some type of smoothing on your scrolling, it will show up in the demo.
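The "skip lines shorter than one pixel" check can be sketched as below. The function name and the scale convention (pixels per world unit) are illustrative, not the demo's actual code:

```javascript
// Decide whether a world-space segment (x1, y1)-(x2, y2) is worth drawing
// at the current zoom, where pxPerUnit is how many screen pixels one world
// unit occupies. Segments shorter than one pixel on screen are skipped.
function shouldDraw(x1, y1, x2, y2, pxPerUnit) {
  const dx = (x2 - x1) * pxPerUnit;
  const dy = (y2 - y1) * pxPerUnit;
  return dx * dx + dy * dy >= 1; // squared length avoids a sqrt per segment
}
```

Because the test is purely scale-relative, zooming in continuously grows segments past the one-pixel threshold one at a time instead of an entire detail level appearing at once.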

One interesting thing to note: if you scroll in to the lowest level of the spirals / polar grid, you'll find that some of the lines stop rendering even though they are added to the list of segments to be drawn. This probably has something to do with the finite precision of the number representation at some stage between JavaScript and the canvas output. I would suspect JavaScript's use of IEEE 754 doubles to be the culprit, except that the breakdown occurs at a scale of roughly only ~2^18, which seems like too small a range to be affected by that representation. Then again, maybe my thought process isn't totally sound here; IEEE 754 doubles do have only 11 bits of exponent, which may be relevant, but it's 3am and I'm done thinking about this for now.
Asteroids
A game I wrote in spring 2011. It's definitely not a complete game, but it plays roughly the way you would expect of an asteroids clone.
W/A/S/D for forward / turn left / back / turn right. Space to shoot. 1 spawns asteroids (very quickly, so don't hold it too long).
You can respawn at a cost of 1000 points; if you can't afford to respawn, you'll have to refresh to play again.
Physics Simulator
A simple physics engine I started writing in 2008 and worked on through 2010. It started as a high-school physics project but ended up being a good look into the canvas 2D drawing context and animation. The UI is really not the best (my thinking at the time was to get the physics working rather than concentrate on usability), and don't pay attention to my overly arrogant comments and silly version numbering; I thought that was cool in high school. Feel free to sigh now.