On another project with Gmunk, I did an AutoTrader spot at Black Swan that featured a generative neuron network. I got to write a voxel volume exporter for closed geometry (using even-odd ray casting), and more importantly, I started exporting my art to OBJ and FBX so that it could be rendered in a 3D app. It was amazing to see my work with glass refraction. I also got to implement mouse selection for the nuclei, and discovered a novel way to generate a circle with even segments along a straight axis using no trig functions (more on that later). Black Swan was an amazing new company to work at because it was a dream team made from all the beloved members of our previous projects. I hope they do really, really well. Proper respect to Matt Winkel, Nick Losq, Jake Sargeant, Chris Clyne, and Jacob Glaser.
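The even-odd test behind that voxelizer is easiest to see in two dimensions: cast a ray from the sample point and count how many boundary crossings it makes; an odd count means the point is inside. A voxelizer runs the same parity test per voxel center against the closed mesh, intersecting triangles instead of edges. A minimal 2D sketch of the idea (plain C++, not the production exporter):

```cpp
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

// Even-odd (crossing number) test: cast a horizontal ray from p toward +x
// and count how many polygon edges it crosses. Odd count = inside.
bool insideEvenOdd(const Pt& p, const std::vector<Pt>& poly) {
    bool inside = false;
    std::size_t n = poly.size();
    for (std::size_t i = 0, j = n - 1; i < n; j = i++) {
        const Pt& a = poly[i];
        const Pt& b = poly[j];
        // Does edge (a,b) straddle the ray's y, and is the crossing right of p?
        if ((a.y > p.y) != (b.y > p.y)) {
            double xCross = a.x + (p.y - a.y) * (b.x - a.x) / (b.y - a.y);
            if (p.x < xCross) inside = !inside; // toggle parity on each crossing
        }
    }
    return inside;
}
```

In 3D, "edges" become the mesh triangles and the parity of ray-triangle hits fills each voxel, which is why the geometry has to be closed for the test to be meaningful.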

ShyB: The Run

This was a triple music video for hip hop artist Shy B. The videos string together to form a short film. I did more traditional CG chores than I'm used to for this project. Normally, I'm hired to take care of the generative elements that need a code artist. This time, I did the 2D/3D rotos, matched shapes in Maya, painted photoshop masks, and did final compositing with the help of Becca Shostak. Shy's handwriting was added to the space, the words moving and reacting to the performer. In one shot, we replaced a license plate. In another shot, we added graffiti to the side of a moving truck.

Verizon FiOS

CHARLEX requested different kinds of wormhole animations for a TV commercial about Verizon FiOS. Video elements would be placed inside the wormhole as it progressed towards the O in FiOS. As I worked remotely, my on-site counterpart was Fabian Tejada who learned to tweak the app and produce renders from it. Fabian was wonderful to work with.

Tron Legacy

I spent a half year writing software art to generate special effects for Tron Legacy, working at Digital Domain with Bradley "GMUNK" Munkowitz, Jake Sargeant, and David "dlew" Lewandowski. This page has taken a long time to be published because I've had to await clearance. A lot of my team's work was done using Adobe software and Cinema 4D. The rest of it got written in C++ using OpenFrameworks and wxWidgets, the way I've always done it with this team ;) Uniquely however, Digital Domain's CG artists were able to port my apps over to Houdini for further evolution and better rendering than OpenGL could ever provide. Special thanks to Andy King for showing me that what seasoned CG artists do at DD is actually not so far off from what's going on in the Processing community.

In addition to visual effects, I was asked to record myself using a unix terminal doing technologically feasible things. I took extra care in babysitting the elements through to final composite to ensure that the content would not be artistically altered beyond that feasibility. I take representing digital culture in film very seriously in light of having grown up in a world of very badly researched user interface greeble. I cringed during the part in Hackers (1995) when a screen saver with extruded "equations" is used to signify that the hacker has reached some sort of neural flow or ambiguous destination. I cringed for Swordfish and Jurassic Park as well. I cheered when Trinity in The Matrix used nmap and ssh (and so did you). Then I cringed again when I saw that inevitably, Hollywood had decided that nmap was the thing to use for all its hacker scenes (see Bourne Ultimatum, Die Hard 4, Girl with Dragon Tattoo, The Listening, 13: Game of Death, Battle Royale, Broken Saints, and on and on). In Tron, the hacker was not supposed to be snooping around on a network; he was supposed to kill a process. So we went with posix kill and also had him pipe ps into grep. I also ended up using emacs eshell to make the terminal more l33t. The team was delighted to see my emacs performance -- splitting the editor into nested panes and running different modes. I was tickled that I got emacs into a blockbuster movie. I actually do use emacs irl, and although I do not subscribe to alt.religion.emacs, I think that's all incredibly relevant to the world of Tron.

HexVirus is a spherical map of the globe that features vector outlines of the continents. These continent vectors are slowly eaten away by a more hexagonal representation. Algorithmically, this is a path stepping function which looks ahead for the closest matching 60-degree turns. The HexVirus globe was used in the executive board meeting scene, and also inside the grid as a visual aid in CLU's maniacal plan presentation. In the board room interface, the globe element is surrounded by the lovely work of my team.
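The path-stepping idea can be sketched like this (my guess at a minimal version, not the production code): snap each step's heading to the nearest multiple of 60 degrees, so that a smooth continent outline gets progressively replaced by hexagonal turns.

```cpp
#include <cmath>

// Snap a heading (radians) to the nearest multiple of 60 degrees --
// the "closest matching 60-degree turn" of the stepping function.
double snapToHex(double angleRad) {
    const double step = M_PI / 3.0; // 60 degrees
    return std::round(angleRad / step) * step;
}

// One hexagonalized step from (x, y): the path advances by 'len' along
// the snapped heading instead of the outline's true heading.
void hexStep(double& x, double& y, double heading, double len) {
    double h = snapToHex(heading);
    x += std::cos(h) * len;
    y += std::sin(h) * len;
}
```

Eating the outline away is then a matter of how many of its segments have been re-stepped this way at a given frame.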

The scoreboard was the first element I worked on. I created a line generator that produced bursts of lines which turned at adjustable angles. The line generator had a "radial mode" which arranged the geometry in concentric circles. This line generator was used to produce generic elements and layers of style across different pieces, and is a GMUNK favorite. At this point, I found myself moving to multisampled FBOs because the non-antialiased polygons were just too ugly to work with, and we needed to make film-resolution renders. In fact, this is the highest res I've ever seen my apps render.
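A stripped-down version of that kind of generator might look like the following (a hypothetical reconstruction, not the app itself): emit a polyline that advances a fixed length per step and turns by an adjustable angle at every step.

```cpp
#include <cmath>
#include <vector>

struct P2 { double x, y; };

// Emit a burst line: a polyline advancing 'len' per step and turning by
// 'turn' radians each step. turn = 0 gives a straight ray; small turns give
// the angular circuit-board look; turn = 2*pi/n closes into a polygon after
// n steps -- radial mode wraps many such lines around a common center.
std::vector<P2> burstLine(P2 start, double heading, double len,
                          double turn, int steps) {
    std::vector<P2> pts{start};
    double h = heading;
    P2 p = start;
    for (int i = 0; i < steps; ++i) {
        p.x += std::cos(h) * len;
        p.y += std::sin(h) * len;
        pts.push_back(p);
        h += turn; // the adjustable turn angle
    }
    return pts;
}
```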

Fireworks, mmmm. I started with a regular physics simulation where a particle has an upward force applied at birth, sending it upward while gravity pulls it back down, resulting in a parabola. I then added particle children, followed by various artistic styles, including what our team has called "egyptian" across several jobs -- a side-stepping behavior. We were trying to create fireworks that looked enough like real fireworks but had an interesting techno-aesthetic. As an homage to the original Tron character Bit, we used icosahedrons, dodecahedrons, and similar shapes. I was disappointed that Bit isn't in this one. After doing this simulation, I've grown more aware of how often fireworks are used in movies.
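The core of that simulation is just Euler integration: an initial upward velocity, gravity subtracted each frame, position tracing a parabola. A bare-bones sketch (spawning of children and all the styling omitted):

```cpp
#include <vector>

struct Particle {
    double x = 0, y = 0;   // position
    double vx = 0, vy = 0; // velocity
};

// Advance one particle with simple Euler integration. Gravity pulls vy
// down each step, so a particle launched upward traces a parabola --
// around the apex is where the full simulation spawns particle children.
void integrate(Particle& p, double gravity, double dt) {
    p.vy -= gravity * dt;
    p.x += p.vx * dt;
    p.y += p.vy * dt;
}

// Run a particle launched with upward speed v0 and return height samples.
std::vector<double> launch(double v0, double gravity, double dt, int n) {
    Particle p;
    p.vy = v0;
    std::vector<double> heights;
    for (int i = 0; i < n; ++i) {
        integrate(p, gravity, dt);
        heights.push_back(p.y);
    }
    return heights;
}
```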

For the portal climax, the TronLines app was used, along with apps like "Twist" from our team's previous jobs. Once the look was mocked up by GMUNK, a Houdini artist recreated the rig for deeper control.

I wrote a particle renderer that could make the head holograms slurp in and out of the data discs. Special thanks to Keith Pasko for CLUing me in about using exponential functions to create a sliding-gooey sort of delay.
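That exponential trick is essentially exponential easing: each frame, cover a fixed fraction of the remaining distance to the target, so motion starts fast and slurps asymptotically into place. A hedged sketch of the idea (not Keith's actual code):

```cpp
#include <cmath>

// Exponential ease: each update covers a fixed fraction of the remaining
// distance, producing the fast-then-gooey settle of the hologram slurp.
// Repeated application shrinks the gap geometrically:
//   pos(t) = target - (target - pos0) * (1 - factor)^t
double easeToward(double current, double target, double factor) {
    return current + (target - current) * factor;
}
```

Running every particle's position through this with slightly different factors is one simple way to get the staggered, gooey delay across the whole hologram.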

For the scene where Quorra gets fixed, there was an element in the DNA interface called the Quorra Heart which looked like a lava lamp. I generated an isosurface from a Perlin-noise volume, using the marching cubes function found in the Geometric Tools WildMagic API, a truly wonderful lib for coding biodigital jazz, among other jazzes. The isosurface was then drawn along different axes, including concentric spheres. The app was mesmerizing to stare at.
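The front half of that pipeline can be sketched without the WildMagic call: sample a scalar field into a voxel grid, then find the cells whose corners straddle the iso-level -- those are exactly the cells marching cubes would triangulate. Here a simple sphere distance field stands in for the Perlin-noise volume:

```cpp
#include <cmath>
#include <vector>

// Sample a scalar field on an n^3 grid and count the cells straddling the
// iso-level -- the cells marching cubes would triangulate. The field is a
// distance-from-center function (iso = sphere radius), standing in for
// the animated noise volume of the real app.
int countSurfaceCells(int n, double iso) {
    auto field = [](double x, double y, double z) {
        double dx = x - 0.5, dy = y - 0.5, dz = z - 0.5;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    };
    std::vector<double> v((n + 1) * (n + 1) * (n + 1));
    auto at = [&](int i, int j, int k) -> double& {
        return v[(i * (n + 1) + j) * (n + 1) + k];
    };
    for (int i = 0; i <= n; ++i)
        for (int j = 0; j <= n; ++j)
            for (int k = 0; k <= n; ++k)
                at(i, j, k) = field(i / double(n), j / double(n), k / double(n));
    int cells = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            for (int k = 0; k < n; ++k) {
                bool below = false, above = false;
                for (int c = 0; c < 8; ++c) { // 8 corners of the cell
                    double s = at(i + (c & 1), j + ((c >> 1) & 1), k + ((c >> 2) & 1));
                    (s < iso ? below : above) = true;
                }
                if (below && above) ++cells; // corners straddle the iso-level
            }
    return cells;
}
```

Marching cubes then looks up a triangle configuration per straddling cell from the corner signs; that part is what the library provided.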

After this project, I was fed up enough with wxWidgets and Carbon that I was ready to author my own OpenGL-based UI. The most important widget I needed was a floating-point slider. I also got irritated with the way the Carbon sliders would not slide all the way to the minimum and maximum values. It totally messed with my zen thing. Also, after a job like this, it's clear that a member of the Processing community working within a CG community is greatly restricted by the differences between realtime graphics rendering engines, and that probably messes with an art director's zen thing.

"TRON: Legacy" © Disney Enterprises, Inc. All Rights Reserved.

LG Optimus Launch Conference

I worked with yU+Co on this spot featuring a globe of dots connecting up. I was very happy to see how creatively they composited the renders - using my simulation in ways I never expected. The system was easy to write since it resembled older work. It was great to collaborate online with yU+Co for the first time. I appreciate how tech-savvy everyone was, and I think that made the pipeline pleasant. I'm also grateful that everyone understood and was sympathetic to me being at Beit T'Shuvah.

IBM Data Baby

This project required a lot of research demo programs. The job holds the new record for most code artists (8) hired on one MTh job. Our apps began receiving animated curves from Maya, we introduced a new speed-optimized OBJ sequence file format, and we continued to accumulate Maya export scripts. At the request of director Kaan Atilla, I managed to write a bunch of C++ After Effects plugins with names like [FishBall, Stripes, SchizoPath, MeshSpikes, CurveConnector], but in the end I settled back into OpenFrameworks and wxWidgets because when you compete with an Adobe app for internal resources, the Adobe app wins. I'm also disappointed in the quality of Adobe's documentation and examples. I was put in a 'lead code artist' position, and I feel like I handled myself better this time. We learned a whole lot! Shout-outs to new algo-collaborators Jeremy Rotsztain and Tim Stutts.

Buick - Behind The Beauty

Stillshot from the TV commercial

For this project, we tried a handful of intelligent particle techniques. One of the approaches was to have me write a renderable simulator in OpenFrameworks that could fuse our specific mix of generative diagrams, numbers, shapes, and line-art into a magical fantasy breeze. The app was able to work with Motion Theory's special rotoscope and compositing pipeline, and this was really the first time I started using wxWidgets there. Because of that, the app featured enough user interface that a non-programmer could be productive with it. Four Adobe-savvy designers were able to run my app and art direct their own shots.


My friend Josh Gallant sees looping corkscrews when he closes his eyes. Now you can too with this application for fiddling with a cycling spiral form. The application features high-resolution image export and a fullscreen view. Oftentimes when I work in a production environment, I'm just making a fullscreen app with no GUI besides mouse and keyboard interaction. I used wxWidgets and added a nice native user interface to this one particularly to show Jake Sargeant that it's possible, although it takes a bit longer to code.
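Underneath, a looping corkscrew is just a helix whose radius, pitch, and number of turns are free parameters; a minimal parametric sketch (my own guess at the form, not the app's code):

```cpp
#include <cmath>
#include <vector>

struct V3 { double x, y, z; };

// Sample points along a corkscrew: a circle in x/y advancing along z.
// radius, pitch (rise per turn), and turns are the knobs to fiddle with;
// animating a phase offset on t makes the form cycle.
std::vector<V3> corkscrew(double radius, double pitch, double turns, int samples) {
    std::vector<V3> pts;
    for (int i = 0; i < samples; ++i) {
        double t = turns * 2.0 * M_PI * i / (samples - 1);
        pts.push_back({radius * std::cos(t),
                       radius * std::sin(t),
                       pitch * t / (2.0 * M_PI)});
    }
    return pts;
}
```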


Watch on Vimeo.

withDRAWal is a series of ten interactive scribble styles commissioned by Graham Peet of The Public in West Bromwich, England. I programmed and tested the whole thing at Beit T'Shuvah. Each mode responds to the scribbling input in a different way. In writing this software in C++ and OpenGL with OpenFrameworks, I found myself using a maze-solving algorithm, the Box2D physics engine, and Perlin noise. Instructions: draw with your mouse, and press the circle at the bottom left corner to move on. If you have a keyboard, press spacebar to clear the screen without moving on. Each number key corresponds to one of the ten modes. Press D to enable the debug overlay. Press - and + to adjust speed or fatness -- use at your own risk. Press S to render one screenshot to the data folder. F toggles fullscreen. Q brings back the beginning instruction screen. If you're on a Mac portable that has a sudden motion sensor, mode 5 will let you tilt the screen.
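Of the pieces named above, the maze solver is the easiest to sketch: breadth-first search over a grid, which is the generic shape of the thing -- presumably close in spirit to routing a line through drawn obstacles, though this is a textbook sketch, not the shipped code.

```cpp
#include <queue>
#include <string>
#include <vector>

// Breadth-first search over a character grid: '#' is a wall, '.' is open.
// Returns the length of the shortest 4-connected path from (sr,sc) to
// (tr,tc), or -1 if the target is unreachable.
int shortestPath(const std::vector<std::string>& grid,
                 int sr, int sc, int tr, int tc) {
    int rows = (int)grid.size(), cols = (int)grid[0].size();
    std::vector<std::vector<int>> dist(rows, std::vector<int>(cols, -1));
    std::queue<std::pair<int, int>> q;
    dist[sr][sc] = 0;
    q.push({sr, sc});
    const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!q.empty()) {
        auto [r, c] = q.front();
        q.pop();
        if (r == tr && c == tc) return dist[r][c];
        for (int d = 0; d < 4; ++d) {
            int nr = r + dr[d], nc = c + dc[d];
            if (nr >= 0 && nr < rows && nc >= 0 && nc < cols &&
                grid[nr][nc] != '#' && dist[nr][nc] < 0) {
                dist[nr][nc] = dist[r][c] + 1; // first visit is the shortest
                q.push({nr, nc});
            }
        }
    }
    return -1;
}
```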

Black Eyed Peas: Boom Boom Pow

by Keith

Code was written at Motion Theory to generate content and effects in the Black Eyed Peas' Boom Boom Pow video. Amongst this highly collaborative effort were two other code artists: Keith Pasko and Ryan Alexander. After the video was finished, Keith and I went on to collaborate again on creating custom VJ software used by VSquared Labs in the Black Eyed Peas live tour. The VJ software processes a realtime video feed, and was written in C++ OpenFrameworks. The keyboard was filled with different behavior and content controls. If we had more time, we would have connected it to a Lemur device communicating via OSC. Once again a job well done with Motion Theory, and the beginnings of a relationship with VSquared Labs.
