Projects
CLOUDS Interactive Documentary

James George and Jonathan Minard co-directed an interactive documentary on the people in my scene. I was happy to be one of the interviewees and one of the commissioned code artists. Here was the beta teaser on Kickstarter:

I was responsible for implementing two of the open source visual systems: BallDroppings and Neurons. Being a part of this project was a very high honor for me and I only regret not being more available to involve myself during its production.

More info about CLOUDS on Creative Applications

and a good interview with them.

I'm posting this long overdue portfolio entry in celebration of CLOUDS' screening at MoMA. In the tradition of these portfolio entries, which share a small bit of the process, and indulging in the openness of the project, I'll describe my experience contributing.

We met up at a medium-sized hackerspace in the Tenderloin district of San Francisco which was associated with the oooShiny tribe. James, John, and a few others had already set up shop there when I arrived from Silicon Valley. A thin layer of ephemeral network cables blanketed the tables and floors, crawling at times like vines up the walls. USB, power, video, hot-swapping in realtime to the crowd's murmur. It was a lot of MacBooks, tripods, and hoodies. Exploratorium toys were strewn about the space, as they endearingly are year round. A dusty, cracked Processing textbook lay wedged between a copy of Curtis Roads' Microsound and a half-consumed MakerBot filament case. I brought my cat Hexadecimal and let him poke around in all the boxes of miscellanea. People were walking around taking care of fifty different things. The space was teeming with life. The faint smell of smoke and curry wafted in through the Victorian windows.

I joined in at the table with matching laptop and hoodie. James got me up to speed with getting all the git submodules. The platform was already organized and ready to go, all I had to do was subclass someone's handy superclass and fill it with my artwork. It wasn't just that ofxUI was already installed – Reza was in the house. The project was very deeply OF. I was also pleased to meet Patricio Gonzalez Vivo for the first time. Gmunk paid us a visit. I believe Grey Area Foundation was resident down the hall in the same space at the time. I think the whole project is a miracle.

AutoTrader

For another project with Gmunk, I did an AutoTrader spot at Black Swan that featured a generative neuron network. I got to write a voxel volume exporter for closed geometry (using even-odd ray casting), and more importantly, I started exporting my art to OBJ and FBX so that it could be rendered in a 3D app. It was amazing to see my work with glass refraction. I also got to implement mouse-selection for the nuclei, and discovered a novel way to generate a circle with even segments along a straight axis using no trig functions (more on that later). Black Swan was an amazing new company to work at because it was a dream team made from all the beloved members of our previous projects. I hope they do really, really well. Proper respect to Matt Winkel, Nick Losq, Jake Sargeant, Chris Clyne, and Jacob Glaser.
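
For the curious, here is a minimal sketch of what I mean by even-odd ray casting, written fresh for this post rather than lifted from the production exporter; the structs, grid layout, and constants are illustrative. A ray is fired from each voxel center, crossings with the mesh are counted, and an odd count means that voxel sits inside the closed surface.

    #include <vector>
    #include <cmath>

    struct Vec3 { float x, y, z; };
    struct Tri  { Vec3 a, b, c; };

    static Vec3 sub(Vec3 p, Vec3 q)   { return { p.x - q.x, p.y - q.y, p.z - q.z }; }
    static Vec3 cross(Vec3 p, Vec3 q) { return { p.y*q.z - p.z*q.y, p.z*q.x - p.x*q.z, p.x*q.y - p.y*q.x }; }
    static float dot(Vec3 p, Vec3 q)  { return p.x*q.x + p.y*q.y + p.z*q.z; }

    // Moller-Trumbore intersection of a ray fired along +X with one triangle.
    static bool hitAlongX(Vec3 o, const Tri& t) {
        Vec3 dir = { 1, 0, 0 };
        Vec3 e1 = sub(t.b, t.a), e2 = sub(t.c, t.a);
        Vec3 p = cross(dir, e2);
        float det = dot(e1, p);
        if (std::fabs(det) < 1e-8f) return false;      // ray parallel to triangle
        float inv = 1.0f / det;
        Vec3 s = sub(o, t.a);
        float u = dot(s, p) * inv;
        if (u < 0 || u > 1) return false;
        Vec3 q = cross(s, e1);
        float v = dot(dir, q) * inv;
        if (v < 0 || u + v > 1) return false;
        return dot(e2, q) * inv > 0;                   // only count hits in front of the origin
    }

    // For each voxel center, count crossings; an odd count means "inside".
    std::vector<bool> voxelize(const std::vector<Tri>& mesh, int nx, int ny, int nz, float cell) {
        std::vector<bool> inside(nx * ny * nz, false);
        for (int z = 0; z < nz; z++)
        for (int y = 0; y < ny; y++)
        for (int x = 0; x < nx; x++) {
            Vec3 c = { (x + 0.5f) * cell, (y + 0.5f) * cell, (z + 0.5f) * cell };
            int hits = 0;
            for (const Tri& t : mesh) if (hitAlongX(c, t)) hits++;
            inside[(z * ny + y) * nx + x] = (hits % 2) == 1;   // even-odd rule
        }
        return inside;
    }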

Verizon FiOS

CHARLEX requested different kinds of wormhole animations for a TV commercial about Verizon FiOS. Video elements would be placed inside the wormhole as it progressed toward the O in FiOS. Since I worked remotely, my on-site counterpart was Fabian Tejada, who learned to tweak the app and produce renders from it. Fabian was wonderful to work with.

Tron Legacy

I spent a half year writing software art to generate special effects for Tron Legacy, working at Digital Domain with Bradley "GMUNK" Munkowitz, Jake Sargeant, and David "dlew" Lewandowski. This page has taken a long time to be published because I've had to await clearance. A lot of my team's work was done using Adobe software and Cinema 4D. The rest of it got written in C++ using OpenFrameworks and wxWidgets, the way I've always done it with this team ;) Uniquely however, Digital Domain's CG artists were able to port my apps over to Houdini for further evolution and better rendering than OpenGL could ever provide. Special thanks to Andy King for showing me that what seasoned CG artists do at DD is actually not so far off from what's going on in the Processing community.

In addition to visual effects, I was asked to record myself using a unix terminal doing technologically feasible things. I took extra care in babysitting the elements through to final composite to ensure that the content would not be artistically altered beyond that feasibility. I take representing digital culture in film very seriously, having grown up in a world of very badly researched user interface greeble. I cringed during the part in Hackers (1995) when a screen saver with extruded "equations" is used to signify that the hacker has reached some sort of neural flow or ambiguous destination. I cringed for Swordfish and Jurassic Park as well. I cheered when Trinity in The Matrix used nmap and ssh (and so did you). Then I cringed again when I saw that inevitably, Hollywood had decided that nmap was the thing to use for all its hacker scenes (see Bourne Ultimatum, Die Hard 4, Girl with Dragon Tattoo, The Listening, 13: Game of Death, Battle Royale, Broken Saints, and on and on). In Tron, the hacker was not supposed to be snooping around on a network; he was supposed to kill a process. So we went with posix kill and also had him pipe ps into grep. I also ended up using emacs eshell to make the terminal more l33t. The team was delighted to see my emacs performance -- splitting the editor into nested panes and running different modes. I was tickled that I got emacs into a blockbuster movie. I actually do use emacs irl, and although I do not subscribe to alt.religion.emacs, I think that's all incredibly relevant to the world of Tron.

HexVirus is a spherical map of the globe that features vector outlines of the continents. These continent vectors are slowly eaten away by a more hexagonal representation. Algorithmically, this is a path stepping function which looks ahead for the closest matching 60-degree turns. The HexVirus globe was used in the executive board meeting scene, and also inside the grid as a visual aid in CLU's maniacal plan presentation. In the board room interface, the globe element is surrounded by the lovely work of my team.
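
Roughly sketched from memory, the stepping looks something like this; the step size and the exact way the lookahead is handled are illustrative, not the production code. Each step is taken along whichever multiple of 60 degrees is closest to the true heading toward the next vertex of the continent outline.

    #include <vector>
    #include <cmath>

    struct Pt { float x, y; };

    // Re-walk a polyline, but only along headings that are multiples of 60 degrees,
    // always picking the allowed heading closest to the true direction to the next vertex.
    std::vector<Pt> hexify(const std::vector<Pt>& path, float step) {
        std::vector<Pt> out;
        if (path.empty()) return out;
        Pt cur = path.front();
        out.push_back(cur);
        const float sixty = 3.14159265f / 3.0f;
        for (size_t target = 1; target < path.size(); target++) {
            while (std::hypot(path[target].x - cur.x, path[target].y - cur.y) > step) {
                float want = std::atan2(path[target].y - cur.y, path[target].x - cur.x);
                float snapped = std::round(want / sixty) * sixty;   // closest 60-degree turn
                cur.x += step * std::cos(snapped);
                cur.y += step * std::sin(snapped);
                out.push_back(cur);
            }
        }
        return out;
    }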

The scoreboard was the first element I worked on. I created a line generator that produced bursts of lines which turned at adjustable angles. The line generator had a "radial mode" which arranged the geometry in concentric circle form. This line generator was used to produce generic elements and layers of style across different shots, and is a GMUNK favorite. At this point, I found myself moving to multisampled FBOs because the non-antialiased polygons were just too ugly to work with, and we needed to make film-resolution renders. In fact, this is the highest res I've ever seen my apps render.
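
In present-day openFrameworks, asking for a multisampled FBO looks roughly like this (back then it took more manual setup); the resolution and sample count here are just placeholders, not the film pipeline's.

    ofFbo fbo;
    ofPixels pixels;

    void setup() {
        // the last argument requests multisampling, so polygon edges come out antialiased
        fbo.allocate(4096, 2160, GL_RGBA, 8);
    }

    void draw() {
        fbo.begin();
        ofClear(0, 0, 0, 255);
        // ... draw the line bursts here ...
        fbo.end();

        fbo.draw(0, 0, ofGetWidth(), ofGetHeight());   // scaled preview on screen
        fbo.readToPixels(pixels);                      // grab the full-res frame
        ofSaveImage(pixels, "frame_" + ofToString(ofGetFrameNum()) + ".png");
    }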

Fireworks, mmmm. I started with a regular physics simulation where a particle has an upward force applied at birth, sending it upward while gravity pulls it back down, resulting in a parabola. I then added particle-children, followed by various artistic styles, including what our team has called "egyptian" across several jobs -- which is a side-stepping behavior. We were trying to create fireworks that looked enough like real fireworks but still had an interesting techno-aesthetic. As an homage to the original Tron character Bit, we used icosahedrons, dodecahedrons, and similar shapes. I was disappointed that Bit isn't in this one. After doing this simulation, I've grown more aware of how often fireworks are used in movies.
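
Stripped of all the styling, the core of the firework is just this kind of thing; the burst time, child count, and y-up gravity below are illustrative simplifications rather than the shipped effect.

    #include <vector>

    // one firework particle and its burst of children
    struct Particle {
        ofVec3f pos, vel;
        float age = 0;
        int generation = 0;                     // only the parent shell bursts
        std::vector<Particle> children;

        void update(float dt) {
            vel.y -= 9.8f * dt;                 // gravity pulls the parabola back down (y-up here)
            pos += vel * dt;
            age += dt;
            if (generation == 0 && age > 1.5f && children.empty()) {
                // burst: children start where the parent is and scatter outward
                for (int i = 0; i < 30; i++) {
                    Particle c;
                    c.generation = 1;
                    c.pos = pos;
                    c.vel = vel + ofVec3f(ofRandom(-3, 3), ofRandom(-3, 3), ofRandom(-3, 3));
                    children.push_back(c);
                }
            }
            for (auto& c : children) c.update(dt);
        }
    };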

For the portal climax, the TronLines app was used, but also apps like "Twist" from our team's previous jobs. Once the look was mocked up by Gmunk, a Houdini artist recreated the rig for deeper control.

I wrote a particle renderer that could make the head holograms slurp in and out of the data discs. Special thanks to Keith Pasko for CLUing me in about using exponential functions to create a sliding-gooey sort of delay.
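
The gist, as I remember it, is the fraction-of-remaining-distance trick -- each frame the particle covers a fixed fraction of whatever distance is left, which gives the gooey slurp without an explicit duration. A tiny sketch (the easing constant is illustrative):

    // each frame, cover a fixed fraction of the remaining distance to the target
    ofVec3f pos, target;
    float ease = 0.08f;                 // illustrative fraction per frame

    void update() {
        pos += (target - pos) * ease;   // exponential approach: fast at first, gooey near the end
    }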

For the scene where Quorra is repaired, there was an element in the DNA interface called the Quorra Heart, which looked like a lava lamp. I generated an isosurface from a perlin-noise volume, using the marching cubes function found in the Geometric Tools WildMagic API, a truly wonderful lib for coding biodigital jazz, among other jazzes. The isosurface was then drawn along different axes, including concentric spheres. The app was mesmerizing to stare at.
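
Filling the scalar volume is the easy half; a minimal sketch is below, with an illustrative grid size and noise frequency. The marching cubes extraction itself came from WildMagic, so I won't reproduce that part here.

    #include <vector>

    const int N = 64;                       // illustrative grid resolution
    std::vector<float> volume(N * N * N);

    void fillVolume(float t) {
        float s = 0.05f;                    // noise frequency
        for (int z = 0; z < N; z++)
        for (int y = 0; y < N; y++)
        for (int x = 0; x < N; x++) {
            // animate the lava-lamp blobs by sliding the noise field over time
            volume[(z * N + y) * N + x] = ofNoise(x * s, y * s, z * s + t);
        }
        // marching cubes (from WildMagic) then pulls a surface out at, say, 0.5
    }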

After this project, I was fed up enough with wxWidgets and Carbon that I was ready to author my own OpenGL-based UI. The most important thing I needed was a floating-point slider. I also got irritated with the way the Carbon sliders would not slide all the way to the minimum and maximum values. It totally messed with my zen thing. Also, after a job like this, it's clear that a member of the Processing community working within a CG community is greatly restricted by the differences between realtime graphics rendering engines and offline renderers, and that probably messes with an art director's zen thing.

"TRON: Legacy" © Disney Enterprises, Inc. All Rights Reserved.

LG Optimus Launch Conference

I worked with yU+Co on this spot featuring a globe of dots connecting up. I was very happy to see how creatively they composited the renders - using my simulation in ways I never expected. The system was easy to write since it resembled older work. It was great to collaborate online with yU+Co for the first time. I appreciate how tech-savvy everyone was, and I think that made the pipeline pleasant. I'm also grateful that everyone understood and was sympathetic to me being at Beit T'Shuvah.

IBM Data Baby

This project required a lot of research demo programs. The job holds the new record for the most code artists (eight) hired on one MTh job. Our apps began receiving animated curves from Maya, we introduced a new speed-optimized OBJ sequence file format, and we continued to accumulate Maya export scripts. At the request of director Kaan Atilla, I managed to write a bunch of C++ After Effects plugins with names like [FishBall, Stripes, SchizoPath, MeshSpikes, CurveConnector], but in the end I settled back into OpenFrameworks and wxWidgets because when you compete with an Adobe app for internal resources, the Adobe app wins. I'm also disappointed in the quality of Adobe's documentation and examples. I was put in a 'lead code artist' position and I feel like I handled myself better this time. We learned a whole lot! Shout outs to new algo-collaborators Jeremy Rotsztain and Tim Stutts.

Buick - Behind The Beauty

Stillshot from the TV commercial

For this project, we tried a handful of intelligent particle techniques. One of the approaches was to have me write a renderable simulator in OpenFrameworks that could fuse our specific mix of generative diagrams, numbers, shapes, and line-art into a magical fantasy breeze. The app was able to work with Motion Theory's special rotoscope and compositing pipeline, and this was really the first time I started using wxWidgets there. Because of that, the app featured enough user interface that a non-programmer could be productive with it. Four Adobe-savvy designers were able to run my app and art direct their own shots.

Mekanism: Need for Speed Undercover flash

I worked at Mekanism in San Francisco for a couple months, mostly programming in ActionScript 3 with Papervision3D for a few things. The front page of "Need for Speed Undercover" is a Papervision panorama with interactive elements. I'm impressed with Papervision, but it still has all the eccentricities of Flash-based anything. It was easy to get into after having been so into OpenGL all this time. There were plenty of example apps to show the art director.

LG Advanced Learning

My part in this project was inspired by a piece, "pinch," which I had been emailing around at the time. I worked with Jake Sargeant, 3D artists, and director Carl Erik Rinsch at Digital Domain to create the light effects for this commercial about little robots who have a party while the owner is away. I was able to write several OpenGL applications in C++ which allowed Jake to tweak parameters and render the frames. Giving Jake this amount of control made my life more convenient and gave us more creative options.

Gatorade - Inside Crosby

For this rich scene of what's going on inside an athlete's head, Mark Kudsi had me writing C++ code to generate slowly growing neurons into a stretched-screen 'topiary'. An ad hoc rendering cluster was used to quickly render randomly seeded versions as we evolved the style into the fantasy vision displays.

HP Paulo Coelho

For Paulo, finger trails of personal photos dissipate like horse hair underwater. My C++ JImage object was born: an object that allows pixel addressing and can also update its display-list-cached textured unit rect. I later based a hair style on this aesthetic. It was a pleasure to collaborate on this spot because everything was so warm and natural.
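
A rough sketch of the JImage idea, reconstructed rather than copied from the original: a pixel buffer you can poke directly, with a cached texture that only re-uploads when the pixels have changed, drawn as a unit rect. The class name and the use of ofTexture here are illustrative.

    #include <vector>

    class PixelRect {
    public:
        PixelRect(int w, int h) : w(w), h(h), pixels(w * h * 4, 0), dirty(true) {
            tex.allocate(w, h, GL_RGBA);
        }
        // direct pixel addressing into the RGBA buffer
        void setPixel(int x, int y, unsigned char r, unsigned char g,
                      unsigned char b, unsigned char a) {
            unsigned char* p = &pixels[(y * w + x) * 4];
            p[0] = r; p[1] = g; p[2] = b; p[3] = a;
            dirty = true;                              // mark the cached texture stale
        }
        void draw() {
            if (dirty) { tex.loadData(pixels.data(), w, h, GL_RGBA); dirty = false; }
            tex.draw(0, 0, 1, 1);                      // textured unit rect, scaled by the caller
        }
    private:
        int w, h;
        std::vector<unsigned char> pixels;
        ofTexture tex;
        bool dirty;
    };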

screenshot of mouse toy stretching a woman's back

Modest Mouse Dashboard

This music video turned out to be a unique family effort, rewarding in the end. A couple of the island shots feature the pop celeb's face rendered by our custom After Effects plugin, Pixel F. Meanwhile, indoors, Gabriel takes a simple springy spider web I had set up and activates it artistically using pointer bugs and interpolation error. The JChain addiction is born. Special l33t props go out to Gabe as well for his MEL fish fin wave behavior!

Bud Select - Just a Game

For this Super Bowl spot, I had another amazing collaborative coding experience with Gabe Dunne. The approach was different in high-level regards: our generative renders were being used as concepts for other artists to flesh out into several shots and variations. We were also coasting on a sufficient body of pre-written code, so pulling up old building blocks and combining them experimentally was at our fingertips. Because of path-smoothing politics left over from previous projects, we started the JPath object, featuring smooth(int). This class will change the future. At least one project into the future.
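
A hedged sketch of what a smooth(int) pass might look like -- a simple neighbor-averaging relaxation applied a given number of times. This is the general idea rather than the actual JPath source; the struct name is illustrative.

    #include <vector>

    struct JPathSketch {
        std::vector<ofVec2f> points;

        void smooth(int iterations) {
            for (int it = 0; it < iterations; it++) {
                std::vector<ofVec2f> next = points;
                for (size_t i = 1; i + 1 < points.size(); i++) {
                    // each interior point relaxes toward the average of itself and its neighbors
                    next[i] = (points[i - 1] + points[i] + points[i + 1]) / 3.0f;
                }
                points = next;
            }
        }
    };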

wind graph

shuffling cards interface

voice volume indicators

Nike 'One'

Motion Theory's workflow on the Nike 'One' commercials did not just benefit from an applet programmed by one of the team members. This time, the core workflow became a team of four visual programmers risking carpal tunnel to generate a diversity of floating engineering graphics using every trick in the book. Although the production process was organic and artistic, the team collaborated well - subclassing a common object-oriented superclass, using common (custom) rendering frameworks, establishing file format protocols for shared data, and using versioning systems. The result was a complex yet delicate and tasteful swarm of diagrams and math floating around the heads of thinkers - interacting with the physical, emotional, and narrative surroundings. A demo applet was also published in Processing.org exhibitions. Processing Artists:

HP Pharrell

I had the great honor of doing (more) algorithmic particle artistry with Motion Theory to produce this stunning new HP commercial, for their "The computer is personal again" campaign. Spinning spools of typographic smoke, shaking the pixels off shoes, and of course, a swarm of gratuitous abstract cool stuff -- these composite effects were animated primarily in Processing code with reinforcement C++ coding when needed. Thanks to Gabriel Dunne for doing some of the satellites. Unlike Nike 'One', this was quite a chromatic job. I have since gained a personal relationship with the colors magenta (FF00FF) and lime (00FF00). Special thanks to Mark Kudsi and Mathew Cullen, whose talents were absolutely essential to this wonderful project. This is what baby looks like.