Presenting “The Tree of Life” to The Academy

The awards season earlier this year was a very tense time.  I never felt comfortable posting about it because I didn’t want to toot my own horn too loudly.  It was all a little too big.

“The Tree of Life” movie poster, care of Wikipedia.

This past January, I was honored to present “The Tree of Life” to the VFX Branch of The Academy at their yearly bake-off.  We were one of ten films singled out to compete for the nomination for the Academy Award for Best Visual Effects.

I wrote a short, three-page piece that was part of our overall written submission.  I was up on stage at The Academy Theater, along with Dan Glass and Mike Fink, to answer questions after our highlight reel was screened, and I even had the chance to field a question in the very short two-minute Q&A that followed.  To be clear:  I was personally up there for nomination for an Academy Award.  That honor has so overwhelmed me that I’ve kind of avoided flaunting it to the extent I probably should have.  How do you do that without just seeming like a totally self-absorbed jerk?

Ultimately, we did not make it to the next phase, which was the nomination for the “Best Visual Effects” award itself.  However, the film did get nominated for “Best Picture.”

At this point, I’m very comfortable with the reality that our “Universe” sequence is a primary component of the film.  Were it removed, it would greatly change the character and value of the film overall.  I’m also very comfortable claiming both creative and technical authorship of our piece along with my collaborators, Dan Glass, Mike Fink and Doug Trumbull, under Terry’s direction.  And I couldn’t be prouder of our team and work.

I spent over two years working directly with Terry, in Austin, Texas, on the project.  I was afforded the rare opportunity to take the “Universe” sequence from Terry’s words on the page, through creative development, and to final composite and conform.  I’m not sure I’ll ever be that involved in a project in such totality again, unless it’s my own.

As with any creative project I get that deeply involved in, there are all kinds of things that I’d want to change.  It’s never finished.  It just happens to be done.

Tree of Life Trailer Released

The trailer for The Tree of Life is out and about.

I spent over two years working on this film for Terry and couldn’t be prouder of its impending release.  I will receive a Digital Effects Supervisor credit on the final film.

I am also extremely proud of the VFX team as a whole and the work that has been accomplished.  I am grateful to Dan Glass, the Visual Effects Supervisor, for the opportunity to work with him on such a worthwhile piece.

Terry is famous for being tight-lipped and letting the work speak.  I’ll honor that stance.

Stop reading.  Go watch.

New focus on FIE

I have left my role as Director of Software and Pipeline at Method Studios in order to focus on FIE.

This is a really exciting time for me and I’m looking forward to dedicating myself full time to technology and consulting under the FIE banner.

Method and I have discussed a continuing consulting arrangement and hopefully we will continue to work together in a new capacity going forward.

Staff at Method Studios

Actually, this is old news.  As of January 2010, I took a staff position at Method Studios.

For FIE, this means the consulting and software facets of my company are going into a bit of a frozen state.  I will continue to develop software and tools in some capacity outside of Method, but will most likely limit that work to open-source tools.

The creative development half of the company is still moving forward, however slowly…

VFX World reports on the Iron Man previs effort

VFX World has published an article on previs in this summer’s blockbuster movies.  Leading off the article is Iron Man.  Kent Seki, the previs and HUD effects supervisor, gives a good window into what we did and how we did it.  He also details quite thoroughly the tools and techniques I developed for the film.  I gave a similar account on my blog a few weeks ago.

There are a number of videos embedded in the article.  However, this one is a surprise.  It is one of my range-of-motion tests from early in the suit design process.

Hulk Smash Previs

PLF’s short summary of the effort on “The Incredible Hulk” is a good primer on what was done for the film.  My good friend Erol Gunduz did a great job of stepping up and taking on the main TD burden on the job.  My contribution was in the areas of training, tools development, and some mocap supervision in Toronto.

Iron Man Released

Iron Man finally hit theaters, so I finally get to talk openly about it and about what I did on it.

If you watch the credits, under Pixel Liberation Front, you’ll find me credited as Brad Friedman, Technical Artist.  This is actually a mistake.  My credit is supposed to be Technical Lead, as my supervisor submitted it.  I’ll assume it was a transcription error and not a willful demotion.  If I’m really lucky it might be fixed for the DVD release, but that’s pretty unlikely.

So, what exactly did I do on the film?  What does previs do on films like that?  How does it relate to FIE?  How will it help my clients?

Well, I’ll answer the last questions first.  On Iron Man, I developed a brand new previsualization pipeline for Pixel Liberation Front, in Autodesk Maya.  PLF has, in the past, been primarily an XSI-based company.  Though they have worked in Maya before, it has usually been for short jobs (commercials and the like) or for working within someone else’s custom Maya pipeline (such as working with Sony Imageworks on Spider-Man 2).  This was the first time PLF was going to take on a modern feature film as the lead company, completely under its own sails, in Maya.  And it was my job to make it possible.

The tools and techniques I developed were the backbone of the previsualization department on Iron Man for the two years PLF was actively on the project, even after I left the film.  The pipeline also moved over to “The Incredible Hulk,” as we had a second team embedded in that production.  I’ve been talking to PLF about expanding their pipeline further in the coming months for upcoming projects.  All of those designs and ways of working are currently being updated and reimplemented into my licensable toolkit, and they will be available to my clients.

I’ll get into the specifics of the tools and designs as I describe the challenges of the production further.

When we were first approached by Marvel and the production, one of the requirements for being awarded the job was that we work in Maya.  We quickly evaluated whether we could do it, and the answer was “yes.”  Very quickly we were awarded the job and sent our first team to the production offices, embedded in Marvel’s offices in Beverly Hills.  The team consisted of Kent Seki, the previsualization supervisor; Mahito Mizobuchi, our asset artist and junior animator; and myself, the technical director and artist.

My first major task was to help the designers in the art department with the Iron Man suit design.  At the time, there was no visual effects vendor, and the hope was that I could do visual effects development and vetting with the art department, avoiding the need to hire a VFX vendor early in the production.  John Nelson, the VFX supervisor, was particularly concerned with the translation of the designs on paper into moving, three-dimensional armor.  He didn’t want obvious interpenetrations and obvious mechanical impossibilities to be inherent in the design.  He wanted the range of motion evaluated and enhanced.  He needed us to be able to build the art department designs in 3D and put them through movement tests.  He needed us to suggest solutions to tricky problems.  All that weight fell on my shoulders, with the assistance of Mahito’s modeling skills.

For a number of months, we rebuilt the key trouble areas again and again as the design was iterated back and forth between us and the art department.  The abdomen and the shoulder/torso/arm areas were where we focused most of our time. I rigged a fully articulated version (metal plate for metal plate) of the trouble areas about once a week, trying to solve new and old problems different ways each time.

Finally, after a few months of work, we arrived at what has stuck as the final articulation and engineering of the suit.  The suit has gone through many superficial redesigns and reproportions since then, but its mechanical design was sound and survived all the way to the screen.  The design eventually went to Stan Winston Digital for further work and to be built as a practical suit.  It then went to ILM for final VFX work, once they were awarded the show.

I think what this aspect of the project shows is a very smart VFX Supervisor and VFX Producer figuring out how best to utilize a top-notch previs team to save money and get experts in the right places, without having to pay a premium for a full VFX company early in the production process.  This allowed them to go through a full bidding process for the VFX vendors, without being rushed into an early decision, as their VFX needs were met by us.  To give you an idea of how early in the process this was:  Robert Downey Jr. was not cast at this point.  The production designer started work a week after we did.  The script was not complete.  The Director of Photography was not chosen.  The main villain in the film was NOT Obadiah Stane at the time.  This is really early, and ultimately the final decisions on VFX vendors didn’t come until about a year later.

Once our initial suit design chores were handled, I moved onto the main previs effort and we started working on sequences of the film.

Previs has some very special requirements.  It has to be done fast.  It has to look good.  It has to be cheap.  I think it’s the most demanding type of animation project there is these days.  On the triangle of necessity, it hits all three points: fast, good, and cheap.  Usually you can pick only two of those to accomplish, at the expense of the third.

Historically, previs got away with skimping on the “good.”  Characters would slide along the ground, making no effort to pump their legs.  It was pure blocking.  It was fine, though, because the previs was a technical planning tool.  The aesthetics were irrelevant as long as the previs was technically accurate enough to plan with.  However, as previs has become more of an aesthetic tool and less of a purely technical tool, the quality of animation has had to increase exponentially.  It has gotten to the point that on a lot of shows, we’re doing first-pass animation.  And in the extreme, even first-pass animation is not enough to convey real human acting.  We are expected to be able to generate real human motion without giving up on speed or price.  Some might find this antithetical to the concept of previs (in fact, a lot of people do), but our reality as a previs company is that “the customer is always right,” and if the customer decides they want it, we need to be able to provide it.

So, to sum it up, I was the only experienced Maya user on the team, and I needed to hit all three points of the triangle of necessity: fast, cheap, and (exceptionally) good.

By the end of my time on Iron Man and Hulk, I had created the following:

  • Multi Rig Marionette System
    • One character can have any number of control rigs in a given scene.  The character can blend from rig to rig as the animator pleases.  Adding more rigs and characters to the scene is handled via an integrated GUI.  Control rigs can be of any type and can therefore use any animation source, from hand animation to mocap.  Think hard about the doors that opens up: a rig devoted to a walk cycle, a rig devoted to a run cycle, and a rig to animate on by hand, all connected to the same character, with the freedom to add as many other rigs as you need to make the scene work.  (A minimal sketch of the blending mechanism follows this list.)
    • Accelerated skinning based on a geometry-influence system integrated into the Marionette.
  • Motion capture Integration
    • Accelerated pipeline for moving raw motion capture from MotionBuilder onto any number of previs characters in a batch-oriented fashion.  (A simplified batch loop is sketched after this list.)
    • Motion Capture editing rig that takes mocap as a source and gives the animator an array of FK and IK offset controls with which to modify the mocap.
  • Auto Playblast/Capture/Pass System
    • A fully functional settings system that allows assets to be populated with “terminal strips” and passes to be populated with “settings” for those terminal strips, which control aspects of the asset via connections, expressions, etc.  Activating a pass applies its settings to any matching terminal strips in the scene.  (A toy version of this idea is sketched after this list.)
    • Passes can be members of any number of shots in the scene.
    • Passes can optionally activate associated render layers.  In this way, I’ve incorporated the existing render layers system into my own (if the new render layers ever become stable enough to use in production, I might actually condone their use)
    • A shot references a camera and a number of passes, and keeps information on resolution, aspect ratio, shot, sequence, version, frame range, etc.
    • Smart playblast commands capture the selected shots, the selected passes within shots, etc.  One command sets up the playblasts itself and fires them off one by one.  Go get coffee.
  • Guide and AutoRig solutions
    • A custom biped guide allows easy joint placement and alignment.
    • A very comprehensive rig based on the ZooCST scripts is automatically generated from the guide.
    • Motion capture rigs suitable for moving into MotionBuilder are generated off of the guide.
    • Motion capture editing rigs suitable for use in Maya are generated off of the guide.
  • An exceptional Goat rig
    • I made a really good goat rig.  Which didn’t really end up getting used.  But it was a spectacular rig.  Really.
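
To make the multi-rig Marionette idea more concrete, here is a minimal sketch of the core blending trick, assuming a Maya/Python (maya.cmds) environment.  This is not the actual Marionette code, just an illustration of one way to blend a single skeleton between two control rigs by wiring parent-constraint weights to one “rig blend” attribute; the control, joint, and rig names are hypothetical.

```python
# Minimal sketch (not the production Marionette system): blend one character
# skeleton between two control rigs by weighting parent constraints.
# Assumes maya.cmds; all node and attribute names are hypothetical.
import maya.cmds as cmds

def setup_rig_blend(skeleton_joints, rig_a_joints, rig_b_joints,
                    control="marionette_CTL"):
    """Constrain each skeleton joint to the matching joint on both rigs and
    drive the two constraint weights from a single 0..1 blend attribute."""
    if not cmds.objExists(control):
        control = cmds.spaceLocator(name=control)[0]
    if not cmds.attributeQuery("rigBlend", node=control, exists=True):
        cmds.addAttr(control, longName="rigBlend", attributeType="double",
                     minValue=0.0, maxValue=1.0, defaultValue=0.0,
                     keyable=True)

    # A reverse node supplies (1 - blend) for the first rig's weight.
    rev = cmds.createNode("reverse", name="rigBlend_REV")
    cmds.connectAttr(control + ".rigBlend", rev + ".inputX")

    for jnt, a, b in zip(skeleton_joints, rig_a_joints, rig_b_joints):
        con = cmds.parentConstraint(a, b, jnt, maintainOffset=True)[0]
        w_a, w_b = cmds.parentConstraint(con, query=True,
                                         weightAliasList=True)
        cmds.connectAttr(rev + ".outputX", "%s.%s" % (con, w_a))   # rig A
        cmds.connectAttr(control + ".rigBlend", "%s.%s" % (con, w_b))  # rig B
    return control
```

In the real system there is an integrated GUI, support for arbitrarily many rigs per character, and accelerated skinning on top of this, but the constraint-weight wiring above is the basic mechanism being described.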
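The batch side of the mocap pipeline is easier to picture with a sketch as well.  The following is a simplified, hypothetical illustration of the idea (the production pipeline went through MotionBuilder and was considerably more involved): for each shot, start a fresh scene, reference the previs character, import the mocap take into its own namespace, constrain matching joints by name, and save out a per-shot scene.  Paths, namespaces, and joint names below are placeholders.

```python
# Simplified sketch of a batch mocap-transfer loop (illustrative only).
# Assumes maya.cmds; file paths, namespaces, and joint names are hypothetical.
import maya.cmds as cmds

CHARACTER_RIG = "/assets/hero_previs_rig.ma"
TAKES = {
    "shot_010": "/mocap/shot_010_take.ma",
    "shot_020": "/mocap/shot_020_take.ma",
}
JOINTS = ["hips", "spine", "chest", "head",
          "l_shoulder", "l_elbow", "l_wrist",
          "r_shoulder", "r_elbow", "r_wrist"]

def build_shot(shot, take_path, out_dir="/previs/shots"):
    """Fresh scene: reference the character, import the take, constrain the
    character's joints to the matching mocap joints, then save the shot."""
    cmds.file(new=True, force=True)
    cmds.file(CHARACTER_RIG, reference=True, namespace="hero")
    cmds.file(take_path, i=True, namespace="mocap")
    for jnt in JOINTS:
        src, dst = "mocap:" + jnt, "hero:" + jnt
        if cmds.objExists(src) and cmds.objExists(dst):
            cmds.parentConstraint(src, dst, maintainOffset=True)
    cmds.file(rename="%s/%s_mocap_v001.ma" % (out_dir, shot))
    cmds.file(save=True, type="mayaAscii")

for shot, take_path in sorted(TAKES.items()):
    build_shot(shot, take_path)
```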
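Finally, the pass and playblast system is easier to explain with a toy example.  The sketch below (again maya.cmds, with hypothetical attribute, shot, and camera names) shows the basic shape of the idea rather than the production tool: a pass is a set of values for terminal-strip attributes, applying a pass pushes those values onto any asset in the scene that exposes them, and a batch command walks the selected shots and fires off one playblast per pass.

```python
# Toy illustration of the pass / "terminal strip" idea (not the production
# system).  Assumes maya.cmds; attribute, shot, and camera names are
# hypothetical.
import maya.cmds as cmds

PASSES = {
    "beauty": {"ts_suitVisibility": 1, "ts_hudVisibility": 1},
    "layout": {"ts_suitVisibility": 1, "ts_hudVisibility": 0},
}
SHOTS = {
    "im_040_0120": {"camera": "shotCam_0120", "start": 1001, "end": 1096,
                    "passes": ["beauty", "layout"]},
}

def apply_pass(name):
    """Push a pass's settings onto every matching terminal-strip attribute."""
    for attr, value in PASSES[name].items():
        for node in cmds.ls(type="transform"):
            if cmds.attributeQuery(attr, node=node, exists=True):
                cmds.setAttr("%s.%s" % (node, attr), value)

def playblast_shots(shot_names, out_dir="/previs/playblasts"):
    """Playblast every pass of every requested shot, one after another."""
    for shot in shot_names:
        info = SHOTS[shot]
        cmds.lookThru(info["camera"])   # point the active view at the shot camera
        for pass_name in info["passes"]:
            apply_pass(pass_name)
            cmds.playblast(startTime=info["start"], endTime=info["end"],
                           filename="%s/%s_%s" % (out_dir, shot, pass_name),
                           forceOverwrite=True, viewer=False, percent=100)
```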

Beyond the tools I built directly for the previs jobs, Iron Man also opened the door for us when it came to motion capture.  The production had scheduled a series of motion capture shoots with which to test out the suit designs and see how they moved.  We made it a point to capture some action for the previs while we were there.  Our previs characters were already loaded into the systems, since they were what the production was using to test out the suit design.  We used some of the mocap for the previs on Iron Man in a more traditional MotionBuilder pipeline that was functional, but slow.  I immediately realized that mocap could be the solution to the better-faster-cheaper problem, with the bottlenecks sorted out by custom tech, and I moved in that direction.  Within a year, PLF had bought a Vicon mocap system.  I set to the task of making it work for previs, and within the year we were in full swing doing previs mocap for “The Incredible Hulk,” due out later this summer.

The remaining question is: do these technologies have relevance outside of previs?  And the answer is yes.  All of these technologies were built to target both previs animation and production animation.  They’re generic in nature.  We have used all of these systems on finished render jobs, game trailers, and the like.  What I’ve built is a better way to animate that is rig agnostic.  All the old animation techniques and rigs still work; I’ve just incorporated them all into a higher-level system.  I then built a high-volume motion capture pipeline as one potential animation source in that higher-level system.

But it doesn’t stop there.  In reality, I only really got to develop about half of the systems and tools I felt should be a part of the pipeline.  In their new incarnations, the tools will be even more functional.

I saw the movie the other night and it was a real thrill to see all our shots and sequences finished and put together.  A number of my own shots made it through the gauntlet and into the final film.  More than that, it was great to see people enjoying it.  It felt good knowing that I had, in a very real way, built a large section of the foundation for it.  My tools allowed Jon Favreau to work on the film in a more fluid and intuitive manner through the previs team, which produced an unprecedented amount of previs at an astounding quality.  The more previs we were able to do, the more revisions and iterations Jon could make before principal photography.  And the film is better for it.

Mocap in Previs

This VFX World article on Motion Capture in previs features our work at PLF.  It also features a picture of me in a mocap suit with a sword.  Can’t beat that.

Alien vs Predator 2

PLF’s online breakdown of previs on AvP2 showcases our last-minute previs effort and how it ended up on film.  Jonathan Roybal, Tanissa Potrovitza, and I put in some late nights late in the game to pull this off, but it’s pretty clear from the breakdown that it was of significant use to the production, as it was pretty much shot as-is.