January 26, 2006

Pixar, Firefox and AOL Explorer

So I meant this to be more timely, but whattya gonna do?

This is really more a story about software developers and our mindsets (good and bad) than it is about Pixar, per se, but as I said, it's
almost timely :)

A few years ago, a few friends and I were at SIGGRAPH, and we attended the Ray Tracing Roundup (yes, yes, I know, we are très cool - you know you wanna hang with us). This was still in the early(ish) days of 3D HW acceleration; it was around and games were doing a lot with it, but it was still new, and a lot of the discussion was about software vs. hardware - scanline renderers vs. raytracers, etc.

The folks from Pixar, of course, were the dudes - they did a lot of pontificating, mostly on point, but definitely were the center of attention. At some point someone (else) argued that software stuff was more flexible, allowed for more differentiation, more tricks and specialization, etc. (These days, of course, it's all software - just a question of which processor in your PC is best for the task at hand ;P)

The Pixar guys went into a big long diatribe about physical correctness, the importance of research and specs and physics and light - blah blah - and, in support, some guy from
ART started describing how they had followed the Pixar spec and implemented a raytracer in HW, how procedural graphics could be HW accelerated, etc. - kind of a Turing-completeness argument for 3D.

Poor guy.

If there's one thing engineers probably dislike more than being disagreed with, it's being agreed with :)

So one of the Pixar guys - no names required; truthfully, we've all been there; stick with me - you'll see... so one of the Pixar guys then proceeds to just dismember the guy: they've learned so much over the last 10 years, graphics has come so far, the spec is stale, what were they thinking? (Kinda just generally missing the point - which was NOT about specs IMHO but programmability - but that's ALSO neither here nor there.)

This continued for a long enough period that I still recall the increasing embarrassment we all felt for the ART developer...

And then, from the back of the room, a somewhat quiet, tad bit nebbishy guy stands up (who turns out to be
Dan Ward - sorry Dan :)) - and says the following:

"If I understand correctly, what you're saying is that over time things change and progress - we all learn a lot and grow, but the one thing that stayed constant is that you're still right."

HAH - point. The room exploded - it was fun....

That's kinda what all the Firefox arguments remind me of... yes, it's nice to have access to the code; yes, it's a nicer engine in a lot of ways, but the question is: is it really better in ways that still matter?

I'll dive into the questions (security especially) in the near future, because I know what a lot of my tech brethren think, but my main point was this -
oy and yeesh - relax with the rhetoric already :)

All that aside - I am glad it exists, and that we (royal AOL we here) helped pay for it; keeps
everybody else honest.


Kilroy Trout said...

I wonder if Pixar’s obsession with the “physical correctness” of their rendering model explains the blurred, over-lit nature of much of their material. Things that are supposed to be plastic (e.g., “Buzz Lightyear”) look plastic and things that aren’t look…uhhh kinda plastic. RenderMan is all about convolving the hell out of every pixel, minimizing temporal aliasing and noise. It needs precisely *more* of this to be compelling – where is the grit and grime in 3D rendering?

I’ve read it takes six hours for Pixar to render a frame, which raises the question, as you suggest, of programmability but also of paradigm. With so many calculations per pixel, you might want to examine your assumptions.

…but then I’m just pontificating.

Sree Kotay said...

kilroy, there's another great pixar rule they call the "10 minute rendering rule" - basically, no matter how much they optimize or how much faster computers get, a single frame always takes 10 minutes: because the artists do MORE with the power, not FASTER.

(actually, I can't remember what the actual render time was, but it's not important)

Anonymous said...

having worked at pixar for years, i can unfortunately confirm that it has its share of pontificating egos, who unfortunately like to bask in the glow of association and play the part of hotshot on stage. but there's also plenty of us just trying to learn and do stuff.

to be honest, computer graphics is a total hack-fest. physically-based models are only important in as much as they give good-looking results. there's tons of deliberately un-realistic stuff going on in every frame, all in the service of giving the director the look he wants. it's a lot easier to achieve a look by tossing in physically-implausible overrides, color gradients, etc. than by tweaking inputs to a simulation.

as for everything looking plastic, stuff like "final fantasy: the movie" certainly shows that CG can achieve a gritty look. but pixar is looking to make cartoons, not blade runner.

the "10 minute rule" is more often just "we eat our gains" -- and it's the same phenomenon as goes for all software. computers/etc. get faster, so we have more breathing room to use abstractions, indirection, extra detail, etc... and end up back near the pain threshold.

Sree Kotay said...

Hey, yeah - every company's got their egos - but Pixar's earned more than enough cred that it's OK :)

Ironically, I don't think the views expressed were wrong - to your point, "physically correct" rendering is just a different style of hack.

Still, style counts :)
