With the recent news that Pixar is to revisit some of its best-loved franchises, announcing sequels to The Incredibles and Cars, it’s fair to say that animated movies are still very much big business in Hollywood.
To create these films and bring the characters to life takes a mountain of hardware and software that’s changing at an incredible pace – something TechRadar found out when we spoke to Greg Estes, Nvidia’s vice-president of marketing for the Professional Visualisation and Design business.
Nvidia has been collaborating with Pixar for a number of years now. Recently, while making Monsters University, the animation studio used Nvidia's 12GB Quadro K6000 card ahead of its launch, as it offered the power needed to run Nvidia's OptiX real-time raytracing engine.
Ray of light
Raytracing is a technique that simulates how light travels through a scene, following individual rays as they bounce between objects to work out how each surface should be lit.
Pixar upped the raytracing game with Monsters University by pairing the lighting application Katana with its own RenderMan technology, creating a high-end rendering and lighting solution.
This package also made use of Nvidia's OptiX raytracing engine, with Estes noting: “Pixar worked with an interactive lighting tool in their pipeline so they could take a scene and instead of tricking the system with artificial lights, they could do interactive lighting in real time within the tool. It is just stunning.”
YouTube: https://www.youtube.com/watch?v=LACmRpMYOak
But as Estes explained, Pixar and other effects companies are always looking to improve on what they have made and raise the bar for the films that follow. One area of animation earmarked for improvement is the combination of character movement, lighting and wind, and what this does to the look of a character’s hair and clothing.
“This can’t be done today but the direction of where things are going is that animators want to have within a simulation the wind blowing and the light on,” said Estes.
“They want the wind to come in and the character to move and simulate the hair or the fur at the same time they are doing the rendering and the lighting.”
While the technology isn’t quite there at the moment, Estes predicts that it really isn’t that far away.
“It isn’t all possible in real-time today but that is where they want to get to. The artist wants to have someone walk through a scene and they can bump into somebody else and animate the movement and then see exactly how it is going to come out.”
As for the future (or the future’s future), it seems Pixar and its ilk won’t ever be content.
“We are getting very close to this but as soon as we can do this with one character, they are going to want to be able to use it for three characters,” laughs Estes.
“There is kind of no end.”
If you want to learn more about what Pixar and Nvidia are up to, they will both be at the GPU Technology Conference from 25 March.