The Lion King

How do you categorize Disney’s 2019 feature film The Lion King? The African savannahs look photographed. The lions, wildebeests, meerkats, hyenas and other animals look like they stepped out of a BBC documentary. It has a live-action look and feel. And yet, these animals talk. And they and their environments are CG.
 
Disney’s wildly popular and highly acclaimed 1994 animated feature The Lion King spawned a Tony award-winning musical stage adaptation, theme park attractions, sequels, spin-offs and video games. But to call this year’s adaptation a “CGI remake,” although technically accurate, oversimplifies the filmmakers’ accomplishment.
 
 
One thing is certain: The Lion King 2019 would not have been possible just a few years ago. Disney’s 2016 The Jungle Book came close, but Jungle Book and Disney’s other remakes that followed starred live-action actors. There are no live-action actors in The Lion King. With the exception of lighting information captured from the real world and photoscans used for some environmental models, there are no real-world elements in the film. Modelers referenced photography of the animals in their natural environments as they created the CG characters, but they didn’t scan animals, and no animals were motion-captured. The film was created by hand, by visual effects artists.
 
Visual effects supervisor Rob Legato, animation director Andrew R. Jones, and MPC visual effects supervisor Adam Valdez, who received Oscars for The Jungle Book, joined Jungle Book director Jon Favreau for this film. Also returning for The Lion King were visual effects supervisor Elliot Newman (who was a CG supervisor on The Jungle Book), environment supervisor Audrey Ferrara, and many other artists on MPC’s crew.
 
“The work all happened under the MPC umbrella,” Legato says. Around 1,250 people at MPC worked on post production in London and Bangalore. Of those, 130 animators from 30 countries created the animals.  
 
As with The Jungle Book, the crew used virtual production tools and techniques to previsualize the film, but stepped up the game for The Lion King by moving into virtual reality. For this film, the MPC virtual production team joined forces with Magnopus in Los Angeles to design and implement a new generation of on-set tools.

“The Jungle Book was Version 1.0 of this approach to filmmaking,” Legato says. “We skipped over 2.0. The Lion King is Version 3.0. We previs’d the entire movie in VR and then shot it in VR as if it were a live-action film.”
 
 
Virtual Filmmaking 
 
The Unity-powered tools allowed several people to be in one virtual world together. 
 
MPC and Magnopus artists and programmers viewed the world through Oculus headsets. Legato, Favreau, Valdez, James Chinlund (production designer), Caleb Deschanel (cinematographer), and sometimes others could view a VR environment simultaneously through HTC Vives.
 
“It was a little weird,” Legato says. “You could see someone walking 100 yards away in the environment, but they were standing five feet away. You could click on their monitor and instantly transport to their location and explore. If I wanted everyone to see what I saw, they could look through my camera from my vantage point.” 
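
Neither MPC nor Magnopus has published the internals of those tools, but the session state they imply can be sketched simply. The following Python sketch (all names hypothetical, with networking and rendering omitted) shows the two interactions Legato describes: teleporting to another user's location and sharing one camera's vantage point.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the shared-session state behind a multi-user scout.
# The real tools ran on Unity with networked headsets; everything here is
# local and simplified, and all names are invented.

@dataclass
class Transform:
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)   # Euler angles in degrees

@dataclass
class User:
    name: str
    transform: Transform = field(default_factory=Transform)

class SharedSession:
    def __init__(self):
        self.users = {}

    def join(self, name):
        self.users[name] = User(name)

    def teleport_to(self, traveler, host):
        """Jump one user to another's location, as when clicking a monitor."""
        self.users[traveler].transform = self.users[host].transform

    def camera_vantage(self, owner):
        """The vantage everyone sees when looking through one user's camera."""
        return self.users[owner].transform

session = SharedSession()
session.join("dp")
session.join("director")
session.users["dp"].transform = Transform((100.0, 0.0, 0.0), (0.0, 90.0, 0.0))
session.teleport_to("director", "dp")  # the director is instantly at the DP
print(session.camera_vantage("dp"))
```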
 
This was true whether they were scouting locations, blocking animation or shooting the film in VR. Jones provides an example of how that worked for the opening Circle of Life sequence. Scar, the villainous lion voiced by Chiwetel Ejiofor, is in a cave when we first meet him.
 
 
“Mufasa gives him a little flak,” says Jones, referring to Scar’s brother, the king of the Pride Lands, voiced by James Earl Jones. “At one point, we had Scar exiting through a back entrance of the cave, but Jon [Favreau] thought it might be better to have him come out the front. Jon made that decision while looking at the scene in VR.” So, still in VR, Jones moved the lion into the new position, and Deschanel framed a new shot.
 
To make that possible, Ferrara’s team modeled and rendered digital sets at a resolution appropriate for realtime interaction. The sets provided locations to scout and surfaces on which animators could place animals. A group of animators under Jones’ direction created previs-quality animation for the VR shoot. Actors voicing the characters were recorded. 
 
“Basically, we do a location scout, and once we figure out where things go, it starts a whole series of things,” Legato says. “We might send the location back to the art department to make a bigger hill, and meantime, the animators are creating the action where it will play. Then, the scenes come back to the camera stage to be photographed. We have recorded tracks of the dialogue, but sometimes use temp tracks — it’s free-flowing. As we’re shooting, Jon might be recording the actors. It’s like we’re creating a stage play and then we shoot it. We add the camera moves and do all the things we’d do for a live-action movie. We shoot assets and animation as if it were a live movie. What we’re photographing is not photoreal, but it feels so because we can walk around, take a step.”
 
In addition to powering a 360-degree view of the location with animated animals, the game engine could render shadows and the reflectance of bounce light. Deschanel could move lights within the scene. 
 
Africa
 
Before any sets were built or animals animated, in March 2017, supervisors Ferrara, Jones, Legato, and Valdez, along with production designer Chinlund and director of photography Deschanel, spent two weeks in Kenya.
 
 
“It was an incredible opportunity to see filmmakers in their element,” Ferrara says. “To see what type of things they liked. When we returned, James [Chinlund] made a selection of locations he liked, and I put together an itinerary with an MPC team to photograph those locations.” 
 
As a result, environment artists had a library of more than 240,000 photographs and 7,000 videos to reference for the environments. Animators had a total of 42 hours and 53 minutes of reference footage: material captured in the Serengeti, footage from a trip to Disney’s Animal Kingdom in Orlando, FL, and clips from nature documentaries.
 
“We built 66 digital sets, which is fewer than Jungle Book, but they were so much bigger,” Ferrara says. “We have miles and miles and miles of vista for the savannah, and we have no matte paintings. When you have only a few trees, you can tell if you’re duplicating trees. So, every tree is CG, even if it’s 150 kilometers in the distance. And then we have the opposite. When the camera is focused on a lion cub’s tail, we’re at grass level with pebbles and details on the ground.”
 
MPC’s R&D department created new tools to manage and optimize the enormous amount of geometry, add weathering to rocks, and help create landscapes procedurally. The sets covered 150 square kilometers.
 
“But, I have to say that we did a lot manually,” Ferrara says, “especially the big features, the trees and rocks.”
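
Neither the procedural tools nor the split between them and the manual work is detailed in the article, but Ferrara's point about duplicated trees suggests the standard remedy: instance a handful of source models and randomize every placement. A minimal Python sketch, with invented asset names and illustrative numbers:

```python
import random

# Illustrative sketch: scatter instances of a few source tree models across
# a big tile, randomizing position, rotation, and scale per instance so that
# reused geometry doesn't read as duplication, even at a distance.

TREE_MODELS = ["acacia_a", "acacia_b", "baobab_a"]  # invented asset names

def scatter_trees(count, extent_m, seed=0):
    rng = random.Random(seed)  # seeded so the layout is reproducible
    return [{
        "model": rng.choice(TREE_MODELS),
        "position": (rng.uniform(0, extent_m), rng.uniform(0, extent_m)),
        "rotation_deg": rng.uniform(0.0, 360.0),  # break up the silhouettes
        "scale": rng.uniform(0.8, 1.2),           # vary apparent size
    } for _ in range(count)]

# A sparse savannah layer over a 10 km x 10 km tile:
layout = scatter_trees(count=5000, extent_m=10_000)
print(layout[0])
```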

During pre-production and production, Ferrara moved from London to Los Angeles to supervise work on the game-engine-ready sets built for the previs and the VR shoot. Once Favreau approved the shots, the low-res versions moved to London to be up-res’d.
 
While Valdez and Ferrara were in Los Angeles, Newman remained in London to receive the packages of virtual production scenes — Autodesk Maya scenes, virtual production assets, animation, characters and the previs sets — and translate the previs into production assets and rigs.
 
 
“We’d get simplistic scenes designed to move quickly,” Newman says. “They were effectively like previs scenes. But, instead of getting third-party QuickTimes to match, we had real 3D data. We could track the assets — what species of tree, which character was in the shot. We’d know that in this shot there were 20 wildebeests. We learned from Jungle Book how important it is to track changes. We had that information shot by shot. In this shot, the light and trees are here; in that shot in the same location, the light is over there. We had tools to track that.” 
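
Newman doesn't describe the tracking system itself; conceptually, though, each turned-over shot amounts to a queryable manifest of its 3D contents. A hypothetical Python sketch:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a per-shot manifest: because turnovers were real 3D
# data rather than reference QuickTimes, each shot can record exactly which
# assets, characters, and lights it contains. All names are invented.

@dataclass
class ShotManifest:
    shot_id: str
    characters: dict = field(default_factory=dict)   # species -> count
    trees: list = field(default_factory=list)        # species in frame
    light_position: tuple = (0.0, 0.0, 0.0)

shots = [
    ShotManifest("sq010_sh0040",
                 characters={"wildebeest": 20, "simba": 1},
                 trees=["acacia_a", "acacia_b"],
                 light_position=(1200.0, 800.0, -300.0)),
]

# Queries like "which shots contain wildebeests?" become trivial.
for shot in shots:
    if "wildebeest" in shot.characters:
        print(shot.shot_id, shot.characters["wildebeest"])
```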
 
Ferrara adds, “We could capture everything. Not just the sets and animation, but the camera and lighting, and each filmmaker’s input. This was all in the package shipped to London. We didn’t have to search for how many meters it was from here to there, or where the animals would walk. All those questions were resolved before the sets arrived in London for post production. Being the only vendor on the show really paid off.”
 
Previs Animation
 
Animation moved through the process in a similar way: from rough blocking with low-res characters in VR, to more refined movements shot in VR that led to approved edits, and then to post production for final animation and high-resolution rendering with muscle and hair simulation.
 
The process began with Jones and a team of previs animators, who created a previs-level pass based on storyboards from the story department, then a rough blocking pass, and finally Favreau-approved animation ready for the camera stage.
 
“Usually we give previs animators individual shots,” Jones says, “but for this process they had a chunk of work — 30 seconds to a minute of animation that represented the flow of the scene so Caleb [Deschanel] and Rob [Legato] could put the camera anywhere. Some were even longer — four minutes. It was pretty crazy.”
 
Fifteen animators worked on previs in MPC’s London Studio, and another eight to 10 were in Los Angeles. Often the scenes were readied in London and adjusted in LA.
 
 
“Our process was similar to that of a live-action movie where you have actors on stage delivering dialogue to block a scene, then you set the camera position,” Jones explains. “We would block the scene so any camera position would work, then we could tweak it.”
 
The animators gave the animals walk cycles so VR viewers could move them with joysticks. But the rocky terrain often required separate animation.
 
“Jumping down off a rock was too awkward,” Jones says. “It would detract from the storytelling if you could feel the game engine.”
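
The mechanics behind that joystick control aren't spelled out in the article, but the usual approach is to advance a looping walk cycle in proportion to the distance traveled, so the feet stay planted instead of sliding. A toy Python sketch with assumed numbers:

```python
# Toy sketch of joystick-driven locomotion: the walk-cycle phase advances in
# proportion to distance traveled, so feet stay planted instead of sliding.
# All numbers are assumptions for illustration.

CYCLE_LENGTH_M = 2.4   # ground distance one authored walk cycle covers
CYCLE_FRAMES = 32      # frames in the walk-cycle loop

def update_walk(phase, speed_mps, dt):
    """Advance the cycle phase [0, 1) by the distance moved this tick."""
    phase = (phase + speed_mps * dt / CYCLE_LENGTH_M) % 1.0
    return phase, phase * CYCLE_FRAMES  # frame of the loop to play back

phase = 0.0
for _ in range(5):  # a lion walking at 1.5 m/s, updated 24 times a second
    phase, frame = update_walk(phase, speed_mps=1.5, dt=1 / 24)
    print(f"phase={phase:.3f} frame={frame:.1f}")
```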
 
Previs animators working by the stage could react quickly to notes and make adjustments. If Favreau wanted something different after the edit, the crew could pull up the files for a reshoot in VR.
 
“We’d wait to turn over sequences until we had a complete edit that Jon signed off on, all packaged up with the 3D data,” Jones says.
 
Creating the Performances
 
Jones had already faced down the issue of lip sync on realistic animals for Jungle Book, and many on the crew also knew Favreau’s preferences from working on the previous film. 
 
“Obviously, we don’t want the animals’ lips to flap,” Jones says. “Jon wanted the lips to never move unnaturally. He didn’t want to see phonemes and pucker shapes that lions could not do. He said it’s better to have people look at the eyes and take in the performance than to have them stare at distracting lips.”
 
That meant the animators had to create a dialogue performance without using mouth shapes. 
 
“We discovered we could do a lot with articulation,” Jones says. “If you get the muzzle right and the corners of the mouth moving at the right time, you buy that it’s real. Some actors don’t even move their lips much.”
 
Two animation supervisors, Gabriele Zucchelli and Stephen Enticott, and six lead animators working under them, each managing a seven-animator team, created the performances in London, Bangalore, and LA.  Each lead was assigned one or two main characters to help develop the rig and walk cycles that animators would use as a guide. 
 
For reference, the animators had live-action footage shot in Kenya and at Animal Kingdom. They also filmed some voice actors delivering dialogue. But not for lip sync.
 
 
“We wanted to see the spontaneity between Seth Rogen [Pumbaa] and Billy Eichner [Timon],” Jones says. “We didn’t block the movement, but we could find the feelings between the warthog and meerkat. We could see moments of eye contact, and the comedic timing between them.”
 
More generally, animators would look at the actors delivering dialogue and then try to find clips with animals having the same attitude. 
 
“We even looked at slow motion,” Jones says. “Jon wanted his film to feel as real as a BBC documentary, but with animals that talk. A lot of times, the BBC doesn’t shoot in realtime. They often shoot animals walking slower, and we get used to that. It gives the lions weight and presence. And, that was also especially important for the hyenas. They’re constantly looking around, ready to dart at a moment’s notice, but we needed the opposite for most of them. We needed the queen to feel strong. We also slowed down Rafiki, the mandrill. Mandrills are small and tend to move fast.” 
 
New Tools
 
In addition to new tools developed for the virtual production, the technical teams at MPC created and improved tools for post production — in particular, a hair shader. 
 
“We looked at the available fur shaders but decided to develop our own using the new technology in RenderMan,” Newman says. “We researched the color of hair, making sure artists stayed within the realm of natural pigmentation for the animals. We tried to understand from a microscopic level how the structure of hair strands varies for different species of animals. We knew Jon wanted to push the bar with everything.”
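
Newman doesn't reveal the shader's internals. One published way to keep hair color "within the realm of natural pigmentation" is to expose melanin concentrations instead of raw RGB, as in the Chiang et al. hair model implemented in PBRT; the sketch below uses that model's coefficients and is not a description of MPC's shader.

```python
# Sketch of constraining hair color to natural pigmentation: artists dial
# eumelanin (brown-black) and pheomelanin (red-yellow) concentrations, and
# the shader derives RGB absorption coefficients from them. The constants
# follow the Chiang et al. hair model as implemented in PBRT; this is the
# published approach, not a description of MPC's proprietary shader.

EUMELANIN_SIGMA_A = (0.419, 0.697, 1.37)
PHEOMELANIN_SIGMA_A = (0.187, 0.4, 1.05)

def hair_absorption(eumelanin, pheomelanin):
    """RGB absorption per unit path length inside the hair fiber."""
    return tuple(
        eumelanin * e + pheomelanin * p
        for e, p in zip(EUMELANIN_SIGMA_A, PHEOMELANIN_SIGMA_A)
    )

print(hair_absorption(eumelanin=8.0, pheomelanin=0.0))  # near-black mane
print(hair_absorption(eumelanin=0.3, pheomelanin=0.2))  # pale tawny coat
```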
 
The technical teams gave the animators faster rigs with “limb lock” to make elbows rigid and give shoulders more play. They also evolved the muscle simulation tools for better skin sliding and to include collisions with bones. 
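
The article gives no implementation detail for limb lock; one way to picture it is as two-bone IK in which the elbow angle can be pinned so the shoulder absorbs the motion. A toy 2D Python sketch, purely illustrative:

```python
import math

# Toy 2D sketch of the "limb lock" idea: in normal two-bone IK the elbow
# bends to reach a target; with the lock on, the elbow angle is pinned and
# the shoulder aims the limb instead (sacrificing exact reach). Purely
# illustrative, not a description of MPC's rig.

UPPER, LOWER = 1.0, 1.0  # bone lengths

def solve_limb(tx, ty, lock_elbow_deg=None):
    d = math.hypot(tx, ty)
    aim = math.atan2(ty, tx)
    if lock_elbow_deg is None:
        d = min(d, UPPER + LOWER - 1e-6)  # clamp unreachable targets
        cos_e = (UPPER**2 + LOWER**2 - d**2) / (2 * UPPER * LOWER)
        elbow = math.acos(max(-1.0, min(1.0, cos_e)))
        cos_s = (UPPER**2 + d**2 - LOWER**2) / (2 * UPPER * d)
        shoulder = aim + math.acos(max(-1.0, min(1.0, cos_s)))
    else:
        elbow = math.radians(lock_elbow_deg)  # rigid elbow
        shoulder = aim                        # shoulder does all the work
    return math.degrees(shoulder), math.degrees(elbow)

print(solve_limb(1.5, 0.5))                      # elbow bends to reach
print(solve_limb(1.5, 0.5, lock_elbow_deg=170))  # elbow pinned
```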

The animators performed 63 different species of animals with 365 unique variations. Seventeen were hero characters. The animals interact in intimate moments and stumble, fall and swerve over each other in stampedes. A crowd system developed in SideFX’s Houdini helped the team lay out the background action.
 
“The first thing we tackled was a 15-minute sequence of Simba being chased through a canyon,” Jones says. “Getting the wildebeests to feel right was difficult. We probably had 300 or 400 in one shot. We didn’t want only run cycles, so we built an extensive library of turning, falling, stumbling movements that we could stitch together. The crowd system helped with the background, but animators took over the foreground. The canyon is rocky and has ledges, so the animals had to catch their weight as they stumbled over rocks and other beasts.”
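
Jones doesn't say how clips were chosen, but a stitchable clip library implies a compatibility graph: each motion knows which motions can follow it without a pop. A simplified Python sketch with invented clip names:

```python
import random

# Simplified sketch of stitching crowd clips: each motion ends in a state
# that constrains which clips may follow it, so a background wildebeest can
# run, turn, stumble, fall, and recover without pops. Clip names invented.

TRANSITIONS = {
    "run":        ["run", "turn_left", "turn_right", "stumble"],
    "turn_left":  ["run"],
    "turn_right": ["run"],
    "stumble":    ["recover", "fall"],
    "fall":       ["get_up"],
    "get_up":     ["run"],
    "recover":    ["run"],
}

def stitch(start, length, seed=0):
    """Chain together a legal sequence of motion clips for one agent."""
    rng = random.Random(seed)  # seeded so a crowd layout is reproducible
    clips = [start]
    while len(clips) < length:
        clips.append(rng.choice(TRANSITIONS[clips[-1]]))
    return clips

print(" -> ".join(stitch("run", 10)))
```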
 
To light the skies over the vast savannahs, the crew used the sun, something they couldn’t do with Jungle Book, which was a stage-based shoot. 
 
“For this one, we decided to be true to nature and capture real sun values and real shadows, so we could re-create that lighting in the rendering,” Newman says. “But, the light also had to be pleasing, so we worked a lot with Rob [Legato] and Caleb [Deschanel]. Rob might accent characters with the right type of light or put a tree in a shot to add the right shadow.”
 
Newman gives an example of a scene in which adult Simba confronts Scar, and a storm starts brewing.
 
“We developed lights to add shape lighting to the characters, and we put a texture of clouds on the light source so it didn’t look stage-lit,” he says. “But, everything we did was grounded. We always made sure we weren’t using something too glorious; that every sky wasn’t captured at a perfect angle. On a live-action shoot, you might run out of light, but you have to keep going. We tried to create those imperfections. The biggest challenge was managing the rendering. I calculated that if one machine rendered this movie, it would take centuries. So, the logistics of rendering the show — which shots to render first, when to load them onto the farm — was a big one for us. Every frame was so complex. We were doing every single shot full CG.”
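
Newman doesn't share his arithmetic, but the order of magnitude is easy to reproduce with assumed figures, as in this back-of-envelope Python check:

```python
# Back-of-envelope check of the "centuries on one machine" claim. Every
# number here is an assumption for illustration; the article gives none.

RUNTIME_MIN = 118      # approximate feature length in minutes (assumed)
FPS = 24               # film frame rate
HOURS_PER_FRAME = 30   # assumed single-machine render cost per frame

frames = RUNTIME_MIN * 60 * FPS                  # ~170,000 frames
total_hours = frames * HOURS_PER_FRAME
years = total_hours / (24 * 365)
print(f"{frames:,} frames -> {total_hours:,} hours = about {years:,.0f} years")
```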
 
 
Looking Toward the Future
 
“This is a project of firsts,” Ferrara says. “The first time shooting every shot in VR. The first photoreal computer graphics to the degree that Jon wanted — to have everything look like nature and real animals.” 
 
It is, in fact, the first photorealistic CG feature filmed to look like a live-action nature film. 
 
“It improved everything,” Legato says of the ability for the filmmakers to shoot CG characters and environments as if they were live-action scenes. “The photorealistic depiction of Lion King feels like a live-action movie because we used every tool of live action. We were luckier, though. We could move the sun and create a live-action feel. We didn’t have to wait for the right time to shoot. It’s magical. It’s the contribution of everyone that makes the whole feel correct. The technology allows us to further the art form.”
 
Barbara Robertson (BarbaraRR@comcast.net) is an award-winning writer and a contributing editor for CGW, Post’s sister publication.