'The Lion King' Set Visit: How Jon Favreau Reimagined A Disney Classic With Cutting Edge Tech
It's December 7, 2017, and I'm standing in the African savannah. Pride Rock looms in the distance. A single tree stretches toward the sky, which has an almost otherworldly orange hue. A massive rhinoceros suddenly rumbles by, just a few feet to my right. I look around for a while, taking it all in: I'm on the set of Disney's The Lion King.
But then it's time to move on, so I remove my virtual reality headset and step back into the real world. I'm standing alongside a few other movie journalists in an unmarked complex in Playa Vista, California, also known as "Silicon Beach." The offices of tech companies like Google, Facebook, Instagram, and Electronic Arts are close by, and while this particular building looks bland and uninteresting from the outside, one of 2019's biggest blockbusters is coming together inside. There's enough space in this single building for an art department, editors, designers, animators, screening rooms, and a virtual reality stage where director Jon Favreau can step into the world of his movie during production. It was Favreau's idea to renovate this facility in order to have as much of his team as possible under the same roof, upending the traditional filmmaking workflow and opening things up for an unprecedented degree of communication and collaboration.
During our Lion King set visit, we learned about how Favreau utilized virtual reality to make his movie, how his version of the film is different from the animated classic, and tons more. Read on to discover how The Lion King was made.
We began our visit by sitting down with a group of the film's producers, who told us how the filmmaking team began researching this project. To immerse themselves in the locations, the team took a three-week trip to Kenya, where they catalogued the "movement of an animal, the way the animal walks, stretches, blinks – it's backed up by extensive visual research," said producer Jeffrey Silver. That research continued at Disney's Animal Kingdom theme park at Walt Disney World, where the filmmakers could take even more photographs of the creatures that will appear in this movie. By the end, they literally had one million photos as reference material.
Once the research was done, the hard work began. This iteration of The Lion King was created in a groundbreaking new way. Executive producer Tom Peitzman (Mission: Impossible – Ghost Protocol) gave us a brief walkthrough of the entire filming process from beginning to end, explaining how everything begins with the script, which is translated into storyboards, which are then cut into animatics (animated storyboards). From there, they go to production designer James Chinlund, who creates detailed concept art based on the filmmakers' research and passes it to artists in the virtual art department, who begin to construct the sets in virtual reality.
Meanwhile, animators are developing another iteration of the animatics showing a more detailed progression of the characters' movements, and those animatics are incorporated into the virtual sets. That's when people like Favreau, director of photography Caleb Deschanel, and visual effects supervisor Rob Legato put on VR headsets and begin scouting for the shots and angles they'll need. Once Deschanel captures those shots and the editorial team has cut them all together, the result is sent to the visual effects vendor (in this case, a company called MPC) to add all of the intricate details we'll see in the final cut of the movie.
How The Lion King Was Filmed
The movie is filmed in a way Silver describes as "metaphorically live-action," shot on what's called a "volume" – a motion-capture-equipped sound stage with black curtains lining the walls and sensors and cameras stationed at strategic points throughout. (To see what a volume looks like, check out this video of how Thanos came to life in the two most recent Avengers films.) On The Lion King set, there are physical dolly tracks, an actual crane, and even a handheld Steadicam rig – but there aren't any film cameras attached to them. Instead, there are just sensors connected to a virtual world, and inside that world, a virtual camera matches any movements made in the real world. Director of photography Caleb Deschanel and his crew, including a focus puller and a dolly grip, perform physical actions in the volume that translate into camera moves in the virtual world.
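The production's actual tracking software isn't public, but the core relationship is straightforward: the volume's sensors report where each physical rig is, and a virtual camera inside the game engine mirrors that pose every frame. Here is a minimal Python sketch of that idea; the class names, the tracking sample, and the stage-to-world offset are assumptions for illustration, not details from the film's pipeline.

```python
# Minimal sketch: a tracked physical rig drives a virtual camera one-to-one.
# All names and numbers here are illustrative, not from the production.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple   # (x, y, z) in meters, in stage coordinates
    rotation: tuple   # (pitch, yaw, roll) in degrees

class VirtualCamera:
    """Stand-in for the in-engine camera that mirrors the physical rig."""
    def __init__(self):
        self.pose = Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

    def apply_tracked_rig(self, rig: Pose, stage_origin_in_world: tuple):
        # The stage origin is pinned to a spot in the virtual set, so every
        # move of the crane or Steadicam is replayed directly on the camera.
        ox, oy, oz = stage_origin_in_world
        px, py, pz = rig.position
        self.pose = Pose((px + ox, py + oy, pz + oz), rig.rotation)

# Each frame: read the latest sensor sample and mirror it onto the virtual camera.
camera = VirtualCamera()
steadicam_sample = Pose((1.2, 1.6, -0.4), (0.0, 35.0, 0.0))
camera.apply_tracked_rig(steadicam_sample, stage_origin_in_world=(500.0, 0.0, 250.0))
print(camera.pose)
```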
The crane and Steadicam rig aren't tethered to anything, so they can be moved around within the volume as needed. But there are fifteen feet of dolly track (metal tubes on which a wheel-mounted camera can roll, giving the effect of a smooth tracking shot) physically attached to the floor. AJ Sciutto, a virtual production producer who works for a visual effects company called Magnopus, explained how it works:
"That fifteen feet of encoded data represents however long the dolly track in the shot is. It can be a fifteen foot dolly track in VR, and therefore one foot equals one foot. It can be a hundred and fifty feet in VR that curves over a mountain and has a loop in it, and one foot would equal ten feet in the virtual space. So as the dolly grip is pushing the dolly, he sees a user interface on his screen that shows him marks. So he can have marks on an iPad that say, 'I want to be here by this time,' so he can hit his marks [for each shot]."
Ben Grossmann, the movie's virtual production supervisor (and Oscar-winner for his VFX work on Martin Scorsese's Hugo), condensed the whole filmmaking process into very simple terms:
"If you go back to Avatar, Avatar solved the problem of how do you film a movie that usually gets created with computer graphics in a computer? And so we put computer graphics into the cinematographer's monitor so that they could use more traditional equipment to see the movie. Fast forward to Lion King, and what we're doing is, we're putting the filmmakers inside the monitor. So now, they can put on a VR headset and be in Africa or on the Empire State Building or on the surface of the moon, so that they can walk around and see and feel the filmmaking process with all the equipment as though they were there."
The virtual sets, which are digitally constructed using a video game engine called Unity, are all built to be explored. "We build kilometers out" for each set, Silver told us. "But you can scale the detail that's available in any one moment to what you need. So it's dynamic. And if you're filming a character going after the bugs in the grass – which happens, right? – you don't need what's a mile out. Even though it's there, technically, but we can turn it on and off." Since processing the sheer amount of data to accurately represent the film's entire map in each shot would be practically impossible, the filmmakers load the area needed for a specific shot and then create a digital cyclorama that serves as a representation of what's beyond – like a digital matte painting that extends the scene into the distance. Once the shots are approved in this form and sent off to the visual effects vendor, they have far more processing power available to them and can then insert the necessary details into the background of each specific shot.
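That "turn it on and off" behavior amounts to distance-based culling: set pieces within a shot's working radius stay loaded at full detail, and everything beyond it is stood in for by the cyclorama until the visual effects vendor rebuilds it in post. A toy Python sketch of the idea, with made-up names, coordinates, and radius:

```python
# Toy distance-based culling: keep full-detail geometry near the shot, hand
# everything else off to the backdrop. All values are invented for illustration.
import math
from dataclasses import dataclass

@dataclass
class SetPiece:
    name: str
    position: tuple   # (x, y) in meters on the world map
    visible: bool = True

def cull_for_shot(pieces, shot_center, working_radius_m):
    """Split the set into near (full detail) and far (cyclorama) geometry."""
    near, far = [], []
    for piece in pieces:
        dx = piece.position[0] - shot_center[0]
        dy = piece.position[1] - shot_center[1]
        piece.visible = math.hypot(dx, dy) <= working_radius_m
        (near if piece.visible else far).append(piece)
    return near, far

world = [
    SetPiece("grass_patch", (2.0, 3.0)),
    SetPiece("acacia_tree", (40.0, -12.0)),
    SetPiece("pride_rock", (1800.0, 950.0)),
]
near, far = cull_for_shot(world, shot_center=(0.0, 0.0), working_radius_m=100.0)
print([p.name for p in near])   # grass_patch, acacia_tree
print([p.name for p in far])    # pride_rock -- represented by the cyclorama
```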
Production designer James Chinlund told us that he went to great lengths to make sure this digital world felt rooted in reality, and a big part of that involved making sure the geography was consistent throughout the movie, regardless of where the characters were.
"My ambition was to build a world that was entirely cohesive so that at any given moment, the audience is gonna feel like they know where they are," he explained. "They're in true geography. When you're at Rafiki's tree, you can see Pride Rock in the distance. You know that that's gonna be a true relationship that you can count on. So basically we built the world map to start, so we place the cloud forest and the elephant graveyard and all these things on the map and started to build a piece of topography that would contain them, and then just up res-ed and up res-ed that world as we went...We built several hundred square miles of scenery for this movie, and I'm proud to say wherever you go in the map, there's something to see, and it all relates to itself. If you flew up when you're in the canyon, you'll be able to see Pride Rock and Mount Kenya and those relationships stay true throughout the film."
The Benefits (and Drawbacks) of Working in Virtual Reality
Executive producer Tom Peitzman explained that one of the main benefits of actually entering a virtual space is that problems relating to distance tend to fall away because the filmmakers can get in there and really get a sense of how far away things are. "[On a flat monitor], it's really hard to grasp how far it is in that painting, where the animal is sitting on the cliff, how far is it down to the floor? Where, if you can actually be standing on that space, you can see that distance. And I think that's one of the things that really gives you that sense of reality when you know, 'Oh, it's gonna take me – if I do an actual walk from this point to this point, that's gonna be like four minutes. That's way too long. I need that thing to be closer, because we wanna do it in a minute and a half,' or whatever."
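Peitzman's four-minute walk is easy to judge once you're standing in the space, but it boils down to simple arithmetic: distance divided by walking pace. A back-of-the-envelope version, using an assumed walking speed rather than any figure from the production:

```python
# Back-of-the-envelope walk timing; the pace is an assumed figure.
WALK_SPEED_M_PER_S = 1.4   # roughly a relaxed walking pace

def walk_time_minutes(distance_m: float) -> float:
    return distance_m / WALK_SPEED_M_PER_S / 60

# A roughly 340 m stretch takes about four minutes at that pace...
print(round(walk_time_minutes(340), 1))        # ~4.0
# ...so hitting a ninety-second beat means pulling the destination in to ~125 m.
print(round(1.5 * 60 * WALK_SPEED_M_PER_S))    # 126
```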
According to producer Jeffrey Silver, being able to don the headset levels the playing field and gives the production an analog feel, even though it's totally digital. "The whole revolution of this type of filmmaking is that it's realtime visualization, so you can relate to it as a plain old analog filmmaker," he said. "You don't have to be a specialist in technology. You can be Caleb Deschanel, who shot Fly Away Home and Black Stallion, and just relate to it as a nature film." Later, visual effects supervisor Rob Legato told me that practically every shot in the movie is collaboratively scouted in virtual reality by a combination of him, Favreau, Deschanel, and a few other key players like the animation supervisor or MPC's visual effects supervisor.
Lighting a project like this presents its own set of challenges and opportunities. The freedom to control every aspect of the environment down to the smallest detail means that for every shot, the filmmakers need to choose where the sun hangs in the sky, how every tree, bush, and blade of grass is positioned, and how to handle the shadows that result from those choices. Deschanel told us that if he doesn't like the way a shadow falls in a shot, he'll have the team digitally replace the tops of the trees to produce a shadow that better suits the mood of the scene. He's an old-school filmmaker who has essentially been learning these new techniques on the fly, and that steep learning curve means the team knows more now than it did when filming first began.
"I wish we could finish the movie, then start all over with all the tools we discovered along the way, because I look back at some of the early stuff – when we did the canyon chase with the wildebeests early on and we were filming and I was struggling with getting light where I wanted it. And then weeks later I discovered, 'Well, if you don't like the light there, we can take these mountains, we can just drop them down and get the sun coming through where you want it.'"
As for the downsides, one of the only negative things we heard about the process came from Deschanel, who seemed to genuinely enjoy stepping into this unfamiliar world in every regard except one:
"For me, the more there is in front of me that looks like what I wanna photograph, the better it is. And oftentimes that means that there's a lot of memory that's needed by the computer. And sometimes you'll max out the computer and they'll say, 'Well, we gotta take those trees out,' or, 'We gotta take this distant hillside away.' And then for me, if you do that, it eliminates one of those elements of composition and what I wanna see in frame. So I then have to imagine what things are like in terms of how I frame it and how wide the shot is. There's lot of second convolution thinking that goes on to make decisions."
A "Holy S***" Reaction
While we were on the set, we watched Favreau and his team shoot some of the "Can You Feel The Love Tonight" sequence, which featured Simba (Donald Glover) and Nala (Beyoncé) walking side by side and singing. But because we were getting such an early look at the process, the animatic we saw was far from fully rendered. We got a sense of the timing of the shots and could see how Simba and Nala were being framed, but, strangely, it felt more like watching a team play a computer game than watching a group of filmmakers making a movie.
But when we visited the screening room a few minutes later, we got a look at a piece of test footage that made my jaw drop. It was a simple shot of Rafiki, the wise baboon who advises Simba, just sitting and looking at the camera. He wasn't doing anything exciting – he blinked, breathed, maybe brushed a fly away – but it looked so real that it was indistinguishable from an episode of Planet Earth or something you'd see on the National Geographic channel. Rob Legato told us that when they saw that, they knew they had something special on their hands.
"We did stuff that we were really proud of on Jungle Book. And then we saw that and it's literally, 'Holy s***.' And then you start bringing everybody in the room. 'You've got to see this thing.' And so that's what we've been doing. As soon as we saw it, it wasn't like, 'Yeah, that's an interesting test.' You might really be like, 'Holy fuck!' Because it's astounding that it got to that level from just a couple of years of technology that we did on Jungle Book.
"And I can't take any of that credit for it. It's the geniuses who write all the software and the fur combing tools and all the stuff that I would never even think to even ask about. All the grooming and all that stuff. And every little hair that has a different sort of tensile strength. It's unbelievable. And then Andy's animation, and they've gotten used to now taking a piece of documentary footage or real footage, emulating it to the point where it's almost impossible to tell the difference, and they build the model so well. They've learned a lot of lessons from Jungle Book. It was a 'holy s***' moment, it really was. And we've said, if we could make a whole movie that looks like that, we think Disney might be really happy with that."
How The Actors Get Involved
You've probably seen footage of actors standing in a sound booth and recording dialogue for animated films. It's common practice to film those recording sessions to give the animators reference so they can incorporate the actor's personality into the performance – think of Robin Williams as the Genie in Aladdin. But Favreau didn't want the actors to be isolated, so he created a space on the volume with reference cameras where the actors could read their lines together, which allowed his team to capture little details they wouldn't get in a typical solo recording session. For him, capturing the physicality was important: the timing of when actors blink, how they turn to each other, how far apart they stand, even how the actual volume of their voices changes with the distance between performers speaking to one another.
Animation supervisor Andy Jones explained why having that physical interaction is such an important part of their process:
"An animated animal is more anthropomorphic, and [the original film gave] them a lot more emotion. We're trying to infer a lot of that emotion through what the animals can really do instead of try to force it. So it's a big challenge, animation-wise, to get the performances right and keep it subtle and keep the audience still as engaged as you were in the original film. That's probably our biggest challenge.
"We do have a black box where we're able to bring some of the actors together where they have a scene together and actually have them be off-book a little, and get some eye contact patterns and certain things that we can use for their performance of the animals to really help...It's always better when you're doing something so subtle and so detailed with performance that you're getting really natural performance timings that are consistent, even though you're giving the same character to multiple animators. You're getting consistent performance because they're looking at the same thing."
This movie has an incredibly talented cast, and while these actors will technically be playing the same characters who appeared in the 1994 animated classic, they face a whole different set of challenges. Most notably, they don't have the benefit of hand-drawn animation that anthropomorphizes their characters. It's easier to accept a lion, a meerkat, and a warthog joking around and hanging out in animated form than when they look totally realistic, as they will here. So how will this version of Timon (Billy Eichner) and Pumbaa (Seth Rogen) be able to pull off the comedy those characters require? Andy Jones has faith:
"Comedy's always hard, and especially if you don't have anthropomorphic performance, it's even harder. But Jon has a really good taste for it and a lot of our comedy is coming from charm. We did that with Jungle Book as well. You try to make the animals charming and it kind of makes you laugh and I think that's what we're doing with Seth and Billy. We did have good black box sessions with them where they really started to improvise a lot and created some interesting takes that the writers didn't think about, and we can try and use some of those pieces in the film...I think Seth embodies Pumbaa really well (laughs). In terms of the character, his voice, the tone, and how he performs it. There's a certain level of charm and kind of innocence to his performance that actually is working really well for Pumbaa. And likewise with Eichner and Timon. He's got this kind of sarcastic approach that's working really well, too. And I think those two characters are really fun in the movie."
The Technology Used On Set
While some of the technology used to make The Lion King existed before, the filmmakers explained that the reason they're able to make the film at all is that consumer VR products have improved so drastically over the past few years.
"We're using the HTC Vive a great deal," producer Jeffrey Silver said. "We use the Oculus Rift to some extent. Consumer VR has gotten so sophisticated that a year ago [in late 2016], this really wouldn't have been possible. By the way, we started this a year ago projecting that it would be possible. And we're getting better at it every day. This has been a 'laying the track before the train' kind of a process. So we're on the bleeding edge of technology in this respect. But it's thanks to the development of all these consumer-facing products that we've been able to make a professional application out of it. But it's a daily struggle."
Ben Grossmann described how The Lion King is breaking new ground when it comes to the connectivity of its technology:
"VR, the real-time game engine component to it, little bits and proofs of concepts had existed before, but the software that you see on the stage was pretty much written from scratch. And all of the connective tissue that bridges the visual effects department with the animation department with the lighting team, all of that stuff is the first movie it's been done on."
And finally, AJ Sciutto laid out how this film's tech differs from that of any movie made before it:
"Virtual production systems have been gaining prowess and becoming better and better. Spielberg's Ready Player One. Jungle Book. Stuff like that. Each version is getting better, and this is leaps and bounds better than anything that's ever been done before. The ability to move stuff and be in the world and seeing it in 360...your brain kind of understands stuff through a lens, but once you spatialize yourself in the world is when you really start to understand framing and composition. That's really what's allowing this to be so much more special than anything that's ever been done before."
The Lion King is in theaters now.