In War for the Planet of the Apes, the third film in Fox’s reboot of the 1968 franchise, the apes are once again the emotional and narrative center. Caesar, played by Andy Serkis in a motion-capture performance, leads his tribe against the humans in a war for both survival and the planet. As with 2011’s Rise of the Planet of the Apes and 2014’s Dawn of the Planet of the Apes, the process of using motion capture on real actors to create the primate characters is essential to ensuring that the apes look and feel realistic. Unlike in other VFX-driven flicks, the Apes franchise emphasizes that every character comes from an actual live performance on set.

Visual Effects Supervisor Dan Lemmon has worked on all three films with his team at Weta Digital in New Zealand and knows the process intimately. He explained how the apes are made, the challenges involved and what technological strides were made between the 2014 movie and this one.


Where do you start when it comes to visualizing the apes?
In terms of the creative process, it all starts where all movies start, which is the script and the casting. Part of what’s unique about our approach for creating these digital apes is that we try to make it as much like the traditional filmmaking process as possible. We cast actors for each character and the director works with them on the set. They rehearse, try things out, shoot multiple takes. The actors playing apes wear these funny pajamas with dots all over them and our job is to change their appearance so rather than looking like Andy Serkis or Steve Zahn, they look like Caesar or Bad Ape. It goes back to the original Planet of the Apes movies, where, in 1968, they’d put all this makeup and prosthetics on the actors and go to great lengths to make them look like apes. Our process is pretty similar in that we’re still using actors to drive the performance. The technical aspects are just different.

How important are the actors’ performances in creating the apes?
It’s super important. The thing about our approach is we’re all about trying to allow the director and actors to work together on location and react to each other. We want them to be there together and explore the scene together – in the same way you would on a normal live-action movie. With the more traditional VFX approach, the way it’s been done on some movies in the past, there’s a tennis ball on a stick and you have to pretend it’s Caesar the ape. It’s a totally false situation where the actor doesn’t have a lot to react to or play off. So if you can get actors playing all the characters, you get a better response and a better performance out of everyone. I think there’s a place for both kinds of approaches, but certainly for these movies, where we’re trying to put these characters into the real world and get the audience to buy into the idea that they’re living, breathing characters, it’s helpful to have all the actors there at the same time.

So every ape in the film is played by an actor, even in the big crowd scenes?
Those are all actors. What we do for big background scenes like that is have the primary characters there on set on the day and then fill in a lot of the background later. We still use performers who are directed, but that happens separately, on a dedicated performance-capture soundstage.

Do you train the actors on the process beforehand?
Not really. That’s part of the beauty of the process. They have to get used to the idea of wearing this funny suit and all this equipment strapped on them. But if they can ignore that and play their character and not worry about the technology, that’s best. It’s no different than any other character in any other movie. There’s a physicality to being an ape, but in the same way there would be if someone were playing an athlete in another movie.

Is there an aspect of the VFX process that has gotten easier over the three films?
We’ve made lots of improvements each time we’ve come back to the franchise. On the first film there were a lot of unknowns. Would we be able to pull off this ape character and put him alongside these humans? Would the audience empathize with him? We didn’t want it to look like Roger Rabbit, where there’s a real world with cartoon characters in it. We wanted people to buy into the idea that these characters all live in the same space. We went to a lot of effort on that first film to make sure it was working. And every time we’ve come back to it our technology has improved. We’ve gotten better at dealing with the fur and making the lighting on it look more realistic. This film is a considerable leap forward.

How long is the process from being on set with the actors to having a finished digital effects ape?
It really depends. I’ve been working on the movie for about two and a half years. A lot of that process is preparation and the logistics of shooting the thing. After shooting, Matt Reeves, the director, takes all the footage from the set and cuts a movie together. That version of the movie has Andy Serkis and Steve Zahn and everyone in the grey suits. If it works emotionally in that version, then when we go through the process of making them look like apes, as long as we do our jobs and the apes connect emotionally in the same way the actors do, the movie as a whole will totally work. It only gets better as the realism and the suspension of disbelief are enhanced. That puts the onus on us to make sure our characters perform to the same level and hit all the same beats and have all the same subtlety as the actors on the set.

How many people are involved in the VFX process of creating the apes?
A lot! In a big scene on set we’ll have a team of about 30 people. They’re involved in capturing the data while shooting. But once we get back to New Zealand and the movie’s been cut and we have all the shots and we start actually working on it, we’ll have up to 1,000 people who touch the film. There are probably 800 or so who are actively working on shots and making the monkeys. It’s a lot of people.

How challenging is it to capture that data on set?
It’s a challenge. Our dedicated space in New Zealand is a big space and we do nothing but motion capture in it. If you’re setting it up from scratch it can take up to six weeks to get it all set up and calibrated and have everything fully humming. That requires a bank of 10 to 12 computers. When we’re on set we have to bring that and set it up in 10 to 20 minutes. We had to figure out how to take that volume that takes six weeks to set up and make it portable and lightweight and fast enough to get working on a shot that has to happen quickly. That’s a big logistical challenge.

Was any new technology created on this film that could be used in future films?
The process of invention never really stops for us. We’re always looking for ways we can push technology and we build on what we’ve developed in the past. On this movie we had a lot of furry creatures and characters who were pretty shaggy. We needed them to look as believable as possible. There was technology we changed at a microscopic level to make sure that our digital fur works like real fur. It’s ongoing, but we really turned a corner on this one in the realism of that process. The snow was a big thing for us. We hadn’t done snow quite to this level before and our characters had to roll around in it, which was pretty difficult. The fur needed to pick up the snow and then shake it off. Ideally, we develop it now and then build on it when we have other snowy movies later.

Was this film additionally difficult because there are so few actual human characters?
It was. The last movie had about 900 ape shots. This movie had about 1,400 ape shots. Because the story was told from the point of view of Caesar and we never really left him, there were apes in almost every shot of the movie. There were very few human-only shots. So we were involved in the entire two-hour-plus runtime of the film. We want the audience to connect to them as if they are human.