We’re now more than 75 years removed from the attack on Pearl Harbor, and the ensuing Battle for the Pacific, events that defined the American experience of the Second World War and continue to live on in the public imagination. The upcoming film Midway, directed by Roland Emmerich and featuring a star-studded cast, revisits the period, giving new audiences a fresh glimpse into the drama of the war years.
There’s a special responsibility attached to any historical project, since audiences will all have preconceived ideas about what things should look, feel and sound like. That is doubly true for any World War II narrative, since those years have spawned countless film, television, and documentary projects. The need for accuracy is what brought VFX company Pixomondo to the project. We had the pleasure of speaking with Pixomondo Toronto’s VFX Supervisor, Phil Jones, and CG Supervisor Evgeny Berbasov, about the difficulties involved in bringing the Pacific Theater to life, and how a major transition in their rendering software, from V-Ray to Arnold, impacted their work.
Simulating explosions on water
The Pixomondo team worked on roughly half the film’s shots, largely focusing on the events after the attack on Pearl Harbor, with the largest sequence being a dive-bombing attack on the Japanese aircraft carrier Akagi.
Phil Jones: We worked on that whole sequence of an American captain and his squadron of planes attacking the Akagi, and eventually hitting it. The sequence involved multiple complexities within each shot. Two of the main ones, for example, were the dive-bombing itself, with a close zoom on the planes releasing their bombs, and then the explosions themselves as the bombs finally struck the carrier.
Evgeny Berbasov: It was explosion after explosion, in slow motion, with a cool roar effect, and a lot of added complexities: volumetrics, water, solid pieces of the plane or ship, crowds, tracers. A lot of assets visible at the same time; we accumulated everything we had into one shot.
Making a CG ocean look real
One of the main challenges of the work was the condensed timeline. Previsualization finished in October 2018, and Pixomondo had to deliver the final shots in September 2019, giving them just under a year to develop and refine their assets. Less, in fact, because by March 2019 the director wanted to make adjustments to an entire sequence.
Phil Jones: For us, it was a lot of different shot changes and additions - going into the cockpits more, adding a personal level to the battle sequences. Camera angles changed, and new animations were added, so those were adjustments we had to make on our end. A big part of our work became choreographing flak explosions, which are basically big, black clouds of shrapnel. We had to play around with that a lot to get it right: adding more smoke here, removing some there, and playing with the tracer fire. We had done some similar work on the Star Trek television series, but obviously with a completely different look.
Interestingly enough, one of their biggest challenges involved perfecting the ocean visuals.
Phil Jones: Getting the look of the ocean right was a challenge. Our shots would start at 20,000 feet in the air, looking straight down into the ocean, and the thing is, there aren’t many reference shots for that, of that straight-down angle, because the ocean is pretty boring when you look straight down on it. As our plane (and therefore our camera) got closer and closer to the ocean, it started to look more and more like you’d expect an ocean to look, but when you’re super high up, panning 180 degrees to either side, it just looks like a blue blob. The sky, at least, has clouds and the sun, but the ocean from that distance looks surreal. We actually had one of our supervisors in Los Angeles take a flight in a small plane over the ocean to take some reference shots for us, and in the email he sent us with the photos, he wrote, “Here are the photos, but it looks like CG.”
That’s a lesson they took to another aspect of the work, the creation of “flak,” which most modern viewers will have never seen or heard of. Flak is the byproduct of anti-aircraft artillery shells, which would explode in the air, creating dangerous dark clouds of debris fragments. If the anti-aircraft shell didn’t score a direct hit, the flak would sometimes be enough to damage the engine or fuselage, or even kill the pilot. If nothing else, it made flying through it much more difficult and dangerous.
Evgeny Berbasov: The real thing sometimes doesn’t look good, actually. We ran into something similar in one of our major flak sequences. We created what we expected to see, based on reference material, but the director wanted more. “No, I want hundreds of them, all over the place. The whole sky is explosions.” So we had to think hard about how to approach that. We couldn’t simulate every shot; that would be super complex. Ultimately, we created the asset and played it like a character, so it could perform those explosions wherever they were needed.
Phil Jones: The animators were actually composing explosions in proxy mode, as they were animating the airplanes!
Switching to Arnold for rendering
Pixomondo also went through an interesting technical transition, switching their rendering software from V-Ray to Arnold, company-wide, and both Phil and Evgeny shared their insights into that change. The first major benefit of the switch, for a company with offices all over the world, was compatibility.
Phil Jones: At Pixomondo, we have a global pipeline, so our transitioning to a single rendering software enables us to share assets between facilities, streamline our look, and develop custom extras. That’s a big benefit when you’re constantly working under a deadline.
But why Arnold?
Evgeny Berbasov: Integration is a major benefit of Arnold. It’s natively supported by products like Maya and Houdini, so it can act as a bridge between our teams. If one team creates a little dove in Houdini, it will look exactly the same in Maya. Arnold can also export all of the assets as stand-ins; since it’s part of Maya, we just load them up in Maya and render them really quickly. And finally, since Arnold comes with Maya for free, it’s easy to find students or people just starting out who are familiar with the software. Arnold is also relatively fast, so we were able to render tons of geometry and volumetrics in one scene. Some scenes were upwards of 200 million polygons, so for Arnold to be able to transfer such large amounts of data was huge. Our effects data at the end of the day was about 100 terabytes.
Phil Jones: It was more than that. We were getting yelled at constantly because all of our caches were taking up – I think, in the end, it was like 150 terabytes of data. Just for effects cache.
Evgeny Berbasov: It’s a lot to manage. I was actually surprised that Arnold and Maya could handle it. It made us more agile.
Phil Jones: As for the transition itself, it was a learning experience. Our veteran lighters tend to pick up new renderers very quickly, and then our seniors pass on the knowledge they’re learning or discovering themselves. Not everybody has experience with every single renderer, but when you have enough experience in even just one, it’s pretty easy to move to another. There are exceptions, though. The shading networks are slightly different in V-Ray than in Arnold, so there was a lot of learning how to do the same thing in two different places.
Evgeny Berbasov: Arnold is more for artistically driven people. You’re actually making the picture, and the renderer is calculating the final result. There aren’t that many knobs you have to turn to get the result you want. Arnold is very approachable in that way.
Midway opens in theaters nationwide on November 8th, so audiences will have a chance to appreciate first-hand all of Pixomondo’s hard work.
Pixomondo’s Los Angeles location was the lead facility on the project. The studio’s Toronto, Vancouver, and Stuttgart facilities were also involved in bringing Midway to life. Pixomondo was a co-producer on the film as well.