The leading VFX and animation studio partnered with Dimension Studio on an end-to-end virtual production and visual effects effort spanning more than 1,600 shots and 200 assets for Apple TV+’s harrowing miniseries, which follows the exploits of the 100th Bomb Group flying daytime missions over Nazi Germany during the darkest days of World War II.
When Steven Spielberg and Tom Hanks collaborated as director and star on the multi-Oscar-winning Saving Private Ryan, a creative partnership was forged out of their mutual fascination with World War II that eventually saw them executive produce Band of Brothers and The Pacific for HBO. They have returned, this time for Apple TV+, with a bird’s-eye view of some of humanity’s darkest days in Masters of the Air, which follows the daytime bombing exploits of the 100th Bomb Group, a unit of the U.S. Eighth Air Force.
Taking charge of adapting Donald Miller’s book into a dramatic miniseries were John Shiban and John Orloff, who turned to the VFX team of supervisor Stephen Rosenbaum, producer Bruce Franklin, and additional on-set supervisor Adam Rowland to digitally provide the epic scope, scale, and visceral peril and devastation associated with that conflict’s aerial warfare. A battalion of vendors was recruited that included DNEG, which in turn partnered with Dimension Studio to establish DNEG Virtual Production; the end-to-end virtual production service was responsible for over 1,600 shots and 200 assets featured in 32 sequences across eight of the nine episodes.
Before Game of Thrones, Band of Brothers set the bar for high-end TV visual effects, a legacy that had to be maintained. “I remember watching Band of Brothers over and over again,” states Xavier Bernasconi, VFX Supervisor, DNEG. “Over 20 years later, having the opportunity to join the show was frightening and exciting at the same time. It was incredible, but it came with a lot of expectations and the responsibility of doing justice to such amazing previous shows as well as to the story we were telling.” The obsession with World War II was not confined to the creators of Masters of the Air alone. “It’s one of the most interesting parts of the show,” notes Steve Jelley, Co-CEO, Dimension Studio. “Look at the documentary The Cold Blue, which has the only available footage shot up there with the B-17s; the incredible amount of detail that went into trying to ensure that this show represented the history of this incredible period shows in the final result. The attention to detail of the mission books, trying to recreate the actual movements of these pilots, the way they communicated, the size of these formations, the freezing temperatures they were subjected to, it was a match made in heaven.”
In order to simulate the dangers and wonders that go along with flying, a virtual production methodology was adopted. “In a lot of setups that we had on set, there was this huge gimbal motion base with the B-17 cockpit on it, which helped the actors feel like they were there in that moment,” remarks Bernasconi. “That’s where the virtual production really came through, such as in Episode 109 when they’re flying towards flak explosions at 500 miles per hour. The actors had to look left and right, and jostle because the entire motion base was moving. With a greenscreen, I don’t think that they would have been able to experience it in such a way.”
With the absence of buildings, clouds became landmarks in the sky that oriented the viewer as to where the action was unfolding. “The only way to make sense of the story was to ensure that the cloudscapes were recognizable,” notes Jelley. “Those were built as proxies in the volume for interactive lighting, so you would feel the lighting on the actors’ faces while special effects went off, showering glass in the cockpit. That same concept was then carried into the visual effects work that DNEG did around making this make sense spatially. From the actors’ perspective, they needed to understand their orientation and the storytelling of the action to such an extent that the aerial sequences made sense to them. They were also surprised. We were able to do live triggers of flak popping on the screen and then shake the motion base so they would feel something go off and be able to react to it physically. The combination of planning, visual reference, and then surprise and action was part of this great drama that was brought together by the director. It was incredibly complicated but so worth it, because that’s what the audience sees when they feel part of these missions.”
Bernasconi emphasizes the importance of cloudscapes, both visually and narratively.
“A clear moment is Episode 105, when Major Robert Rosenthal [Nate Mann] does the split S and then the chandelle maneuver. It’s hard to understand what is happening. He goes portside, starboard, again portside and pitches up. If you don’t have anything surrounding or recognizable, you don’t know what is happening. We had some recognizable clouds without taking too much attention away from the story in the middle of the screen. That lets you realize that the clouds that were on his left are now on his right. Before they were below him and now they’re above. We gained some geographical landmarks for the audience to understand where the German fighters were coming from. We also used them to create some interesting moments. In Episode 101, you see Major Gale Cleven [Austin Butler] pitching down quickly, trying to exhaust the fire on the engine, and then you see in front of him these big clouds. Then suddenly we cut to a side view, and behind the clouds you see the German fighters using them as a hiding device; they come around and attack Cleven. Stephen and I came up with that during post, and that’s one of the most creative moments I remember because you feel a part of the storytelling process.”
Having a sense of the geography of the land below was also important, as it informed the viewer where the action was taking place. “That was a huge challenge because, if you think about it, every single episode has a mission except Episode 106,” remarks Bernasconi. “This means we had to cover thousands and thousands of kilometers of landmarks. You need to establish some visual representation that gives the audience an understanding of where we are in Europe. In Episode 105, you clearly see that we’re going over the English Channel. You see both sides of the channel, so you understand that you are crossing it. Then you are over Germany with Berlin, then we show the Alps, Genoa, Corsica, and Africa. You need to find some clear visual references so the audience will be like, ‘Oh, they are there. Now it makes sense.’”
Creative license had to be taken to match the expectations of the viewer and ensure the believability of the imagery. “We have a visual image in our brain of what something should look and feel like,” states Bernasconi. “Then we go and look at real reference and see that they are different. There is a discrepancy sometimes between what we want to see and what reality is. Camera shakes were complicated. There are entire scenes where there is a 3D camera shake built into the actual cameras, so the entire plane would shake. Our 3D rig of the plane included wing wobble, so if you shake the plane because of flak, the wings would wobble. If you were taking off, you would see the wing shake at high frequency. That was one of three or four layers of shake, depending on the shot, that we would add. We would have shake from engine vibration, which is high frequency and low amplitude; turbulence or air shake, which is more like a wobble, high amplitude but low frequency; and then we would add external factors such as bullet hits and flak, which would usually be high amplitude, quite powerful, and high frequency as well.”
“On top of that, you would have to have something called a decoupling of the background and foreground shakes, meaning that our cockpit would have a shake that is slightly ahead of the environment shake,” continues Bernasconi. “This is to create a parallax between the two of them. Otherwise, all that you get is the entire frame shaking left and right. But that’s not what happens in reality, because the environment is further away than your cockpit. It would have a certain decoupling. That is not something which is science per se. It was done by eye, and it was a lot of trial and error. Stephen was adamant about getting the right amount that would not distract from the storytelling but would be interesting visually and bring you into the scene. On top of that, you also have the acting of the cast. Let’s say the special effects department shakes the cockpit on the motion base; you would see the actors reacting to that. We had to sync our environment shake with their acting and make sure that it was in time with that movement. That was an additional layer of complexity on the entire effort.”
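To make the idea concrete, here is a minimal sketch, in Python, of the kind of layered shake and foreground/background decoupling Bernasconi describes: a few noise bands summed for the cockpit, with the environment shake scaled down and offset in time so the two never move in lockstep and the eye reads parallax. The layer names, frequencies, and amplitudes are illustrative assumptions, not the show’s actual rig.

```python
# A rough sketch of layered camera shake with foreground/background decoupling.
# All numeric values are illustrative assumptions, tuned "by eye" as in the article.

import math
import random


def noise(t: float, frequency: float, seed: int) -> float:
    """Cheap band-limited noise: a few summed sinusoids with seeded random phases."""
    rng = random.Random(seed)
    return sum(
        math.sin(2 * math.pi * frequency * k * t + rng.uniform(0, 2 * math.pi)) / k
        for k in range(1, 4)
    )


def cockpit_shake(t: float) -> float:
    """Sum of shake layers: engine vibration, air turbulence, and a flak hit."""
    engine = 0.02 * noise(t, 18.0, seed=1)   # high frequency, low amplitude
    air = 0.10 * noise(t, 0.8, seed=2)       # low-frequency wobble, higher amplitude
    # Flak burst: strong, high frequency, decaying quickly around t = 2 s.
    flak = 0.25 * noise(t, 12.0, seed=3) * math.exp(-((t - 2.0) ** 2) / 0.1)
    return engine + air + flak


def background_shake(t: float, lag: float = 0.04, scale: float = 0.6) -> float:
    """Environment shake trails the cockpit shake slightly and is damped,
    so foreground and background never move in lockstep."""
    return scale * cockpit_shake(t - lag)


if __name__ == "__main__":
    for frame in range(48):                  # two seconds at 24 fps
        t = frame / 24.0
        print(f"{t:5.2f}s  cockpit {cockpit_shake(t):+.3f}  env {background_shake(t):+.3f}")
```

In a real compositing or animation package, the two curves would drive the camera and the environment plate separately; the point of the sketch is simply that the background copy is both attenuated and delayed relative to the cockpit shake.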
The potential for special effects to damage the LED screens was not a major concern. “The LED is fundamentally a light source, and that’s how we were using it,” observes Jelley. “We had virtual flak and clouds going past, but what is difficult to communicate to an audience or readership is the scale. We had an enormous motion base with parts of B-17s on it at various points. We had a huge wraparound screen, and there was a big distance between the screens and the special effects blasting glass or creating explosions. The virtual flak was a visual effect in the end. It’s a choreography. You can’t see the big gap between where the action happens and where the screen is, and since we’ve done this show we’ve had rain bars and fire in front of an LED screen, so they’re surprisingly robust.”
Masters of the Air was one of the first shows where Jelley and Bernasconi got an opportunity to do end-to-end virtual production and visual effects. “We started off with a conscious decision from the director to the DP and all of the heads of departments that the only way to achieve the lighting they wanted was to actually build the planes and have the actors on a motion base being moved through space, with interactive lighting provided by the LED volume,” remarks Jelley. “We went through the eyelines, sightlines and the action itself, and choreographed that on set. But we put all the learning and prep into creating the ‘cloud atlas,’ which is the internal name given to the types of clouds that we needed to model and that Xavier then built enormous simulations of to create the landscapes in visual effects. It was a continuous creative process, and that’s what is so interesting. All these elements add up to the visceral feeling that you’re up there.”