
Image Engine Swings into Action with ‘Spider-Man: Far From Home’ VFX

VFX supervisor Joern Grosshans and team tackle the Stark Jet, CG drones and satellites, tulip fields and a Spidey suit-building machine in Sony Pictures Entertainment’s action-adventure hit.

‘Spider-Man: Far From Home.’ ©2019 CTMG, Inc. All rights reserved. Note that the Image Engine team featured in the article did not work on this particular shot. Image Engine shots are featured below.

To help visualize the overseas adventures of famous New Yorker Peter Parker on his European vacation in Marvel’s latest Spidey franchise hit, Spider-Man: Far From Home, Image Engine worked on 230 shots featuring CG drones and satellites, the Stark Jet, tulip fields and a Spider-Man suit-building machine.  Returning as production visual effects supervisor on the second solo MCU outing for the teenage superhero was Janek Sirrs (The Matrix), who connected with Image Engine via cineSync twice a week during post-production to keep abreast of the work.  “Janek always had the whole movie in mind as we were working on the bigger chunks and getting more into detail,” Image Engine visual effects supervisor Joern Grosshans remarks.  “He has a lot of experience working with Marvel Studios and was able to guide us through the process.”

Looking to avoid disclosing potential spoilers associated with Avengers: Endgame, Marvel required tremendous secrecy around the movie throughout production.  “We weren’t allowed to send images or emails,” Grosshans reveals.  “The artists’ workstations had no Internet access.  It helped that we have a good IT infrastructure to make sure that every project is secure here.”

Extensive principal photography took place at various locations in Europe, so the production didn’t have to rely on soundstages at Pinewood Atlanta Studios.  “For me, it’s easier creating CG elements if you have a shot captured on location with real lighting rather than artificial studio lighting,” Grosshans notes. “The edits were constantly changing, and you need good senior people in the crew to turn around changes with high quality and a solid pipeline, as technology is a big factor.  Even if Marvel Studios had new shots or scenes, we could be fast to update them.”

Image Engine was involved with conceptualizing sequences such as the Spider-Man suit build.  “Though there was a scan of the suit, the challenge was that we had to show what was under the main black and red layer as well as the mechanics that put it together,” says Grosshans, who utilized real-world references for the high-tech machinery.  “The first stage is the white suit, which is made out of wet fibers. For that, we looked at some big industrial weaving machines that have a distinct pattern.  There was also a visual connection to spiderwebs with regards to how the threads come together.  In the second stage, which happens on top of the white suit, there is another layer of fabric detail going on for the red and black suit.  The last step is where the robot arms build all of the hard-surface stuff like the eye and chest patches.”

In the film, the Stark Jet lands in a tulip field in the Netherlands; the scene was actually shot on a grass field in the UK.  Most of the shots are fully CG, with the only real elements being Peter Parker (Tom Holland) and Happy Hogan (Jon Favreau).  “We had to simulate a whole tulip field with some wind motion, but there’s also the action of the jet, which is landing,” Grosshans explains.  “Can you imagine how many tulips and single petals there were?  There was quite a lot of R&D time involved in figuring that all out. Our own technology tools helped us to do the layout for the whole sequence.”  Rendering such a heavy polygon count efficiently was not easy.  The challenge was figuring out what needed to be simulated live versus pre-cached and inserted. “We had pre-cached simulations for the backgrounds, while ambient motion and the hero interaction with the jet were simulated,” he adds. “Gaffer is a great tool that we modified in-house for this project.”
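As a rough, purely illustrative sketch of that pre-cached/live split (not Image Engine’s actual Gaffer setup; the distance threshold, data layout and function name below are assumptions), one common approach is to tag each scattered instance by its proximity to the hero interaction zone and only run a live simulation on the nearby set:

```python
import numpy as np

# Hypothetical sketch: split scattered instances into "live sim" vs. "pre-cached"
# groups based on distance to an interaction zone (e.g. the landing jet).
# All names, radii and data layouts here are illustrative assumptions.

def partition_instances(positions, interaction_center, live_radius=15.0):
    """positions: (N, 3) array of instance root positions in world space."""
    dist = np.linalg.norm(positions - interaction_center, axis=1)
    live_mask = dist < live_radius        # close enough to react to the jet
    return positions[live_mask], positions[~live_mask]

positions = np.random.rand(2_000_000, 3) * [200.0, 0.0, 200.0]   # stand-in layout
jet_touchdown = np.array([100.0, 0.0, 100.0])

live, cached = partition_instances(positions, jet_touchdown)
# "live" instances would feed a per-shot dynamics solve;
# "cached" instances would simply read a looping ambient-wind cache.
print(f"live: {len(live):,}  cached: {len(cached):,}")
```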

Tulip fields are extremely vibrant and, in reality, almost look like CG.  “We did a layout to fill up the fields, to get the rows right and place real actors in a position that makes sense as you’re cutting back and forth with the dialogue scenes,” Grosshans states.  “After that you go to look development and figure out how to make it efficient.  You have to do all of the render tests.”  Eight colors of tulips were produced, including yellow, purple, white and red.  “Every color had a variety of eight different tulips that were hand modelled, textured and crafted,” Grosshans continues.  “Then we procedurally scattered them.  We had something like two million tulips.  It was math-based, so if there were layout changes, we could procedurally change the width of the rows or the path size.”
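A minimal sketch of that kind of math-driven layout (illustrative only, not Image Engine’s tool; the row spacing, jitter, path width and variant scheme below are assumptions) might scatter instance positions in parametric rows so that row width and path size stay editable parameters:

```python
import numpy as np

# Illustrative procedural row scatter: tulips placed in parametric rows so that
# row spacing and walking-path width can be changed without redoing the layout.
# Every parameter and the 8-colors-x-8-variants scheme are assumptions here.

def scatter_rows(field_size=(200.0, 200.0), row_spacing=0.3, plant_spacing=0.1,
                 path_every=20, path_width=1.0, jitter=0.03, seed=7):
    rng = np.random.default_rng(seed)
    points, variants = [], []
    num_rows = int(field_size[0] / row_spacing)
    for r in range(num_rows):
        if r % path_every == 0:            # leave a walking path between blocks
            continue
        x = r * row_spacing + path_width * (r // path_every)
        zs = np.arange(0.0, field_size[1], plant_spacing)
        xs = np.full_like(zs, x) + rng.normal(0.0, jitter, zs.shape)
        zs = zs + rng.normal(0.0, jitter, zs.shape)
        points.append(np.stack([xs, np.zeros_like(zs), zs], axis=1))
        # deterministic pick among 8 colors x 8 hand-built variants per color
        variants.append(rng.integers(0, 64, zs.shape))
    return np.concatenate(points), np.concatenate(variants)

pts, variant_ids = scatter_rows()
print(f"{len(pts):,} tulip instances with these toy parameters")
```

Because the positions come from parameters rather than hand placement, a layout change such as wider paths or tighter rows is just a parameter edit followed by a re-scatter.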

For the Stark Jet, Marvel Studios provided Image Engine with concepts to work from. “We looked at real-world references for luxury private jets to see how the material and dirt works in order to bring some realism to this futuristic jet,” Grosshans says. In the third act, the action shifts to London, where Hogan and Parker’s classmates witness the Stark Jet crash.  “We did a layout simulation with rough geometry to see if the scale was right and whether or not it worked within the cut,” Grosshans shares.  “After everyone was happy, we did a simulation for the jet, which included two parts.  You had the hard-surface stuff that breaks and a simulation for the actual explosion.”

Some of the fire and embers following the crash were practical elements. However, the vast majority were CG in order to provide filmmakers with complete control of the timing.  “The explosion happens in front of the Tower of London,” Grosshans notes. “There is a grass area where we had to create some impacts and fires.  We did a matte painting to get the burned grass area right, and put some CG and practical fires on top to create the aftermath destruction.”

Conveying a sense of scale in outer space for the shots involving the satellite and drones was not easy for Grosshans’ team, which surveyed considerable satellite imagery as reference for several different designs. “Normally, for environments, you can convey scale through atmosphere,” Grosshans remarks. “But this is not possible in space. We had to create a detailed model and texture map to get the scale right.”  At the head of the satellite is a mechanism that opens up and reveals a swarm of drones.  “The swarm had to convey a feeling of intelligence rather than just flying straight out of the satellite,” he continues. “This was done using a particle simulation with some tricks to get a technology pattern. And since in outer space the sun is the sole key light, in the shadow areas we tried to get some reflection from the Earth.”
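As a loose illustration of that idea (a structured-emission sketch, not Image Engine’s actual particle setup; every value and name below is an assumption), a swarm release can read as “intelligent” simply by staggering each drone’s launch and easing it toward an assigned formation slot instead of firing it straight out of the emitter:

```python
import numpy as np

# Hypothetical "smart-looking" swarm release: each drone gets a staggered launch
# time and an eased, arcing path toward its slot in a ring formation, rather than
# flying straight out of the satellite head. All parameters are assumptions.

def drone_position(i, t, num_drones=400, launch_interval=0.02,
                   ring_radius=30.0, travel_time=4.0):
    emit_point = np.zeros(3)                       # mouth of the satellite head
    local_t = np.clip((t - i * launch_interval) / travel_time, 0.0, 1.0)
    if local_t <= 0.0:
        return emit_point                          # not launched yet
    angle = 2.0 * np.pi * i / num_drones           # slot on the target ring
    target = ring_radius * np.array([np.cos(angle), 0.2, np.sin(angle)])
    ease = local_t * local_t * (3.0 - 2.0 * local_t)          # smoothstep easing
    arc = np.array([0.0, 8.0 * np.sin(np.pi * local_t), 0.0]) # mid-flight bulge
    return emit_point + ease * (target - emit_point) + (1.0 - ease) * arc

# snapshot of the whole swarm 2.5 seconds after release begins
swarm = np.array([drone_position(i, 2.5) for i in range(400)])
print(swarm.shape)  # (400, 3)
```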


Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.