DIT Eduardo Eguia details the imaging process used on The Mandalorian

The Mandalorian follows the travels of a lone bounty hunter in the outer reaches of the galaxy, far from the authority of the New Republic. It builds on George Lucas’s vision of the Star Wars bounty hunter theme that tied together Boba Fett and Han Solo in the epic tale of Episode V – The Empire Strikes Back. That vision was brought to life by series creator and Executive Producer Jon Favreau.

Season 2 production wrapped in early February, and post-production is currently underway, with automated VFX pulls being delivered by FotoKem and final finishing being done at Company 3. Disney+ hopes to continue the momentum of this premier franchise, which helped launch its streaming service last year.

CODEX caught up with Digital Imaging Technician Eduardo Eguia, who has been in the role of DIT since season 1, working alongside Directors of Photography Greig Fraser, ASC, ACS, and Baz Idoine, joined on season 2 by Matthew Jensen, ASC. As showrunner, Jon Favreau is not only continuing to tell stories of adventures in galaxies far, far away, but also continuing to push the boundaries of how a production visualizes the final concept while still on set. Extending the pioneering VR work done on The Jungle Book and The Lion King, Favreau and ILM Visual Effects Supervisor Richard Bluff used the Unreal game engine to create incredible virtual set extensions.

Production for seasons 1 and 2 was primarily based at MBS Studios in Manhattan Beach, California, on multiple stages, with some exterior shots captured on location using a built-up backlot. With ample stage room at MBS Studios, the post-production teams were set up near set in order to quickly generate dailies for both picture and visual effects editorial.

On season 1, the near-set team was led by Pinewood Post in support of Greig Fraser, ASC, ACS, with Scott Fox as the dailies colorist. In season 2, the dailies post and visual effects pull responsibilities were turned over to FotoKem, who used their proven nextLab system near set to back up and encode the ALEXA LF ARRIRAW camera-native files, and to generate the editorial deliverables. FotoKem also engineered and set up a system to automate the visual effects pulls from the encoded ARRIRAW files and generate the OpenEXR plates for ILM and the other visual effects vendors.
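At its core, an automated pull system like the one described turns editorial shot selections into camera-file frame ranges (padded with handles) to render as OpenEXR plates. A simplified sketch of that bookkeeping, assuming 24 fps non-drop timecode; the handle count and example timecodes are illustrative, not production figures:

```python
# Simplified bookkeeping behind an automated VFX pull: convert editorial
# timecode ranges into frame ranges with handles. The handle count and
# the example shot are illustrative assumptions, not FotoKem's system.

FPS = 24
HANDLES = 8   # extra frames on each side of the cut (assumption)

def tc_to_frames(tc, fps=FPS):
    """Convert HH:MM:SS:FF non-drop timecode to an absolute frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def pull_range(start_tc, end_tc, handles=HANDLES):
    """Inclusive frame range to render as plates, padded with handles."""
    return tc_to_frames(start_tc) - handles, tc_to_frames(end_tc) + handles

# Example: a shot cut from 01:00:10:00 to 01:00:12:12
first, last = pull_range("01:00:10:00", "01:00:12:12")
print(f"render frames {first}-{last}")  # prints "render frames 86632-86708"
```

Each resulting frame range would then be decoded from the ARRIRAW source and written out as one OpenEXR file per frame for the VFX vendors.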

In speaking with Eduardo about the data migration process and color pipeline, CODEX wanted to learn about his DIT cart set-up, and the configuration of software tools used to manage the two seasons efficiently. “My DIT cart can process up to 6 cameras simultaneously, applying custom real-time color grading to each individual camera. I used Pomfort’s Livegrade Pro with an ACES workflow to achieve this process. From my cart I sent a color-corrected Rec.709 signal to the DP’s monitors, and to the Video Assist to distribute the color-corrected signal to the rest of set. In terms of looks, our starting point was a look-up table (LUT), created by Greig, based on a film stock. From there we did some adjustments to achieve his vision for the show, and we generated a base LUT. Whenever we needed to make changes, I created individual color decision lists, or CDLs, on top of this LUT, to achieve the desired look. These CDLs were sent to Scott Fox, the dailies colorist, to apply and balance the grades. The same process has continued with Baz and Matthew on season 2, except FotoKem applied the CDLs, and we use a show LUT from Company 3, who are providing the finishing services for season 2.”
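The per-camera CDL adjustments Eduardo describes follow the standard ASC CDL transfer function, applied per channel underneath or on top of a LUT. A minimal sketch of that math; the grade values are illustrative, not the show’s actual looks:

```python
# Minimal sketch of the ASC CDL transfer function that a grading tool
# applies per channel: out = (in * slope + offset) ** power.
# The example values are illustrative, not The Mandalorian's grades.

def apply_asc_cdl(rgb, slope, offset, power):
    """Apply ASC CDL slope/offset/power to an RGB triplet."""
    out = []
    for value, s, o, p in zip(rgb, slope, offset, power):
        v = value * s + o
        v = max(v, 0.0)          # clamp negatives before the power function
        out.append(v ** p)
    return out

# Example: warm up one camera slightly to match another
graded = apply_asc_cdl(
    rgb=[0.18, 0.18, 0.18],      # 18% grey input
    slope=[1.05, 1.00, 0.95],    # per-channel gain
    offset=[0.01, 0.00, -0.01],  # per-channel lift
    power=[1.0, 1.0, 1.0],       # per-channel gamma
)
```

Because a CDL is just ten numbers (three triplets plus saturation in the full spec), it travels easily from the DIT cart to the dailies colorist as a small sidecar file alongside the base LUT.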

In setting up the workflow for season 2, FotoKem employed CODEX’s High Density Encoding (HDE) process. HDE provides a bit-exact, lossless copy of the original ARRIRAW native files at 50-60% of the original size. This allowed dailies colorist Jon Rocke at FotoKem to store more of the ARRIRAW files near set, and to deliver faster turnovers using their automated visual effects pull system. “FotoKem could deliver the turnovers quicker with the nextLab system, since the files were effectively 2:1 smaller in data footprint.” The HDE files were then supported through the entire post-production pipeline, including final color with Charles Bunnag, a finishing colorist at Company 3.
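That roughly 2:1 footprint reduction compounds quickly at large-format ARRIRAW data rates. A back-of-the-envelope sketch; the per-frame size and daily shooting volume are assumptions for illustration, not production figures:

```python
# Back-of-the-envelope HDE storage estimate. The 0.55 ratio matches the
# 50-60% figure quoted above; frame size and shooting volume are assumptions.

ARRIRAW_LF_FRAME_MB = 19.0   # approx. size of one large-format ARRIRAW frame (assumption)
FPS = 24
HDE_RATIO = 0.55             # HDE files land at 50-60% of original size

def daily_footprint_tb(hours_of_footage, ratio=1.0):
    """Total storage in TB for a day's footage at the given compression ratio."""
    frames = hours_of_footage * 3600 * FPS
    return frames * ARRIRAW_LF_FRAME_MB * ratio / 1_000_000  # MB -> TB

raw = daily_footprint_tb(4)              # four camera-hours of raw per day
hde = daily_footprint_tb(4, HDE_RATIO)
print(f"raw: {raw:.1f} TB/day, HDE: {hde:.1f} TB/day, saved: {raw - hde:.1f} TB")
```

Under these assumptions a single day drops from roughly 6.6 TB to about 3.6 TB, which is what lets capture drives cycle back to the camera department sooner.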

“I have to say that seeing the difference in media processing between seasons 1 and 2, when HDE became widely available was fantastic,” says Eduardo. “HDE helped the workflow tremendously, not only to speed up the process of managing the media but by improving the turnaround time to get the CODEX capture drives back to the Camera department for reuse.”

This reduced the storage required to back up the media shot the previous day. “Before season 2 started, we discussed the value of deploying HDE with James Blevins, the post-production supervisor. We knew that utilizing HDE would be a very important workflow improvement for production and post-production. It proved to be the right decision.”

The visual effects supervisor for the production was Richard Bluff. The visual effects design was led by Industrial Light & Magic (ILM), the Academy Award-winning visual effects company founded by George Lucas. The VFX were guided across the series by a team of VFX Supervisors: James Porter, Hayden Jones, John Knoll, Alex Prichard, Steve Moncur, and Jose Burgos.

Richard Bluff also worked with various other visual effects houses to deliver on the enormous number of VFX shots in the production, such as Base FX, Image Engine, Important Looking Pirates, Ghost VFX, Hybride, MPC and Pixomondo.

The production made use of new virtual set extension technology pioneered by ILM. The idea of rear-screen compositing has been in use in filmmaking since the silent era, routinely deployed for background replacement in car windows and train scenes.

What wasn’t routine for The Mandalorian was the sheer scale of the virtual set, formally called Stagecraft. The LED volume created for the show was 20 feet tall, 270 degrees around, and 75 feet across, making it the largest and most sophisticated virtual filmmaking environment yet made and used in production.

The Stagecraft background is a wall of enormous LED panels. The innovation driving the production use of Stagecraft, and similar designs, is the advance to a finer pixel pitch, the spacing between the individual LED elements on the panels (see accompanying article). This allows the images displayed on the LED walls to look more photo-realistic because they are not static. Not only is the image on the LED walls played back in real time by powerful GPUs, but the 3D scene is directly affected by the movements and settings of the camera. If the camera moves to the right, the image alters just as if it were a real scene.
This is where the Epic Unreal game engine came into play, running on an array of powerful PC/GPU systems that were set up and controlled by a team of technicians, commonly referred to on set as “The Brain Bar”.
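The camera-driven behavior described above is, at heart, parallax: under a perspective projection, moving the camera shifts near objects across the frame far more than distant ones, and a tracked LED wall reproduces that shift in real time. A toy sketch of the geometry, not related to ILM’s actual renderer:

```python
# Toy illustration of camera-driven parallax, the effect a tracked LED
# wall reproduces: a point's projected position depends on the camera
# position and the point's depth. Purely illustrative geometry.

def project_x(point_x, point_depth, camera_x, focal=1.0):
    """Horizontal screen coordinate of a point under a pinhole camera."""
    return focal * (point_x - camera_x) / point_depth

near = (0.0, 2.0)    # (x, depth) of a nearby scene point
far = (0.0, 50.0)    # (x, depth) of a distant scene point

for cam_x in (0.0, 0.5):                   # camera slides 0.5 units right
    print(cam_x,
          project_x(*near, cam_x),         # near point shifts a lot
          project_x(*far, cam_x))          # far point barely moves
```

Sliding the camera right by 0.5 units moves the near point 25 times further across the frame than the far point, which is exactly the depth cue a static painted backdrop cannot provide.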

“The virtual backgrounds were astonishing, and a game changer on set for our industry. There was an initial learning curve to it, but the results speak for themselves and look amazing,” adds Eduardo.

“The Brain Bar team was in constant communication with the DPs, providing for a perfect integration of the virtual scenes with foreground real elements. It was also important for the ongoing coordination between the DP, the Brain Bar, the Gaffer and me, not only to achieve the best results, but for the steady blending of the looks, as any color adjustment by anybody on either side affected the image.”

Eduardo had a professional 4K OLED monitor set up directly next to the DP monitors, so they could watch the signal coming out of the cameras in high resolution while applying a matching color grade to each camera. This allowed the team to see in full detail the live blend of the real foreground with the virtual background.

Greig Fraser, a fan of ARRI cameras and Panavision lenses, initially chose the ARRI ALEXA LF (large format) camera system for production. The choice continued into season 2, with cinematographers Baz Idoine and Matthew Jensen, ASC, adding the new 4.5K ALEXA Mini LF from ARRI into the mix.
For Eduardo and the digital imaging process he deployed on this show, “The visual ‘volume’ is what has made the biggest difference to anything I’ve worked on in the past. One day, you could be working in space, or be in the desert, the woods, a tunnel. All these set-ups were on the same stage and sometimes on the same day.” Despite the visual advances, it still took a lot of talented people to make it look real, from the incredible art department visuals, to the skilled animators who brought the characters to life, to the special effects teams and the Brain Bar team. “We learned a lot on season 1, but Greig and Baz really understood the technology and embraced it from the very beginning, and they pushed the technology to its limits to achieve some amazing results. On season 2, ‘The Volume’ was bigger, and the results were even more impressive. I can’t wait to see where the future takes us!”

Broadcast Beat - Production Industry Resource