Barry Williams, a creative special effects expert with nearly two decades of visual storytelling experience, is one of the virtual production technology gurus at the soon-to-open Prysm Stage at Trilith Studios near Atlanta.
At Trilith, the second-largest studio complex in North America (just behind Warner Brothers in LA), Williams oversees the pipeline, development, and maintenance of new and proprietary technologies. The Prysm Stage, a joint venture between NEP Group and Trilith Studios, is one of the largest virtual production studios in the world. It is among 24 production stages and an extensive 400-acre backlot at the 700-acre Trilith complex near the Atlanta airport.
Featuring an enclosed 80 x 90 x 29-foot virtual production set in an 18,000-square-foot sound stage, the studio’s footprint is built to accommodate large set pieces. It is wrapped 360 degrees with motorized ROE Visual LED panels — including an overhead LED ceiling. The studio is equipped to deploy game-engine-driven video playback designed to immerse filmmakers in large-scale, real-time digital environments.
Broadcast Beat’s Frank Beacham interviewed Williams about virtual production and the new studio:
Frank Beacham: Barry, a lot of people are confused about exactly what virtual production means. Could you explain it in layman’s language?
Barry Williams: There are typically three stages of production. There’s pre-production, which is your script, getting the actors together, art direction, and so forth. Then you have production — the actual shoot. And, finally, you have post-production.
There are a couple of parts of the production phase that are virtual. One is motion capture. This is when you see actors with little graphic balls on them for tracking their movements (like in games or animation). That’s one form of virtual production. But the hottest version right now is having the entire set displayed in real time on LED walls behind the performers.
Frank Beacham: What are in-camera visual effects, or ICVFX?
Barry Williams: This involves the LED walls. We’ve all seen a large green screen with actors performing in front of it. After the production, a visual effects house replaces the green screen with a digital extension or an animated character.
In-camera visual effects refers to when the set background is displayed in real time on the LED wall. That displayed set is actually being shot live with the actors. From the production, it goes directly to editorial — bypassing any post-production process.
For example, let’s say you have a rooftop scene. You build a small set of a rooftop and the rest of the extension into the greater city is on the LED wall that’s surrounding the actors. All that goes into the camera — just as the director sees it. This is a huge win for everyone because the final scene can be viewed as it is being shot.
This is why LED walls are so useful. They allow directors to accurately see the finished scene while they’re shooting. No one on the production team has to wait months for the visual effects to be completed.
Frank Beacham: There is software between the LED screens and camera. What does it do?
Barry Williams: There are two different ways that the LED wall can be used. One is for playback — say when you see a car driving or a plane flying and the background is projected from behind. Those background scenes are not interacting with the camera. In that case, they are playing video backgrounds onto the wall.
The other way a wall is used is to have the background imagery responding to the moves of the camera. Lux Machina, the company that operates our virtual production studio, uses Unreal Engine, which was originally built for making video games. Now it has moved into movies.
Imagine having a video game and being able to move around anywhere within the game’s world. You can now move any element around on the LED screens and have the camera track them accurately. When the camera moves, the perspective changes on the background. It’s not like process work. It’s fully interactive, and you can play and manipulate it as you’re shooting.
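The camera-tracked mode Williams describes relies on parallax: as the physical camera moves, a virtual object "behind" the wall must be redrawn at a different spot on the LED panels, while the wall itself stays put. A toy calculation can illustrate the geometry (a minimal sketch with made-up numbers, not Lux Machina's or Unreal Engine's actual rendering pipeline):

```python
# Illustrative sketch only: where on a flat LED wall should a distant
# virtual object appear, as seen from a tracked camera position?

def wall_intersection(camera, virtual_point, wall_x):
    """Trace a ray from the camera through a virtual 3D point and
    return the (y, z) position where it crosses a wall at x = wall_x."""
    cx, cy, cz = camera
    vx, vy, vz = virtual_point
    t = (wall_x - cx) / (vx - cx)  # fraction of the way to the wall
    return (cy + t * (vy - cy), cz + t * (vz - cz))

# A virtual skyscraper 100 m away; the LED wall stands 10 m from origin.
building = (100.0, 20.0, 30.0)

# Dolly the camera 2 m sideways: the building's on-wall position shifts,
# which is exactly the perspective change the tracking system drives.
before = wall_intersection((0.0, 0.0, 1.5), building, 10.0)
after = wall_intersection((0.0, 2.0, 1.5), building, 10.0)
print(before, after)
```

Nearby virtual objects shift more on the wall than distant ones for the same camera move, which is why the background reacts believably on camera instead of looking like a static backdrop.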
Frank Beacham: Is the Prysm virtual stage complete and ready to operate?
Barry Williams: We’re finishing it up now. It’s all brand new and we’ll be allowing clients to actually book the stage this summer.
Frank Beacham: Trilith Studios says you have the most experienced supervisors, engineers and technology specialists in 3D, 2D and 2.5D visual effects in the business. I’m curious — where do you get people that know about this technology? How are they trained?
Barry Williams: They come from all types of backgrounds. Some did the background screens for really large music concerts, which has been going on for decades. Some come out of visual effects, others from real time content — like gaming.
Frank Beacham: Sounds like it’s more of an apprenticeship kind of background rather than formal training…
Barry Williams: Yes, it is. People tend to have an adjacent skill set. You can’t find people now with formal training from the education system. It’s not yet taught at schools, but that’s changing. Some schools are building small LED stages to start training people. So that will change for sure. But for now, information is coming from people who’ve been doing this for a while. They are training the next generation, who come in with adjacent skill sets that are somewhat applicable.
Frank Beacham: Final question…What other new technologies are you using at Trilith Studios?
Barry Williams: In addition to the Prysm stage, we are working with augmented and virtual reality. Virtual scouting, where a director can scout a location without being there, is catching on. It’s all really at the tip of the spear. These new processes are being refined daily.
Everyone here is on an adventure together with the birth of very new technology. Anyone who claims they’ve got all this figured out is not telling the truth. This is brand new territory and we are all learning the best ways to use it.
Frank Beacham: Thank you, Barry, and good luck.