Ncam: Simple “One Size Fits All” Tracking for Live Broadcasters

Virtual production has come to live television broadcasting. From LED wall set backgrounds to automated tracking systems, broadcasters can now do what was once limited to feature films and video games.

A major player in hybrid camera tracking is Ncam, based in Los Angeles and London. Ncam's Reality system can track using any camera and lens combination, with or without reflective markers, in any environment, from studios to outdoor locations. It has been used on location with broadcast cameras, Steadicams, dollies and cable-cams, and can work indoors with LED sets and green screens.

In its basic form, Ncam comes in three parts: a camera bar (with sensors); an encoder to measure lens data; and a server, which processes all the camera information. The Ncam server can interface with render or graphics engines from a range of companies and platforms, including Unreal Engine, Brainstorm, Vizrt, Pixotope, Assimilate, Disguise, Unity, Zero Density, Ross and others. The Ncam system can be controlled with a smartphone, tablet or personal computer via Wi-Fi or Ethernet.

Broadcast Beat’s Frank Beacham spoke with Nick Hayes, Ncam’s director of sales for North America.

Frank Beacham: Nick, for a lot of broadcasters, virtual production is very new and needs a simple explanation. Could we begin with what your company does?

Nick Hayes: Ncam makes hardware and software to essentially track a camera in a three-dimensional space. The tracking and lens data is fed into a server for interpretation by a render or graphics engine. This is done with any type of set, whether you are displaying a background plate on an LED wall or green screen, or inserting graphics into a live camera feed.

If there is any form of camera movement, it's important to understand where the camera is located in space. This way you get the appropriate scale, size and parallax when the camera moves, pans, tilts or zooms. The graphic changes to match the camera movement.
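(Editor's note: To make that geometry concrete, here is a minimal sketch of how a render engine might use a tracked camera position and focal length to place a fixed 3-D point in the image. It is a textbook pinhole projection, not Ncam's actual data format or pipeline, and all names and values are illustrative.)

```python
import numpy as np

def project_point(point_world, cam_pos, cam_rot, focal_px, principal_pt):
    """Project a world-space 3-D point into pixel coordinates for a
    tracked camera. cam_rot is a 3x3 world-to-camera rotation matrix."""
    # Transform the point into the camera's coordinate frame.
    p_cam = cam_rot @ (point_world - cam_pos)
    if p_cam[2] <= 0:
        return None  # point is behind the camera
    # Pinhole projection: scale by focal length, offset by principal point.
    u = focal_px * p_cam[0] / p_cam[2] + principal_pt[0]
    v = focal_px * p_cam[1] / p_cam[2] + principal_pt[1]
    return (u, v)

# A graphic anchored 5 m in front of the studio origin.
anchor = np.array([0.0, 0.0, 5.0])

# As the camera dollies sideways, the projected position shifts --
# that shift is the parallax the tracking data makes possible.
for x in (0.0, 0.5, 1.0):
    cam_pos = np.array([x, 0.0, 0.0])
    uv = project_point(anchor, cam_pos, np.eye(3), 1200.0, (960.0, 540.0))
    print(f"camera at x={x:.1f} m -> graphic at pixel {uv}")
```

Without the tracked camera position feeding this calculation, the graphic would stay glued to the same pixel and the illusion of depth would break as soon as the camera moved.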

Frank Beacham: Let’s take a weatherman in front of an LED wall showing footage of a hurricane. How would the Ncam system work in this situation?

Nick Hayes: In this case, we would track the position and orientation of the camera, along with the lens data, to enable the use of photorealistic 3-D graphics or backgrounds. A weatherman could stand next to, behind or even “walk around” the augmented reality graphic, such as a hurricane, to provide an immersive experience for the viewer.

Frank Beacham: How is this done?

Nick Hayes: In our software, the user would set what we call a ‘zero point.’ That point is essentially any place in the three-dimensional space in the field of view of the camera that you want the graphic to be. Your weatherman could point to the hurricane and move around it and interact with it. Our technology is tracking the movement of the camera.

When a graphic is shown, you need to know the exact location of the camera. This data is sent to the render engine, which is driving the graphic. Our software is talking to the render engine.
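(Editor's note: One way to picture the zero point is as a user-chosen origin for the tracked space, with graphics placed relative to it and the camera pose reported against it every frame. The sketch below is purely illustrative; the field names and packet layout are hypothetical, not Ncam's actual wire format.)

```python
import numpy as np

# The operator picks a spot in the tracked space -- the "zero point" --
# and all graphics are positioned relative to it.
zero_point = np.array([2.0, 0.0, 4.0])   # a chosen spot on the studio floor

# The hurricane graphic sits 1.5 m above the zero point.
graphic_offset = np.array([0.0, 1.5, 0.0])
graphic_world = zero_point + graphic_offset

def tracking_packet(frame, cam_pos, cam_pan_tilt_roll, focal_mm):
    """Assemble one frame of tracking data for the render engine.
    This layout is an assumption for illustration only."""
    return {
        "frame": frame,
        "camera_position_m": cam_pos.tolist(),
        "camera_rotation_deg": cam_pan_tilt_roll,  # pan, tilt, roll
        "lens_focal_mm": focal_mm,
        "graphic_anchor_m": graphic_world.tolist(),
    }

packet = tracking_packet(1001, np.array([0.0, 1.7, 0.0]), (12.0, -3.0, 0.0), 35.0)
print(packet)
```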

What’s unique about our system is the hybrid tracking. In a traditional tracking environment, our competitors require the use of reflective markers. If you look up at the ceiling, you’ll see a group of little circles, which are essentially reflective stickers. The tracking technology uses an infrared sensor that shines a light, invisible to the naked eye, at the ceiling.

Hybrid tracking means we have the ability to track off anything that is static. It can be a lighting grid or a plant — anything that’s not moving. Our camera sensor will see different contrast points and determine depth. If you already have markers, we can use those as well. We are completely flexible.
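(Editor's note: Ncam's algorithm is proprietary and not described in this interview, but the general idea of markerless tracking from static contrast points can be sketched with off-the-shelf computer vision tools. The function below, an assumption-laden illustration rather than Ncam's method, matches feature points between two frames and recovers the camera's relative motion.)

```python
import cv2
import numpy as np

def estimate_camera_motion(frame_a, frame_b, K):
    """Illustrative markerless tracking step (not Ncam's algorithm):
    find high-contrast feature points visible in both grayscale frames
    and recover the camera's relative rotation and translation.

    K is the 3x3 camera intrinsic matrix (from lens calibration).
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, desc_a = orb.detectAndCompute(frame_a, None)
    kp_b, desc_b = orb.detectAndCompute(frame_b, None)

    # Match features between frames; static scene points (a lighting
    # grid, a plant) act as natural markers.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)

    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # RANSAC rejects points that moved (people, props) as outliers,
    # which is why only static features contribute to the solution.
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
    return R, t  # relative rotation and (unit-scale) translation
```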

Frank Beacham: The devices on the camera produce the data and feed it to your server, which in turn interfaces with the render or graphics engine. Correct?

Nick Hayes: Exactly. There are a variety of companies that make render and graphics engines, with Epic Games’ Unreal Engine being the most popular these days. But there are a lot of other companies out there building platforms on top of the Unreal Engine. Disguise and Brainstorm, for example, are two crucial technology partners of ours. But we are completely agnostic and have strong relationships with many of the major players involved in virtual production.

Frank Beacham: How is your technology now being used by broadcasters?

Nick Hayes: Let’s start in the traditional broadcast studio. Say you have a table with two anchors sitting in front of a green screen or an LED wall with a virtual background. With no camera tracking, when the camera is panned or zoomed, there would be no change to the background. The realism goes away because the background does not change.

With tracking, broadcasters have the ability to create dynamic, evolving backgrounds. The background is not frozen and is much more realistic to viewers. We’re now seeing more broadcasters move anchors from behind a desk and put them in an open set.

They can walk around within the background visual. Some are using augmented reality, where there are three-dimensional objects in space. As for the weatherman we spoke of earlier, he can be immersed in the hurricane rather than just standing in front of a static image.

Another example was at the Super Bowl, where an Ncam system was mounted on a Steadicam roaming around the field. The Steadicam got a shot up in the rafters of the stands with the sky in the background. It was a wide angle shot with a large statistical graphic placed over it.

Augmented reality can be done in live broadcasts. Depending on the type of broadcast, a render engine may or may not be used. A post-production facility could just use the tracking data and time code to sync footage in the edit. There are many creative ways to use tracking technology.
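(Editor's note: The post-production workflow Hayes mentions, marrying recorded tracking data to footage by timecode, can be sketched in a few lines. The log structure and field names below are hypothetical, and the conversion ignores drop-frame timecode for simplicity.)

```python
# A minimal sketch of syncing tracking data to footage by timecode.

def tc_to_frame(tc: str, fps: int = 25) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count
    (non-drop-frame only, for simplicity)."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Tracking samples recorded on set, one per frame (illustrative values).
tracking_log = {
    tc_to_frame("01:00:00:00"): {"pan": 0.0, "tilt": 0.0, "zoom_mm": 35.0},
    tc_to_frame("01:00:00:01"): {"pan": 0.2, "tilt": 0.0, "zoom_mm": 35.0},
}

# In the edit, look up the camera state for any frame of the footage,
# so graphics rendered later line up with the original camera move.
frame = tc_to_frame("01:00:00:01")
print(tracking_log[frame])   # -> {'pan': 0.2, 'tilt': 0.0, 'zoom_mm': 35.0}
```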

Frank Beacham: What broadcasters are using Ncam technology today?

Nick Hayes: We work with some of the biggest broadcast companies on a global basis. These include NBC, CNN, BBC, Sky Sports…you name it.

Frank Beacham: Are you finding it difficult to educate broadcasters about this relatively new technology?

Nick Hayes: While many broadcasters have been doing graphics and green screens for a long time, introducing camera tracking and augmented reality can be hard for some to wrap their heads around. Many large broadcasters are already using it. But there’s a learning curve to all new technology, and some take a more measured approach to embracing it.

Frank Beacham: Thank you, Nick.

(Editor’s Note: The Broadcast Beat television production studio in Ft. Lauderdale, Florida, will soon be offering Ncam technology to its clients.)

Writer at Broadcast Beat
Frank Beacham is a New York-based writer, director and producer who works in print, radio, television, film and theatre.

Beacham has served as a staff reporter and editor for United Press International, the Miami Herald, Gannett Newspapers and Post-Newsweek. His articles have appeared in the Los Angeles Times, Washington Post, the Village Voice and The Oxford American.

Beacham’s books, Whitewash: A Southern Journey through Music, Mayhem & Murder and The Whole World Was Watching: My Life Under the Media Microscope, are currently in publication. Two of his stories are being developed for television.

In 1985, Beacham teamed with Orson Welles over a six-month period to develop a one-man television special. Orson Welles Solo was canceled after Mr. Welles died on the day principal photography was to begin.

In 1999, Frank Beacham was executive producer of Tim Robbins’ Touchstone feature film, Cradle Will Rock. His play, Maverick, about his video work with Orson Welles, was staged off-Broadway in New York City in 2019.