The Basics of High Dynamic Range Video

High dynamic range (HDR) video expands the range between the darkest and lightest portions of a scene. HDR video allows brighter highlights, darker and more detailed shadows, and a wider array of colors.

HDR works by coordinating the display and the video content to control how luminance and color are represented. It does not improve a display's physical brightness, contrast or color capabilities, but it does allow compatible displays to receive a higher-quality image source.

With HDR, brightness can be increased in small areas of the image without increasing the overall brightness. Examples are sparkling chrome or bright stars in a night sky. The shadows in an HDR image are darker and have more detail, while the color is enhanced and vivid.

HDR for video, also called HDR-TV, was first used in 2014. It is part of an end-to-end process of increasing the dynamic range of images and videos from their capture and creation, through their storage and distribution, to their display. If videos are HDR friendly, compatible displays will be able to portray more of the image as its maker intended. Those watching on standard dynamic range (SDR) displays will lose nothing in the process.

HDR video offers more than brighter whites and deeper blacks. It also has the ability to deliver a peak brightness of up to 10,000 nits, a wider color gamut that meets the BT.2100 standard and smoother display of gradients between colors and shades in 10-bit video.
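The 10,000-nit ceiling comes from the PQ (SMPTE ST 2084) transfer function standardized in BT.2100. As a rough illustration, here is a minimal Python sketch of the PQ EOTF, which maps a 0-1 code value to display luminance in nits; the constants are the ones published in the standard:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as standardized in BT.2100.
M1 = 1305 / 8192      # 0.1593017578125
M2 = 2523 / 32        # 78.84375
C1 = 107 / 128        # 0.8359375
C2 = 2413 / 128       # 18.8515625
C3 = 2392 / 128       # 18.6875

def pq_eotf(signal):
    """Map a non-linear PQ code value in [0, 1] to display
    luminance in nits (cd/m^2), peaking at 10,000 nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))   # full code value maps to 10,000 nits
print(pq_eotf(0.0))   # zero code value maps to 0 nits
```

Note how steeply the curve rises: most code values are spent on the darker end, matching how human vision perceives brightness.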

How does a videographer use HDR in video production?

Professional digital video cameras, such as those manufactured by RED and ARRI, have sensors with up to 17 stops of dynamic range, while most displays can only reproduce between eight and ten stops. As a result, footage captured with a recent video camera far exceeds what most displays can present.
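Stops are a log-base-2 measure: each additional stop doubles the usable contrast ratio between the brightest and darkest reproducible luminance. A short Python sketch of that arithmetic (the example ratios below are illustrative, not measured values for any particular camera or display):

```python
import math

def stops(l_max, l_min):
    """Dynamic range in stops is log2 of the contrast ratio
    between the brightest and darkest usable luminance."""
    return math.log2(l_max / l_min)

# A sensor with a 131072:1 usable contrast ratio spans 17 stops.
print(stops(131072, 1))           # 17.0
# A display covering roughly 0.4 to 100 nits spans about 8 stops.
print(round(stops(100, 0.4), 1))  # ~8.0
```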

Many modern video cameras, including recent iPhone models, have HDR recording modes. With HDR enabled, an iPhone records in Dolby Vision, which helps the target screen make the most of the extra visual data.

For videographers, HDR should not be used all the time. It is important to understand when it works best and when to avoid using it.

HDR works best on well-lit scenes and backgrounds, where it can bring colors and shadow detail to life. When a background is poorly lit, it is often better to apply HDR adjustments in post-production, where the dark areas can be lifted selectively.

Don’t use HDR to eliminate every shadow in an image. Removing all the shadows leaves an image with lower contrast than the original. Video needs the full spectrum of light and shadows to create depth and dimension. Authenticity matters.

It is also poor practice to flatten the image by reducing the contrast between the original bright and dark areas. A flat HDR image shows little contrast across the scene and looks unnatural, artificial and lifeless.

A classic situation for HDR shooting is a sunset, when the range between the brightest and darkest areas far exceeds the camera's capability. One way to capture the scene is exposure bracketing: take three identical shots at different exposures, one underexposed to hold the highlights, one at normal exposure, and one overexposed to retain shadow detail. Then, in post, align and combine the three frames to create an HDR image.
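The merge step can be sketched in Python. This is a deliberately naive exposure-fusion toy, not a production tone mapper: it weights each pixel of each bracketed frame by how close it sits to mid-gray, then blends. The synthetic scene and the weighting sigma are assumptions for the demo:

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Naive exposure fusion: weight each pixel by its
    well-exposedness (closeness to mid-gray 0.5), then blend
    the bracketed shots. `images` holds float arrays in [0, 1]."""
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    # Gaussian weight centered on 0.5: well-exposed pixels dominate.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0) + 1e-12  # normalize across the bracket
    return (weights * stack).sum(axis=0)

# Simulate the same scene captured at three exposures.
scene = np.linspace(0.0, 4.0, 64).reshape(8, 8)   # linear radiance
under, normal, over = [np.clip(scene * g, 0, 1) for g in (0.25, 1.0, 4.0)]
fused = fuse_exposures([under, normal, over])
print(fused.shape, fused.min(), fused.max())
```

Real tools (the Mertens and Debevec merge algorithms in OpenCV, or the HDR merge in grading software) add alignment, multi-scale blending and deghosting on top of this basic idea.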

If video is shot without turning on the HDR function or when working with video where it wasn’t used, there are still ways to get the effect in post. Though it adds a layer of complexity to post-production and grading, HDR also brings a new level of freedom.

For example, when shooting someone next to a window, the editor can keep all the information in the scene outside the window and still have a good exposure on the person inside. That would be very difficult to achieve without using HDR in post.

Additional factors in an HDR post workflow involve the source camera acquisition formats and the master delivery specifications. Where is the video coming from, and in what formats? Are you delivering in Rec 2020 or DCI-P3, and in Dolby Vision, HDR10 or HLG? These must be taken into account. Color management must also be considered, especially when mixing footage from different cameras.

Having to derive multiple HDR formats from a master can influence the post workflow. This partly depends on who the primary deliverable is for and the standards of HDR requested by the client. There can be different color gamuts between the different deliverables.

Before color grading, software capable of HDR editing and output is needed. Not every editing program has built-in color grading tools for HDR, so check first.

DaVinci Resolve

The most popular editing application for grading HDR is Blackmagic Design's DaVinci Resolve. It supports Dolby Vision, HDR10 (using the ST 2084 PQ curve) and Hybrid Log-Gamma (HLG). Adobe Premiere Pro, Avid Media Composer and Final Cut Pro are also good options.

HDR is now common in the video industry and is growing daily as new consumer-capable displays are purchased and brought into homes. Experimenting with HDR will help the videographer learn the do's and don'ts of the technology. Whichever camera is being used, HDR can improve video quality.

Writer at Broadcast Beat
Frank Beacham is a New York-based writer, director and producer who works in print, radio, television, film and theatre.

Beacham has served as a staff reporter and editor for United Press International, the Miami Herald, Gannett Newspapers and Post-Newsweek. His articles have appeared in the Los Angeles Times, Washington Post, the Village Voice and The Oxford American.

Beacham’s books, Whitewash: A Southern Journey through Music, Mayhem & Murder and The Whole World Was Watching: My Life Under the Media Microscope, are currently in publication. Two of his stories are currently being developed for television.

In 1985, Beacham teamed with Orson Welles over a six-month period to develop a one-man television special. Orson Welles Solo was canceled after Mr. Welles died on the day principal photography was to begin.

In 1999, Frank Beacham was executive producer of Tim Robbins’ Touchstone feature film, Cradle Will Rock. His play, Maverick, about video with Orson Welles, was staged off-Broadway in New York City in 2019.