
The Workflow of Sound Post-Production for Video

The creation of soundtracks for video requires a unique workflow that differs from the video side of the equation. From the acquisition of original sound in the field, to the creation of sound effects and foley, to the construction of music beds, to final editing and mixing, the sound drives the video in any storytelling effort.

Orson Welles, with his extensive background in radio, brought many new sound techniques to motion picture sound. For example, in his classic feature, Citizen Kane, Welles used distorted audio to add meaning to a scene. When a reporter enters a library to look up information on Kane, the sound becomes a distorted echo. It implies the reporter is not just in any library — but in a tomb. The sound effect represents a man who was cold and empty in life as well as death. Welles used the sound to help propel his story.

Good sound, of course, goes far beyond just recording original audio in the field. It begins, depending on the production, with creative sound design, which determines the essential elements of the soundtrack before they are composed into an organic whole.

The next step is editing, where sound effects and foley are recorded and added, and automatic dialogue replacement (ADR) is performed as needed. The final stage is mixing: the balancing of the tracks and the processing and sweetening of the audio. It takes all those raw elements, plus the music, to turn the soundtrack into a coherent, seamless whole that propels the story.

This workflow can be confusing to beginners, since the same tools are often used in all three phases. But the results at each stage are decidedly different. At a glance, editing and mixing appear quite similar, though they produce unique results.

The big difference between digital and analog editing is the way sound is manipulated. Analog was a very manual process, using razor blades to cut and then splice physical audio tape. Today, in the digital domain, the whole process is done with a digital audio workstation (DAW) on a computer, with mostly virtual plug-ins for processing. The early stages of audio editing involve cutting and deleting sound elements and assembling them into clean sets of tracks.
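As a rough illustration of that cut-and-assemble stage, here is a minimal sketch in Python using the pydub library. The file names and edit points are invented for the example; the point is simply that a DAW-style cut is a slice, not a razor blade.

from pydub import AudioSegment

# Load two hypothetical field recordings (file names are placeholders).
take_1 = AudioSegment.from_wav("interview_take1.wav")
take_2 = AudioSegment.from_wav("interview_take2.wav")

# "Cut" the usable portions. pydub slices by milliseconds, and slicing
# never alters the source files, so the edit is non-destructive.
usable_1 = take_1[2500:14000]   # keep 2.5 s to 14 s
usable_2 = take_2[800:9300]     # keep 0.8 s to 9.3 s

# Assemble the keepers, with a half second of silence between them,
# into a single clean dialogue track.
room_gap = AudioSegment.silent(duration=500)
dialogue_track = usable_1 + room_gap + usable_2

dialogue_track.export("dialogue_edit_v1.wav", format="wav")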

Another big difference from analog is that digital editing is a visual process. Editing now is mostly accomplished by viewing waveforms on a computer display. This can have a downside, since relying solely on waveforms can result in awkward cuts that are sometimes too tight. Since most audio editing software is non-destructive, edits can be redone until transitions are given enough breathing room. That wasn’t the case with analog.
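To give a sense of how that breathing room is restored, the short sketch below, again assuming pydub and made-up clip names, joins two clips with a 250 millisecond crossfade instead of a hard butt cut.

from pydub import AudioSegment

line_a = AudioSegment.from_wav("actor_line_a.wav")
line_b = AudioSegment.from_wav("actor_line_b.wav")

# A hard cut would simply be line_a + line_b. Overlapping the clips by
# 250 ms lets the transition breathe.
joined = line_a.append(line_b, crossfade=250)

# Because nothing is destroyed, the fade length can be changed and the
# join re-rendered until the timing feels right.
joined.export("dialogue_join.wav", format="wav")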

As it has always been, good sound editing for video requires an aesthetic sense of rhythm and timing. Cuts and fades must occur at a precise moment in sync with the video picture. When I was doing an offline edit of the final two Honeymooners episodes for Jackie Gleason in the 1970s, ¾-inch tape editing had an accuracy of only plus or minus 30 frames.

Gleason, a stickler for precise timing in his comedy, demanded offline edits accurate to within two frames. This meant I had to cut and recut every edit until the tape machines randomly aligned correctly. It was torturous, numbing work!

“Timing is everything,” Gleason told me, in comedy, music and drama. From that experience, I learned just how right he was.

Of course, another element of editing today is digital processing. This includes removing background noise and applying EQ, compression, limiting, noise gates, reverb, delay, chorus, flanging and phasing, alone or in combination, to manipulate the sound. It can all be done with software plug-ins.
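As a rough sketch of such a processing chain, the example below runs an edited dialogue track through a high-pass filter, gentle compression and normalization using pydub's built-in effects. The settings are illustrative assumptions, not recommended values.

from pydub import AudioSegment
from pydub.effects import compress_dynamic_range, normalize

dialogue = AudioSegment.from_wav("dialogue_edit_v1.wav")

# High-pass filter at 80 Hz to reduce low-frequency rumble.
cleaned = dialogue.high_pass_filter(80)

# Gentle compression to even out the dialogue levels.
compressed = compress_dynamic_range(
    cleaned, threshold=-20.0, ratio=3.0, attack=5.0, release=60.0
)

# Normalize so the loudest peak sits about 1 dB below full scale.
processed = normalize(compressed, headroom=1.0)

processed.export("dialogue_processed.wav", format="wav")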

There are now endless ways to process sound. The editing workflow today is like living in a playpen with infinite possibilities. But, as with everything involving technical choices, less is always more. Good editors always use restraint with effects, knowing it is very easy to go too far.

Experiment when needed, but in the end, using good judgement is of paramount importance for good sound.

Foley and sound effects can come from the strangest places. Sound effects operators in the heyday of radio drama created some amazing effects with a simple trip to a hardware store.

Orson Welles placed a live microphone in the men’s restroom at a studio to mimic the sound of the sewers of Paris for a live radio broadcast of Les Misérables. John Houseman, his producer, recalled Welles experimenting with a variety of items in rehearsal in an attempt to duplicate the sound of a human head being severed in a guillotine. He settled on the sound of a cabbage being sliced with a meat cleaver!

Mixing is the final phase of the workflow, where the previously edited tracks are made into a whole. The sound, whether stereo or immersive, is then assembled and conformed to a video proxy file. This process consolidates the original work of the sound designer and the mixing engineer to bring their sonic vision to life.
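A minimal sketch of that conform step, assuming the final mix already starts at the same point and runs the same length as the picture: the code below calls ffmpeg from Python to copy the proxy's video stream untouched and replace its audio with the mix. The file names are placeholders.

import subprocess

proxy_video = "picture_lock_proxy.mov"   # hypothetical low-resolution picture proxy
final_mix = "final_mix.wav"              # hypothetical bounced mix
output = "conformed_master.mov"

# Copy the video as-is, take the audio only from the mix, and stop at the
# shorter of the two streams so picture and sound stay in step.
subprocess.run([
    "ffmpeg",
    "-i", proxy_video,
    "-i", final_mix,
    "-map", "0:v:0",   # video from the proxy
    "-map", "1:a:0",   # audio from the mix
    "-c:v", "copy",
    "-c:a", "aac",
    "-shortest",
    output,
], check=True)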

Music — whether live instruments or synthesized sound — is critically important in the mix and the whole track must be synchronized precisely with the video cut.
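As a small illustration of that sync requirement, the sketch below, again in pydub with made-up names and timings, lays a music cue under a dialogue track so that it begins exactly on a picture cut 12.5 seconds in, with a short fade up and a longer fade out.

from pydub import AudioSegment

dialogue = AudioSegment.from_wav("dialogue_processed.wav")
music_cue = AudioSegment.from_wav("music_cue_01.wav")

# The picture cut this cue must hit, expressed in milliseconds.
cut_point_ms = 12500

# Shape the cue so it eases in and out, and keep it 12 dB under the dialogue.
shaped_cue = music_cue.fade_in(750).fade_out(1500) - 12

# Place the cue so its first moment of sound lands on the picture cut.
with_music = dialogue.overlay(shaped_cue, position=cut_point_ms)

with_music.export("dialogue_with_music.wav", format="wav")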

Classic film soundtracks often use more than 24 tracks, sometimes slaving several multi-track machines together. Most tracks begin with chunks and sections of sound. Every element, including the individual instruments in the music score, usually has its own fader so that every sound can be properly positioned in the final mix. Tracks are typically mixed and bounced as stems, allowing better control of a massive number of tracks.
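To make the stem idea concrete, here is one more hypothetical pydub sketch that applies fader-style gain offsets to dialogue, effects and music stems, lays them against one another, and bounces the result to a single file. The stem names and gain values are assumptions for the example.

from pydub import AudioSegment

# Hypothetical stems, each already bounced down from many individual tracks.
dialogue = AudioSegment.from_wav("dialogue_stem.wav")
effects = AudioSegment.from_wav("effects_stem.wav")
music = AudioSegment.from_wav("music_stem.wav")

# "Fader" moves expressed as gain offsets in dB.
effects = effects - 6    # pull the effects back behind the dialogue
music = music - 9        # keep the music bed underneath everything

# Lay the stems against each other; overlay keeps the length of the base track.
mix = dialogue.overlay(effects).overlay(music)

# Bounce the combined stems to a single file for the final conform.
mix.export("final_mix.wav", format="wav")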

The making of compelling video sound tracks requires a creative imagination, a sense of how music works in stories and a trained ear for good audio. The master sound designer elevates his craft into an art form all its own.

Frank Beacham is a New York-based writer, director and producer who works in print, radio, television, film and theatre.

Beacham has served as a staff reporter and editor for United Press International, the Miami Herald, Gannett Newspapers and Post-Newsweek. His articles have appeared in the Los Angeles Times, Washington Post, the Village Voice and The Oxford American.

Beacham’s books, Whitewash: A Southern Journey through Music, Mayhem & Murder and The Whole World Was Watching: My Life Under the Media Microscope, are currently in publication. Two of his stories are being developed for television.

In 1985, Beacham teamed with Orson Welles over a six-month period to develop a one-man television special. Orson Welles Solo was canceled after Mr. Welles died on the day principal photography was to begin.

In 1999, Frank Beacham was executive producer of Tim Robbins’ Touchstone feature film, Cradle Will Rock. His play, Maverick, about his video work with Orson Welles, was staged off-Broadway in New York City in 2019.