
Live Broadcast Animation with Adobe

Pivotal moments in film and television production mark a shift in the industry or open innovative options for creators. When it comes to animation, there is no shortage of tools and software to choose from. Tablets and animation software have carried traditional hand-drawn techniques into the digital landscape, pushing the envelope of what can be accomplished.

Adobe Animate is one of those tools, providing a digital cel animation solution that integrates with the existing Adobe ecosystem. Adobe also offers another tool that takes animation in a different direction, with functionality not seen in the company's other offerings.

Enter Adobe Character Animator, a software platform that establishes unprecedented character control, streamlines the workflow, and allows creators to animate their characters in real time, even during live television broadcasts.

What is Adobe Character Animator?

Adobe Character Animator is one of two animation tools offered by Adobe. Animate falls in line with traditional hand-drawn methods, building characters in a frame-by-frame cel animation process. Character Animator, on the other hand, works with digital “puppets” pre-built in either Photoshop or Illustrator.

These puppets are imported into Character Animator, where they are rigged with a virtual bone and joint system. This rigging gives puppets articulated features that move smoothly. Character Animator provides a palette of controls to define joint rules, anchor points, gravity, puppet flexibility, and interaction with other objects.

Once a puppet is created and rigged, it is ready for production, awaiting direction like a digital actor. Character movements can be pre-programmed and assigned to triggered controls, or driven by motion capture.

Why is this a game-changer? Instead of repeatedly drawing a character frame by frame, scenes can be “acted” out. This generates a leaner, faster, and more dynamic workflow than traditional methods.

However, there is a tradeoff. This type of animation works exceptionally well for dialogue-based works and staging primarily composed of lateral screen motion. Dynamic action movements—such as fighting or other vigorous actions—and characters moving toward and away from the screen are still possible but require some creative workarounds.

Full scenes can be composed within Character Animator itself, although bringing the output into a more robust compositing application such as Adobe After Effects affords more efficient composition.

Puppet Rigging and Controls

The heart of the animation system in Character Animator resides in the puppet rigging system. While Photoshop and After Effects have their own puppet rigging mechanisms, they don’t come near the expansive options and controls featured in Character Animator.

Building the puppets is a crucial step in the process, as they must be constructed to a specific hierarchy. Body parts are made as separate layer elements and arranged in a particular order, and each layer is named according to a prescribed format, as sketched below. This naming convention allows Character Animator to recognize the layers and apply the appropriate behaviors; the software knows to handle eyes differently than legs.
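As a rough, simplified illustration (based on Adobe's documented conventions, with the layer set trimmed down; a real puppet uses the full roster of mouth layers), a Photoshop source file might be organized like this, with a "+" prefix marking layers that Character Animator treats as independently movable:

    +Character
      +Head
        Right Eyebrow
        Left Eyebrow
        +Right Eye
        +Left Eye
        Mouth
          Neutral
          Aa
          Oh
          M
      Body
        +Right Arm
        +Left Arm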

Also, building puppets in graphics editing software lets you get creative with your character's appearance. You can opt for a traditional cel-shaded or vector-graphic aesthetic, or you can get crazy and fashion layers that give your creations photorealistic attributes. Some clever animators have photographed clay-modeled puppets or pre-rendered CGI models and used them to generate characters with stop-motion or CGI attributes while still utilizing the streamlined animation model.

Once the puppets are in Character Animator, you can apply the bone rigging system. This digital skeleton does more than define flex points in your character; it has an entire inverse kinematics system in place. To have your character raise their hand, you can adjust the shoulder, then the elbow, and then the hand to set your positions, or you can simply grab the hand and drag it with the mouse, and the rest of the limb will follow naturally, provided you rigged the puppet correctly.
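Character Animator's solver is internal to the application, but the underlying idea is easy to see in miniature. Below is a minimal, hypothetical two-bone inverse kinematics sketch in Python (not Adobe code): given a target point for the hand, it solves the shoulder and elbow angles with the law of cosines, which is why dragging the hand pulls the whole limb along.

    import math

    # Hypothetical two-bone IK solver: the shoulder sits at the origin,
    # and we solve the shoulder and elbow angles (in radians) needed for
    # the hand to reach a dragged target point.
    def two_bone_ik(target_x, target_y, upper_len, fore_len):
        # Clamp the target distance to the arm's reach so acos() stays valid.
        dist = math.hypot(target_x, target_y)
        dist = max(1e-6, abs(upper_len - fore_len), min(dist, upper_len + fore_len))

        # Law of cosines gives the elbow bend...
        cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
        elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))

        # ...and the shoulder aims at the target, offset by the triangle
        # the two bones form with the target direction.
        cos_offset = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
        shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_offset)))
        return shoulder, elbow

    # Drag the "hand" to (1.2, 0.8); the rest of the limb follows.
    print(two_bone_ik(1.2, 0.8, upper_len=1.0, fore_len=1.0))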

You can also set triggers: predetermined positions and motions mapped to hotkeys. Do you want your character to wave, put their hands on their hips, or stand in a Karate Kid crane kick stance? Assign the end positions as triggers on hotkeys; then, when you click “record,” hit those keys, and your character will toggle between the positions, with Character Animator filling in the motion for you.
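Conceptually (this is an illustrative Python sketch, not Adobe's API, and the pose names are invented), a trigger setup is just a mapping from hotkeys to named end poses, with the software interpolating between the current pose and the triggered one:

    # Hotkeys map to named end poses.
    TRIGGERS = {"w": "wave", "h": "hands_on_hips", "c": "crane_kick"}

    def apply_trigger(key, current_pose):
        target = TRIGGERS.get(key, current_pose)
        # Character Animator fills in the in-between frames; here we just
        # note the transition a real tweening step would interpolate.
        print(f"blend: {current_pose} -> {target}")
        return target

    pose = "rest"
    for key in ["w", "h", "c"]:  # hotkeys hit while recording
        pose = apply_trigger(key, pose)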

Motion Tracking

This feature is the real show-stopper. In addition to rigging your puppet ahead of time and determining specific motions, you can turn on the motion tracking mechanism, which maps your own movement to the puppet in real time. The best part is that it doesn't require any fancy cameras, cables, or mocap suits. Using just the webcam on your computer, Character Animator can extract the tracking data and apply it directly to the puppet.

Walk back and forth, dance, wave your arms, and watch your puppet perform with you. The facial motion capture is exceptionally intricate and dynamic and can read even the most subtle movements, including eyebrow motion, eyeball directionality, facial expressions, and lip-syncing. If you've included layers for different mouth shapes (the visemes that correspond to spoken phonemes), then go ahead and talk while you're recording; your digital counterpart's mouth will follow suit.
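The details of the tracker are Adobe's, but the mapping concept can be sketched in Python like this (the landmark names and thresholds below are made up for illustration): per frame, measurements from a webcam face tracker become puppet parameters.

    # Conceptual sketch only: brow offsets drive the eyebrow layers, pupil
    # offsets steer gaze, and mouth openness selects a mouth layer.
    def face_to_puppet(lm):
        return {
            "eyebrow_raise": lm["brow_height"] - lm["brow_rest"],
            "gaze_x": lm["pupil_x"] - lm["eye_center_x"],
            "mouth_layer": "Aa" if lm["mouth_open"] > 0.5 else "Neutral",
        }

    frame = {"brow_height": 0.62, "brow_rest": 0.55,
             "pupil_x": 0.48, "eye_center_x": 0.50, "mouth_open": 0.7}
    print(face_to_puppet(frame))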

A recent update to the software adds rotational tracking. If you construct your puppet with multiple angles, Character Animator can automatically switch between those views as you rotate your head and body on camera, matching the correct perspective.
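The switching logic amounts to choosing the pre-drawn view closest to the tracked head angle. A toy Python sketch of that idea (assumed yaw input in degrees, not Adobe's implementation):

    # Pick the pre-drawn angle nearest the tracked head rotation.
    VIEWS = {-90: "left_profile", -45: "left_quarter", 0: "front",
             45: "right_quarter", 90: "right_profile"}

    def pick_view(yaw_degrees):
        return VIEWS[min(VIEWS, key=lambda angle: abs(angle - yaw_degrees))]

    print(pick_view(-30))  # -> left_quarter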

Live Broadcast Animation

In another feature unique to Character Animator, you can have your digital puppet perform live during a televised broadcast. Build your puppet, rig it with motions, and export a live stream from the application into a switcher (the puppet output carries a transparent alpha channel). You can now overlay an animated character on live footage and have them interact in real time.
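The transparent alpha channel is what lets the switcher composite the puppet over live footage with a standard "over" operation. A minimal Python/NumPy sketch of that per-pixel math (the switcher itself performs this in its own hardware or software):

    import numpy as np

    # "Over" composite: the puppet's alpha weights its color against the
    # live footage behind it, pixel by pixel.
    def alpha_over(puppet_rgba, live_rgb):
        rgb, alpha = puppet_rgba[..., :3], puppet_rgba[..., 3:4]
        return alpha * rgb + (1.0 - alpha) * live_rgb

    # A 1x1 example: a half-opaque red puppet pixel over gray footage.
    puppet = np.array([[[1.0, 0.0, 0.0, 0.5]]])
    live = np.array([[[0.5, 0.5, 0.5]]])
    print(alpha_over(puppet, live))  # [[[0.75 0.25 0.25]]]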

This feature was demonstrated in 2016 on an episode of The Late Show in which host Stephen Colbert interviewed a cartoon version of Donald Trump. Using preset triggers and pre-recorded motions, animators could actively respond using hotkeys, creating the illusion that Colbert was interacting with the puppet live.

This laid the groundwork for the animated series Our Cartoon President, which was also created using Adobe Character Animator. The show's creators could utilize the application's incredibly efficient animation tools, shaving months off the production of each episode. The workflow even made it possible to animate responses to relevant, time-sensitive topics.

Adobe continues to push out major updates and new features to the application, establishing Character Animator as a significant pillar in the architecture of modern animation.
