Thanks to an unprecedented move by Adobe, the future of live animation is right at your fingertips. Introducing Character Animator.

One of the most groundbreaking and mind-blowing presentations to come out of Adobe Video World was the demonstration of the new Character Animator. By tracking the movements of your face, the application can drive a live animated character that mimics your facial expressions and head movements. In recent months, though, the focus of the platform has shifted in an even more revolutionary direction.

The Simpsons


Image via Universal Studios

Originally, Character Animator was intended strictly as a post-production application. Now the focus has shifted to live character animation, which Adobe demonstrated on a little show called The Simpsons. The historic series approached Adobe about a live episode that would feature Homer answering “live phone calls” and reacting to them on the air.

This should have been an insanely difficult task to pull off, but thanks to Character Animator, it was no problem. The collaboration was unprecedented and represented a giant leap forward for the field of animation.

On top of matching facial movements (think eye and mouth movements), Character Animator can also track your head as it swivels left and right. For The Simpsons, the team built a keyboard of facial-expression triggers: as the voice actor stands behind the mic, an animator can control the mouth and eye movements of Homer and the other characters featured in the segment.
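To make the idea of an expression keyboard concrete, here is a minimal, purely illustrative sketch of how key presses could swap the pre-drawn mouth and eye layers of a 2D puppet. This is not Adobe's implementation; the Puppet class and EXPRESSION_KEYS map are hypothetical names invented for this example.

```python
# Conceptual sketch only: a "keyboard" of expression triggers for a 2D puppet.
# Not Adobe's code; Puppet and EXPRESSION_KEYS are hypothetical.

EXPRESSION_KEYS = {
    "1": {"mouth": "smile", "eyes": "open"},
    "2": {"mouth": "frown", "eyes": "narrow"},
    "3": {"mouth": "open_wide", "eyes": "wide"},  # e.g. a startled "D'oh!" face
}

class Puppet:
    """Tracks which pre-drawn mouth and eye layers are currently visible."""

    def __init__(self):
        self.mouth = "neutral"
        self.eyes = "open"

    def apply(self, expression):
        # Swap which artwork layer is shown; nothing is redrawn live.
        self.mouth = expression["mouth"]
        self.eyes = expression["eyes"]

def on_key_press(puppet, key):
    """Called when the animator hits a key while the voice actor performs."""
    expression = EXPRESSION_KEYS.get(key)
    if expression:
        puppet.apply(expression)

if __name__ == "__main__":
    homer = Puppet()
    on_key_press(homer, "3")
    print(homer.mouth, homer.eyes)  # open_wide wide
```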


Adobe Character Animator literally lets designers bring 2D characters to life. A professional animator or any designer can create a layered character in Photoshop CC or Illustrator CC, bring them into a Character Animator scene, and then act out the character’s movement in front of a webcam. Even subtle facial expressions show up instantly, along with recorded dialogue and other actions triggered by a few simple keystrokes. All of this combines to create animations that have real-world, real-time elements, as characters interact or as people interact directly with their favorite characters. Smile, and your character smiles right back at you. — via Adobe
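To picture what “act out the character’s movement in front of a webcam” might involve under the hood, here is a rough, hypothetical sketch of mapping one frame of face-tracking data onto the transforms of a layered character. It is not Adobe's implementation; the FaceReading fields and the scale factors are assumptions made for illustration.

```python
# Conceptual sketch, not Adobe's code: drive a layered 2D character from
# per-frame face-tracking data. Field names and constants are assumptions.

from dataclasses import dataclass

@dataclass
class FaceReading:
    head_turn: float      # -1.0 (full left) .. 1.0 (full right)
    mouth_open: float     # 0.0 (closed) .. 1.0 (wide open)
    eyebrow_raise: float  # 0.0 .. 1.0

@dataclass
class LayerTransforms:
    head_rotation_deg: float
    mouth_scale_y: float
    eyebrow_offset_px: float

def map_face_to_layers(reading: FaceReading) -> LayerTransforms:
    """Translate one frame of tracking data into transforms applied to the
    Photoshop/Illustrator layers that make up the character."""
    return LayerTransforms(
        head_rotation_deg=reading.head_turn * 25.0,    # assumed swivel range
        mouth_scale_y=0.2 + reading.mouth_open * 0.8,  # keep the mouth layer visible
        eyebrow_offset_px=reading.eyebrow_raise * 12.0,
    )

if __name__ == "__main__":
    frame = FaceReading(head_turn=0.4, mouth_open=0.7, eyebrow_raise=0.1)
    print(map_face_to_layers(frame))
```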

The Late Show

The use of Character Animator in live streaming was put to full effect just a few weeks ago on The Late Show with Stephen Colbert. The above video details the intricacies of bringing Donald Trump to life and having him interact with Colbert.

Character Animator provides pre-made animations, but it also supports custom animation (which is what most people will work with). Ideally, you can build your own animation and release it to the world in a single sitting. This new way of working is going to change the game for both animation and live-streamed Augmented Reality events.

Augmented Reality

The wildly popular Snapchat app is making use of Character Animator-style effects. Whatever your opinion of the social media platform, one thing is certain: its move into Augmented Reality is way ahead of the pack. Millions of people use Snapchat on a daily basis, and one of the app's biggest features is the cartoon and 3D masks applied to your face through facial mapping.
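For a sense of how facial mapping can anchor a mask like that, here is a small, purely illustrative sketch (not Snapchat's actual pipeline) that derives a mask sprite's position, size, and tilt from two tracked eye landmarks; the 2.5x width factor is an arbitrary assumption.

```python
# Conceptual sketch, not Snapchat's pipeline: place a cartoon mask over a face
# using two tracked eye positions from one video frame.

import math

def place_mask(left_eye, right_eye):
    """Return (center, width, rotation_deg) for a mask sprite, derived from
    the (x, y) pixel positions of the tracked eyes."""
    cx = (left_eye[0] + right_eye[0]) / 2
    cy = (left_eye[1] + right_eye[1]) / 2
    eye_distance = math.dist(left_eye, right_eye)
    width = eye_distance * 2.5  # assumed: mask spans wider than the eye line
    rotation = math.degrees(math.atan2(right_eye[1] - left_eye[1],
                                       right_eye[0] - left_eye[0]))
    return (cx, cy), width, rotation

print(place_mask((120.0, 200.0), (180.0, 210.0)))
```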

Adobe released this excellent breakdown of using Character Animator and just how simple it is to use. Check it out.

As you can see, the app is fairly simple to operate. When you first open it, an interactive tutorial immediately walks you through how the application works. Adobe knows most people have limited experience in this field and wants to help.

Be on the lookout in the coming months for Character Animator to be rolled into the main application lineup of Adobe Creative Cloud. Even if you have no interest in animation, playing around with this app is incredibly enlightening and mind-blowing.


Could you see yourself using Character Animator in the future? Let us know in the comments below!