Atom demonstrates an interesting concept: using Papagayo, a free and open source lip-syncing program that matches phonemes to recorded audio, to create lip-sync data that can be imported and used in After Effects.

Here, Atom shows an After Effects JavaScript he wrote (could not find a link to this) for converting Papagayo's MOHO file format into something that After Effects can understand.

Papagayo is a lip-syncing program designed to help you line up phonemes (mouth shapes) with the actual recorded sound of actors speaking. It makes lip-syncing animated characters simple: type in the words being spoken (or copy/paste them from the animation’s script), then drag the words on top of the sound’s waveform until they line up with the proper sounds.

Atom writes: This is a short video demonstrating the new Adobe After Effects JavaScript I wrote. The script reads the MOHO.dat file format generated by the open source Papagayo software. The workflow is simple: import your voice-over audio track into Papagayo and do your initial voice sync with that tool. When you have a nice sync with the talking mouth, export the MOHO.dat file for that particular take. In After Effects, run the script, which reads the MOHO.dat file and generates time-remap keyframes that match the values in the file. If there are revisions to the voice-over, you can re-sync, export, and re-run the script to produce a new sync.
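To illustrate the idea (this is not Atom's actual script, which isn't linked), here is a minimal sketch of the two steps the workflow implies: parsing the Papagayo export and mapping each phoneme to a time-remap target. It assumes the common MOHO switch-data layout of a `MohoSwitch1` header followed by one `frame phoneme` pair per line, and the `PHONEME_FRAME` lookup table is an illustrative assumption for a mouth-shapes layer with one shape per frame.

```javascript
// Sketch: parse a Papagayo MOHO switch export (assumed layout:
// a "MohoSwitch1" header line, then "frame phoneme" pairs, one per line).
function parseMohoDat(text) {
  var entries = [];
  var lines = text.split(/\r?\n/);
  for (var i = 0; i < lines.length; i++) {
    var line = lines[i].replace(/^\s+|\s+$/g, "");
    if (!line || line.indexOf("MohoSwitch") === 0) continue; // skip header/blanks
    var parts = line.split(/\s+/);
    var frame = parseInt(parts[0], 10);
    if (isNaN(frame)) continue;
    entries.push({ frame: frame, phoneme: parts[1] });
  }
  return entries;
}

// Illustrative lookup: which frame of the mouth-shapes layer shows each
// phoneme. The names and ordering here are assumptions, not Atom's.
var PHONEME_FRAME = { rest: 0, AI: 1, E: 2, O: 3, U: 4, etc: 5, L: 6, WQ: 7, MBP: 8, FV: 9 };

// Convert parsed entries into {time, remapTo} pairs for time remapping.
function toKeyframes(entries, fps) {
  return entries.map(function (e) {
    return { time: e.frame / fps, remapTo: PHONEME_FRAME[e.phoneme] / fps };
  });
}
```

Inside After Effects, each resulting pair would become a time-remap keyframe, roughly `mouthLayer.property("ADBE Time Remapping").setValueAtTime(time, remapTo)`, so revising the audio only means re-exporting the .dat file and re-running the script.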
