CGMatter runs through how to create an audio visualizer with Blender, where objects auto-react to sound.
At a record pace, CGMatter covers creating an audio visualizer (of sorts) using Blender. The system raises and lowers an array of objects automatically, triggered by the volume of the sound.
Parenting and Deforming.
The tutorial parents another object to a plane, then instances that object on the plane's vertices; altering the scale sets up a quick array. A key benefit is that the original plane used to build the audio visualizer's array also passes its animation down to the instanced objects.
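For reference, the parenting-and-instancing setup can be sketched in Blender's Python API. This is a minimal sketch that only runs inside Blender, and it assumes objects named "Cube" and "Plane" already exist in the scene:

```python
import bpy

# Assumed object names; substitute whatever your scene actually uses.
obj = bpy.data.objects["Cube"]     # the object to repeat across the array
plane = bpy.data.objects["Plane"]  # a subdivided plane acting as the array layout

# Parent the object to the plane, then instance it on every plane vertex.
obj.parent = plane
plane.instance_type = 'VERTS'
```

Because the instances are children of the plane, any animation applied to the plane (including baked sound) carries through to every copy.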
Animating the Plane.
To finalize the audio visualizer animation, it's then a matter of animating the plane so that the height of the array objects moves accordingly. Baking the sound to F-Curves (Key → Bake Sound to F-Curves in the Graph Editor) automates the motion to the sound.
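Conceptually, baking sound to an F-Curve boils the audio down to one value per animation frame. A hypothetical pure-Python sketch of that idea, averaging the absolute amplitude of the samples inside each frame's window (the function name and signature are illustrative, not Blender's API):

```python
def bake_amplitude(samples, sample_rate, fps):
    """Reduce raw audio samples to one amplitude value per animation frame.

    samples:     sequence of floats in the range -1.0..1.0
    sample_rate: audio samples per second (e.g. 44100)
    fps:         animation frames per second (e.g. 24)
    """
    window = sample_rate // fps  # audio samples covered by one frame
    values = []
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        # Mean absolute amplitude of the chunk = keyframe value for this frame
        values.append(sum(abs(s) for s in chunk) / len(chunk))
    return values
```

Feeding those per-frame values into the plane's Z location (or scale) is what makes the array bounce with the volume.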
Another Method That Uses Animation Nodes.
Zach Hixson once walked through using the Sound Falloff node for visualizing audio within an animation. Sound Falloff takes a digital audio clip and creates a falloff that assigns a value to every object based on the sound's frequency. Interestingly, the falloff for the sound node can be calculated based on the index of the objects, or based on another falloff.
Need a Slower Version?
If the pace of the tutorial is too fast for you to keep up, CGMatter makes an extended version available to patrons, along with a version of the project file. Check out CGMatter on Patreon here.