
Study on the Procedural Generation of Visualization from Musical Input using Generative Art Techniques

The purpose of this study was to create a new method for visualizing music. Although many music visualizations already exist, this research focused on creating high-quality, high-complexity animations that real-time systems cannot match. The final animation should bear an obvious resemblance to the input music, based on the musical information the user chooses to extract and visualize. To this end, the project includes a pipeline for extracting music data and generating an editable visualization file.

Within the pipeline, a music file is read into a custom analysis tool, which extracts time-based data. This data is exported and then read into Autodesk Maya, where the user may manipulate the visualization as they see fit using Maya's tools and render a final animation.
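
As a rough illustration of the extraction step, the sketch below computes a per-animation-frame loudness envelope from a WAV file and writes it out as frame/value pairs. The thesis's custom analysis tool is not detailed in this abstract, so the file names, the mono 16-bit WAV assumption, the 24 fps frame rate, and the CSV output format are all illustrative assumptions.

    # Minimal sketch of the analysis step; assumes a mono 16-bit WAV input.
    # File names, the 24 fps rate, and the CSV format are assumptions.
    import wave
    import numpy as np

    FPS = 24  # assumed animation frame rate

    with wave.open("input_song.wav", "rb") as wav:
        rate = wav.getframerate()
        raw = wav.readframes(wav.getnframes())
    samples = np.frombuffer(raw, dtype=np.int16).astype(np.float64)

    # Average absolute amplitude over each animation frame's worth of samples.
    samples_per_frame = rate // FPS
    n_frames = len(samples) // samples_per_frame
    envelope = np.abs(samples[: n_frames * samples_per_frame]
                      .reshape(n_frames, samples_per_frame)).mean(axis=1)
    envelope /= envelope.max() or 1.0  # normalize to [0, 1]; guard silence

    # Write one "frame,value" row per animation frame for import into Maya.
    with open("music_data.csv", "w") as out:
        for frame, value in enumerate(envelope, start=1):
            out.write(f"{frame},{value:.6f}\n")

Other time-based features (beat onsets, spectral bands) could be written out in the same frame-indexed form.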

The default result of this process is a Maya scene file that uses Maya's dynamics systems to warp and contort a jelly-like cube. A variety of other visualizations may be obtained by mapping the data to different object attributes within the Maya interface. When the result was rendered and overlaid onto the music, there was a recognizable correlation between elements in the music and the motion in the video.
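
The attribute-mapping idea can be sketched with Maya's Python API, run from Maya's script editor. The cube name, the choice of scaleY as the driven attribute, and the music_data.csv file are assumptions for illustration; the thesis's default scene instead drives Maya's dynamics to deform the cube.

    # Hedged sketch of mapping extracted music data to a Maya attribute.
    # Object name, attribute choice, and data file are illustrative.
    import maya.cmds as cmds

    cube = cmds.polyCube(name="musicCube")[0]

    with open("music_data.csv") as data:
        for line in data:
            frame_str, value_str = line.strip().split(",")
            # Key the per-frame amplitude onto any keyable attribute; here
            # the cube's vertical scale pulses with the loudness envelope.
            cmds.setKeyframe(cube, attribute="scaleY",
                             time=int(frame_str),
                             value=1.0 + float(value_str))

Retargeting the same data to a different attribute (translation, deformer weight, shader parameter) only requires changing the setKeyframe call, which is what makes alternative visualizations cheap to produce.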

This study shows that an accurate music visualization may be achieved using this pipeline. Moreover, many different visualizations may be obtained with relative ease compared to manually analyzing a music file or hand-animating Maya objects to match elements in the music.

Identifier: oai:union.ndltd.org:tamu.edu/oai:repository.tamu.edu:1969.1/ETD-TAMU-2011-05-9113
Date: May 2011
Creators: Garcia, Christopher
Contributors: Galanter, Philip
Source Sets: Texas A and M University
Language: en_US
Detected Language: English
Type: thesis, text
Format: application/pdf
