
Procedural Music Generation and Adaptation Based on Game State

Video game developers try to convey moods that reinforce their game's narrative. Events within the game usually signal success or failure in some way meaningful to the story's progress, and ideally the intended change in mood should be perceivable to the player when these events occur. One way to achieve this is to change the music, which requires musical tracks representing many possible moods, states, and game events. Composing so many tracks can be very taxing, and encoding their control flow (when to transition between tracks) can prove tricky as well.
This thesis presents AUD.js, a system for procedural music generation in JavaScript-based web games. By taking input from game events, the system can create music corresponding to various Western perceptions of musical mood. The system was trained on classic video game music: game development students rated the mood of 80 pieces, after which statistical representations of those pieces were extracted and encoded into AUD.js. AUD.js can adapt its generated music to new sets of input parameters, thereby updating the perceived mood of the generated music at runtime.
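To make the runtime-adaptation idea concrete, the sketch below (plain JavaScript) shows how a game might map events to musical input parameters and push them to a generator while it plays. The generator object, its setParameters method, and the parameter names are hypothetical stand-ins for illustration, not the actual AUD.js API.

// Hypothetical sketch: a game maps events to musical parameters and
// pushes them to a procedural generator at runtime. The generator
// interface below is an illustrative stub, not AUD.js's real API.

// Stub generator standing in for the real system.
const generator = {
  params: { tempo: 90, mode: "major", intensity: 0.3 },
  setParameters(p) {
    // A real generator would steer its ongoing output toward the
    // new parameters rather than switching between fixed tracks.
    this.params = { ...this.params, ...p };
    console.log("now generating with", this.params);
  },
};

// Map game events to mood parameters (values are invented).
function moodFor(event) {
  switch (event) {
    case "boss_encounter": return { tempo: 160, mode: "minor", intensity: 0.9 };
    case "victory":        return { tempo: 120, mode: "major", intensity: 0.6 };
    default:               return { tempo: 90, mode: "major", intensity: 0.3 };
  }
}

// As the game reports state changes, the music adapts in place,
// with no need to author one track per mood.
generator.setParameters(moodFor("boss_encounter"));
generator.setParameters(moodFor("victory"));

The appeal of this design is that mood becomes a continuous parameter space rather than a library of fixed tracks, which is what allows the perceived mood to change at runtime without additional composed material.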
We conducted A/B tests comparing static music, both composed and computer-generated, with dynamically adapting music. We find that AUD.js provides reasonably effective music for games, but that the adaptiveness of the music does not necessarily improve player experience over composed music. Through a user study conducted during Global Game Jam 2014, we also find that, because AUD.js offers a software solution to music composition, it can be a useful tool for integrating game music under time pressure.

Identifier: oai:union.ndltd.org:CALPOLY/oai:digitalcommons.calpoly.edu:theses-2299
Date: 01 June 2014
Creators: Adam, Timothey Andrew
Publisher: DigitalCommons@CalPoly
Source Sets: California Polytechnic State University
Detected Language: English
Type: text
Format: application/pdf
Source: Master's Theses
