1.
Procedural Music Generation and Adaptation Based on Game State
Adam, Timothey Andrew, 01 June 2014
Video game developers try to convey moods that reinforce their game's narrative. Events that occur within the game usually convey success or failure in some way meaningful to the story's progress. Ideally, when these events occur, the intended change in mood should be perceivable to the player. One way of achieving this is to change the music. Doing so, however, requires musical tracks that represent many possible moods, states, and game events. Composing that many tracks is very taxing on composers, and encoding the control flow of the tracks (when to transition between them) is tricky as well.
This thesis presents AUD.js, a system for procedural music generation in JavaScript-based web games. By taking input from game events, the system creates music corresponding to various Western perceptions of musical mood. The system was trained on classic video game music: game development students rated the mood of 80 pieces, after which statistical representations of those pieces were extracted and added to AUD.js. AUD.js can adapt its generated music to new sets of input parameters, updating the perceived mood of the generated music at runtime.
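The abstract does not include the AUD.js interface itself; the following JavaScript sketch only illustrates what such event-driven mood retargeting could look like in principle. Every name here (MOODS, MoodMusicGenerator, setMood, onGameEvent) is an assumption for illustration, not the actual AUD.js API.

```javascript
// Hypothetical sketch of event-driven mood updates, loosely modeled on the
// behavior described in the abstract. None of these names come from the
// actual AUD.js API.

// Mood targets along common Western mood dimensions (illustrative values).
const MOODS = {
  triumphant: { valence: 0.9, arousal: 0.8 },
  tense:      { valence: 0.2, arousal: 0.9 },
  somber:     { valence: 0.1, arousal: 0.2 },
};

// Stand-in for the generator: a real system would map valence/arousal to
// musical parameters such as tempo, mode, and note density.
class MoodMusicGenerator {
  constructor() {
    this.params = MOODS.somber;
  }
  setMood({ valence, arousal }) {
    // A real generator would retarget at the next musically sensible
    // boundary (bar line, phrase end) rather than cutting abruptly.
    this.params = { valence, arousal };
    console.log(`retargeting music: valence=${valence}, arousal=${arousal}`);
  }
}

// Game events drive the mood directly, instead of triggering one of many
// pre-authored tracks.
const music = new MoodMusicGenerator();
function onGameEvent(event) {
  if (event === 'boss-defeated') music.setMood(MOODS.triumphant);
  else if (event === 'low-health') music.setMood(MOODS.tense);
  else if (event === 'ally-lost') music.setMood(MOODS.somber);
}

onGameEvent('low-health');    // music shifts toward a tense mood
onGameEvent('boss-defeated'); // music shifts toward a triumphant mood
```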
We conducted A/B tests comparing static music, both composed and computer-generated, to dynamically adapting music. We find that AUD.js provides reasonably effective music for games, but that adaptiveness of the music does not necessarily improve player experience over composed music. By conducting a user study during Global Game Jam 2014, we also find that since AUD.js provides a software solution to music composition, it can be a useful tool for game music integration under time pressure.
2.
Dynamic Procedural Music Generation from NPC Attributes
Washburn, Megan E., 01 March 2020
Procedural content generation for video games (PCGG) has seen a steep rise in adoption over the past decade, aiming to foster emergent gameplay and to address the challenge of producing large amounts of engaging content quickly. Most work in PCGG has focused on generating art and assets such as levels, textures, and models, or on narrative design to generate storylines and progression paths. Given the difficulty of generating harmonically pleasing and interesting music, procedural music generation for games (PMGG) has received far less attention during this time.
Music in video games is essential for establishing the developers' intended mood and environment. Given the deficit of PMGG content, this paper addresses the demand for high-quality PMGG by describing a system that generates thematic music for non-player characters (NPCs) in real time, based on developer-defined attributes, and responds to the dynamic relationship between the player and the target NPC.
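The abstract does not specify how NPC attributes map to music, but a minimal JavaScript sketch can illustrate the general idea. Every attribute name and weighting below (aggression, alignment, the threat formula) is an assumption for illustration, not the thesis's actual design.

```javascript
// Illustrative sketch only: maps developer-defined NPC attributes and the
// player's current state to coarse musical parameters. The actual mapping
// used by the thesis's system is not described in the abstract.

function musicParamsFor(npc, player) {
  // Relative threat: how dangerous the NPC looks next to the player's state.
  const threat = Math.min(1, npc.aggression * (npc.health / Math.max(1, player.health)));

  return {
    tempoBpm: 80 + Math.round(60 * threat),  // faster music as threat rises
    mode: npc.alignment === 'hostile' ? 'minor' : 'major',
    percussionDensity: threat,               // 0..1: denser drums under pressure
  };
}

// Example: a hostile boss against a wounded player.
const boss = { aggression: 0.9, health: 400, alignment: 'hostile' };
const player = { health: 120 };
console.log(musicParamsFor(boss, player));
// -> { tempoBpm: 140, mode: 'minor', percussionDensity: 1 }
```

Recomputing these parameters whenever either side's state changes is what would make the track respond dynamically to the player-NPC relationship.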
The system was evaluated by means of a user study: participants confronted four NPC bosses, each with its own uniquely generated dynamic track based on its attributes in relation to the player's. The survey gathered information on the perceived quality, dynamism, and helpfulness to gameplay of the generated music. Results showed that the generated music was generally pleasing and harmonious, and that while players could not identify exactly how the music was driven, they did perceive a general relationship between themselves and the NPCs as reflected in the music.