1. Musical XR & Spatial Audio: Developing Virtual Environments for Research in Music Composition and Performance. Tomasetti, Matteo, 13 December 2024.
This thesis explores the convergence of spatial audio and Extended Reality (XR) technologies to advance the field of music composition and performance. It addresses fundamental gaps in the understanding and application of these technologies in creating immersive musical experiences. The research focuses on the development and evaluation of several innovative virtual environments that facilitate new approaches to musical interaction and expression.
Key contributions of this work include the development of single-user and multi-user virtual environments that integrate spatial audio techniques, primarily Ambisonics and binaural rendering, with a range of gesture-based interaction techniques. These environments equip musicians and composers with novel tools for sound exploration. The thesis also offers an in-depth analysis of user interaction within these environments, employing both quantitative and qualitative research methods to evaluate the impact of these technologies on musical creativity and performance. Furthermore, the creation and dissemination of open-source software, Virtual Reality systems, and interfaces encourage broader experimentation and development within the musical XR field.
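As a rough illustration of the spatial audio techniques named above, and not the thesis's own implementation, the following Python sketch encodes a mono source into first-order Ambisonics (ACN channel order, SN3D normalisation) and renders a simple stereo image with two virtual cardioid microphones. The function names and parameter values are illustrative assumptions, and true binaural playback would instead convolve the Ambisonic signal with head-related transfer functions (HRTFs).

```python
# Illustrative sketch only: first-order Ambisonics encode plus a crude
# virtual-microphone stereo decode (not the thesis's implementation).
import numpy as np

def encode_foa(mono, azimuth, elevation):
    """Encode a mono signal at (azimuth, elevation), in radians, into
    first-order Ambisonics channels [W, Y, Z, X] (ACN order, SN3D)."""
    w = mono
    y = mono * np.sin(azimuth) * np.cos(elevation)
    z = mono * np.sin(elevation)
    x = mono * np.cos(azimuth) * np.cos(elevation)
    return np.stack([w, y, z, x])

def decode_virtual_mic(bformat, look_azimuth, directivity=0.5):
    """Point one virtual microphone at look_azimuth; directivity=0.5 is a cardioid."""
    w, y, z, x = bformat
    return directivity * w + (1.0 - directivity) * (
        x * np.cos(look_azimuth) + y * np.sin(look_azimuth))

sr = 48000
t = np.arange(sr) / sr
source = 0.5 * np.sin(2 * np.pi * 440.0 * t)             # one second of a test tone
bfmt = encode_foa(source, azimuth=np.radians(60), elevation=0.0)
left = decode_virtual_mic(bfmt, look_azimuth=np.radians(+90))
right = decode_virtual_mic(bfmt, look_azimuth=np.radians(-90))
stereo = np.stack([left, right], axis=1)                  # ready to write to a WAV file
```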
Through a blend of experimental studies, system development, and review analyses, this work proposes new design guidelines, advances the use of spatial audio across the fields examined, and provides practical systems, interfaces, and tools for musicians, composers, sound designers, sound engineers, performers, and researchers. These efforts have been documented in eleven peer-reviewed publications in top-tier journals and presented at leading international conferences, underscoring the practical relevance and academic contribution of this thesis.
2. MusE-XR: musical experiences in extended reality to enhance learning and performance. Johnson, David, 23 July 2019.
Integrating state-of-the-art sensory and display technologies with 3D computer graphics, extended reality (XR) affords capabilities to create enhanced human experiences by merging virtual elements with the real world. To better understand how Sound and Music Computing (SMC) can benefit from the capabilities of XR, this thesis presents novel research on the design of musical experiences in extended reality (MusE-XR). Integrating XR with research on computer-assisted musical instrument tutoring (CAMIT) as well as New Interfaces for Musical Expression (NIME), I explore the MusE-XR design space to contribute to a better understanding of the capabilities of XR for SMC.
The first area of focus in this thesis is the application of XR technologies to CAMIT, enabling extended reality enhanced musical instrument learning (XREMIL). A common approach in CAMIT is the automatic assessment of musical performance. Generally, these systems focus on the aural quality of the performance, but emerging XR-related sensory technologies afford the development of systems that assess playing technique. Employing these technologies, the first contribution of this thesis is a CAMIT system for the automatic assessment of pianist hand posture from depth data. Hand posture assessment is performed through a computer vision (CV) and machine learning (ML) pipeline that classifies a pianist's hands, captured by a depth camera, into one of three posture classes. Assessment results from the system are intended to be integrated into a CAMIT interface to deliver feedback to students regarding their hand posture. One method of presenting this feedback is real-time visual feedback (RTVF) displayed on a standard 2D computer display, but this approach is limited by the need for the student to constantly shift focus between the instrument and the display.
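The abstract does not detail the pipeline itself, so the sketch below only illustrates its general shape under stated assumptions: random stand-in depth-image crops of the hand, hypothetical posture labels, and a standard scikit-learn classifier in place of whatever model the thesis actually uses.

```python
# Illustrative sketch only: a depth-image posture classifier with the same
# overall shape as the described CV/ML pipeline, trained on stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in data: 300 flattened 32x32 depth crops of a hand, three posture classes.
X = rng.normal(size=(300, 32 * 32)).astype(np.float32)
y = rng.integers(0, 3, size=300)   # hypothetical labels: 0 = good, 1 = flat, 2 = collapsed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = make_pipeline(StandardScaler(),
                    RandomForestClassifier(n_estimators=200, random_state=0))
clf.fit(X_train, y_train)

# Per-frame predictions like these could feed a CAMIT interface's posture feedback.
print("held-out accuracy:", clf.score(X_test, y_test))
```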
XR affords new methods to potentially address this limitation through capabilities to directly augment a musical instrument with RTVF by overlaying 3D virtual objects on the instrument. Due to limited research evaluating the effectiveness of this approach, it is unclear how the added cognitive demands of RTVF in virtual environments (VEs) affect the learning process. To fill this gap, the second major contribution of this thesis is the first known user study evaluating the effectiveness of XREMIL. Results of the study show that an XR environment with RTVF improves participant performance during training, but may lead to decreased improvement after the training. On the other hand, interviews with participants indicate that the XR environment increased their confidence, leading them to feel more engaged during training.
In addition to enhancing CAMIT, the second area of focus in this thesis is the application of XR to NIME enabling virtual environments for musical expression (VEME). Development of VEME requires a workflow that integrates XR development tools with existing sound design tools. This presents numerous technical challenges, especially to novice XR developers. To simplify this process and facilitate VEME development, the third major contribution of this thesis is an open source toolkit, called OSC-XR. OSC-XR makes VEME development more accessible by providing developers with readily available Open Sound Control (OSC) virtual controllers. I present three new VEMEs, developed with OSC-XR, to identify affordances and guidelines for VEME design.
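OSC-XR's actual message scheme and controller set are not given in this abstract; as a hedged sketch of the underlying idea, the example below sends a virtual slider's value over Open Sound Control to a sound engine. The address pattern "/slider/1", the host, and port 57120 (SuperCollider's default) are assumptions, and the python-osc package stands in for whatever client the toolkit itself uses.

```python
# Illustrative sketch: stream a virtual slider's value over OSC to a sound
# engine (e.g. SuperCollider or Pure Data) listening on an assumed port.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)    # sound engine host and port (assumed)

# Simulate a user dragging a virtual slider from 0.0 to 1.0 in the XR scene.
for step in range(11):
    value = step / 10.0
    client.send_message("/slider/1", value)     # one float argument per message
    time.sleep(0.05)
```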
The insights gained through these studies exploring the application of XR to musical learning and performance lead to new affordances and guidelines for the design of effective and engaging MusE-XR.
3. Musiques pour éponge : la composition pour un nouvel instrument de musique numérique [Music for the sponge: composing for a new digital musical instrument]. Marier, Martin, 05 1900.
No description available.