AES E-Library

When XR Meets AI: Integrating Interactive Machine Learning with an XR Musical Instrument

This paper explores the integration of artificial intelligence (AI) with extended reality (XR) through the development of Netz, an XR musical instrument (XRMI) designed to enhance expressive control using deep learning techniques. Netz implements algorithms that map physical gestures to musical controls, offering customisable control schemes that enhance gesture interpretation accuracy and elevate the overall musical experience. The instrument was developed through a participatory design process involving a professional keyboard player and music producer. The process spanned three phases with corresponding design sessions: exploration, making, and performance & refinement. Initial challenges with traditional computational approaches to hand-pose classification were overcome by incorporating an interactive machine learning (IML) system, enabling personalised gesture control. A set of musical performance tasks encompassing melodies and chord progressions was used to assess the instrument’s playability and expressivity in collaboration with our musician partner. Thematic analysis of reflective interviews revealed that the IML system enhanced musical interaction, suggesting AI’s potential to improve XR musical performance. Future work will involve a wider range of musicians to assess the generalisability of our findings.

 

Permalink: https://aes2.org/publications/elibrary-page/?id=22436



E-Library location: 16938