Live Feedback for Immersive Music Performances - A Case Study

Fadda, Gianluca; Popescu, Vlad; Murroni, Maurizio
2025-01-01

Abstract

This paper presents the architecture of an Extended Reality (XR) system for real-time streaming and spatial rendering of live music performances, along with a comprehensive setup for the acquisition of multichannel audio and three-dimensional performer motion data. The system enables remote users to experience live performances in immersive virtual environments, where each musician is represented by an avatar. Audio input is captured via a digital mixing console, enabling the transmission of synchronized multitrack audio streams using low-latency networking protocols. These streams are spatially rendered within a 3D scene using advanced audio spatialization techniques integrated into standard real-time engines such as Unity. The system also supports bidirectional interaction: audience-generated audio feedback (e.g., cheering, clapping, or singing) is captured, spatially processed, and reintroduced into the performance environment, enhancing performer-audience engagement through immersive, real-time crowd response. The architecture was set up and initially tested in a real concert environment at the Rockstadt Club in Brașov, Romania, on the occasion of a live blues concert in 2024.
Keywords: Extended Reality; Multi-Channel Audio; Virtual Environment
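As an informal illustration of the spatial rendering step described in the abstract, the short Python sketch below applies inverse-distance attenuation and constant-power stereo panning to a mono performer track based on its position relative to the listener. It is only a simplified stand-in for the spatialization used in the actual system (which relies on spatializer tooling inside a real-time engine such as Unity); all function and variable names here are hypothetical.

# Illustrative sketch only: a simple distance/azimuth-based stereo panner,
# not the paper's spatialization method. All names and parameters are hypothetical.
import numpy as np

def spatialize_track(mono: np.ndarray,
                     source_pos: np.ndarray,
                     listener_pos: np.ndarray,
                     listener_forward: np.ndarray) -> np.ndarray:
    """Return a stereo buffer with constant-power panning and 1/r attenuation."""
    offset = source_pos - listener_pos
    distance = max(np.linalg.norm(offset), 1.0)   # clamp to avoid gain blow-up
    gain = 1.0 / distance                         # simple inverse-distance law

    # Azimuth of the source relative to the listener's forward direction (XZ plane).
    fwd = listener_forward / np.linalg.norm(listener_forward)
    right = np.array([fwd[2], 0.0, -fwd[0]])      # perpendicular to forward in XZ
    azimuth = np.arctan2(offset @ right, offset @ fwd)

    # Constant-power pan: map azimuth in [-pi/2, pi/2] to a pan angle in [0, pi/2].
    pan = (np.clip(azimuth, -np.pi / 2, np.pi / 2) + np.pi / 2) / 2
    left_gain, right_gain = np.cos(pan), np.sin(pan)
    return np.stack([mono * gain * left_gain, mono * gain * right_gain], axis=-1)

# Example: a 1 kHz test tone placed to the listener's front-right.
sr = 48_000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
stereo = spatialize_track(tone,
                          source_pos=np.array([2.0, 0.0, 1.0]),
                          listener_pos=np.zeros(3),
                          listener_forward=np.array([0.0, 0.0, 1.0]))

In the deployed system each synchronized multitrack stream would feed one such per-performer source, with the engine's spatializer handling the actual rendering to headphones or loudspeakers.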
Files in this item:
File: 35222-2714-28321-1-10-20250522.pdf (open access)
Description: paper online
Type: publisher's version (VoR)
Size: 1.24 MB
Format: Adobe PDF

Use this identifier to cite or link to this item: https://hdl.handle.net/11584/464967