What new possibilities does technological progress offer for our listening experience? The MetaPhase research project focuses on interaction between human and virtual musicians in a new world that encompasses both real and virtual space. The research recently received an honorable mention in the European Commission's STARTS 2023 Prize, where MetaPhase was selected from over 1,600 submissions.
What is remarkable about this project is that the movements and gestures of the performers are captured with high precision by a motion tracking system and embodied on screen as a digital avatar. This allows the audience to follow every detail of the relationship between gesture and sound.
MetaPhase is the artistic result of a collaboration between Giusy Caruso and the innovative Italian start-up LWT3. Caruso is a concert pianist and postdoctoral researcher who specializes in human-machine interaction (HMI) for creating futuristic multimedia formats and analyzing musical performances. She chairs the CREATIE research group at the Royal Conservatoire Antwerp, is an affiliated researcher at IPEM, Ghent University, and serves as the official music advisor to LWT3 in Milan. LWT3 focuses on data analysis, visualization, IoT infrastructure development, and human-machine interaction solutions.
This collaborative research set out to showcase the creative potential of data processing, human-machine interaction, and biotechnological applications in an XR performance. The goal was to enhance both the expressiveness of the performers and the listening experience of the audience. At its core is a wearable, user-friendly prototype system developed by LWT3 that collects biosignals.
The system maps gestures and provides quantitative data on the displacement, acceleration, and speed of movements. In this way, musicians gain valuable information about their physical approach to playing an instrument, which helps them identify opportunities for improvement.
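As a rough illustration of how such quantitative data can be derived, the sketch below estimates per-frame displacement, speed, and acceleration from sampled 3D marker positions using finite differences. The function name, data layout, and the 120 Hz sample rate are illustrative assumptions, not details of LWT3's actual system.

```python
import math

def finite_difference_kinematics(positions, fps=120.0):
    """positions: list of (x, y, z) marker samples in metres.

    Returns per-step displacement (m), speed (m/s), and
    acceleration (m/s^2), estimated by finite differences.
    """
    dt = 1.0 / fps
    displacement, speed, acceleration = [], [], []
    for prev, cur in zip(positions, positions[1:]):
        d = math.dist(prev, cur)       # straight-line distance per frame
        displacement.append(d)
        speed.append(d / dt)           # first-derivative estimate
    for v0, v1 in zip(speed, speed[1:]):
        acceleration.append((v1 - v0) / dt)  # second-derivative estimate
    return displacement, speed, acceleration

# Example: a marker moving 1 mm per frame along x, sampled at 120 Hz,
# yields a constant speed of 0.12 m/s and zero acceleration.
samples = [(i * 0.001, 0.0, 0.0) for i in range(5)]
disp, spd, acc = finite_difference_kinematics(samples)
```

In practice, raw marker data is noisy, so a real pipeline would filter the positions before differentiating; this sketch only shows the core computation.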
Based on the movements of the musicians, the system also creates a virtual agent (usually an avatar) that moves on stage. The musicians wear reflective markers, tracked in real time by infrared cameras, along with a biosensor that captures the muscle effort behind each gesture. In short, this form of co-creation of live music brings together the first experiments with OptiTrack motion tracking, LWT3's biometric signal devices, VR technology, and a Yamaha Disklavier piano.
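The real-time pipeline described above can be sketched as a simple mapping: marker frames and a muscle-effort signal stream in, and each frame is converted into smoothed avatar joint targets plus an intensity channel. All names, the smoothing factor, and the idea of using effort as an "intensity" parameter are illustrative assumptions, not LWT3's actual implementation.

```python
def smooth(prev, new, alpha=0.2):
    """Exponential smoothing to damp tracking jitter."""
    return prev + alpha * (new - prev)

def drive_avatar(frames, effort_samples):
    """frames: list of dicts {joint_name: (x, y, z)} from the tracker.
    effort_samples: normalized muscle-effort values in 0..1.

    Returns a pose stream: smoothed joint positions per frame, plus an
    intensity value the renderer could map to, e.g., posture tension.
    """
    poses = []
    state = {}
    for frame, effort in zip(frames, effort_samples):
        pose = {}
        for joint, (x, y, z) in frame.items():
            px, py, pz = state.get(joint, (x, y, z))
            pose[joint] = (smooth(px, x), smooth(py, y), smooth(pz, z))
        state = pose
        poses.append({"joints": pose, "intensity": effort})
    return poses

frames = [{"wrist": (0.0, 0.0, 0.0)}, {"wrist": (1.0, 0.0, 0.0)}]
stream = drive_avatar(frames, [0.5, 0.7])
```

The smoothing step reflects a common design choice in motion-driven animation: raw optical tracking jitters, so the avatar follows a low-pass-filtered version of the performer's motion.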
From this perspective of visualizing physical measurements, Giusy Caruso and LWT3 developed the idea of also creating digitized performances. For example, Caruso plays Steve Reich's Piano Phase for two pianos in interaction with an avatar pianist who performs the first part of the piece. This virtual double is animated by the expressive movements of the real pianist, recorded earlier together with an audio track on a Disklavier piano.
The project underscores that we are in an era in which hybrid approaches (physical + digital) in hybrid spaces (real + virtual) receive ever more attention. Almost without noticing, people have become accustomed to living and communicating in virtual environments via smartphones and laptops. With the development of AR/VR, they can now also interact through their avatars in the 'metaverse' (a virtual world with avatars and tokens). This metaverse is only at the beginning of a development that still holds many mysteries and uncertainties, but it continues to evolve step by step.