AIM at NeurIPS 2023

From 10 to 16 December, several AIM researchers will participate in the Thirty-seventh Conference on Neural Information Processing Systems (NeurIPS 2023), taking place in New Orleans, USA. NeurIPS is a world-leading conference in AI, machine learning, and computational neuroscience, attracting more than ten thousand attendees annually. The AI and Music Centre for Doctoral Training will have a strong presence at NeurIPS 2023.

In the Main Conference and specifically its Datasets and Benchmarks track, the following paper is authored by AIM members:

  • MARBLE: Music Audio Representation Benchmark for Universal Evaluation (Ruibin Yuan, Yinghao Ma, Yizhi Li, Ge Zhang, Xingran Chen, Hanzhi Yin, Zhuo Le, Yiqi Liu, Jiawen Huang, Zeyue Tian, Binyue Deng, Ningzhi Wang, Chenghua Lin, Emmanouil Benetos, Anton Ragni, Norbert Gyenge, Roger Dannenberg, Wenhu Chen, Gus Xia, Wei Xue, Si Liu, Shi Wang, Ruibo Liu, Yike Guo, Jie Fu)

In the NeurIPS Machine Learning for Audio Workshop:

  • AIM PhD student Ben Hayes is giving an invited talk on differentiable digital signal processing (DDSP).

Additionally, the following papers are authored by AIM members:

See you at NeurIPS!


AIM at ISMIR 2023

From 5 to 9 November 2023, several AIM researchers will participate in the 24th International Society for Music Information Retrieval Conference (ISMIR 2023). ISMIR is the leading conference in the field of music informatics and is currently the top-cited publication venue for Music & Musicology according to Google Scholar. This year, ISMIR will take place in Milan, Italy, and online.

This year, the AI and Music CDT is involved in organising the conference, while also having a strong presence in its tutorials and scientific programme.

In the Scientific Programme, the following papers are authored/co-authored by AIM students:

The following Tutorials will be presented by AIM students:

Among the Satellite Events, AIM PhD student Elona Shatri is general co-chair of the 5th International Workshop on Reading Music Systems (WoRMS 2023). Additionally, she will showcase two papers she collaborated on with master’s students:

  • Towards Artificially Generated Handwritten Sheet Music Datasets (Pranjali Hande, Elona Shatri, Benjamin Timms, and George Fazekas)
  • Improving Sheet Music Recognition using Data Augmentation and Image Enhancement (Zihui Zhang, Elona Shatri, and George Fazekas)

Finally, AIM PhD student Ilaria Manco is Sponsorship/Industry co-chair for ISMIR 2023.

See you at ISMIR!


AIM PhD students win at the MIDI Innovation Awards

Congratulations to two AIM PhD students, Andrea Martelloni and Max Graf, who have won MIDI Innovation Awards 2023 in the hardware prototype and software prototype categories, respectively.

With MIDI (Musical Instrument Digital Interface) now in its 40th year, the MIDI Innovation Awards showcase products and projects that use MIDI in fresh and original ways, highlighting the role MIDI technology plays in enabling musical creativity.

Andrea Martelloni won the hardware prototype category for the HITar, an augmented percussive guitar which Andrea developed under the supervision of Dr Mathieu Barthet and Professor Andrew McPherson. Aimed at percussive fingerstyle guitarists, the HITar is a device that can be fitted to a regular acoustic guitar and alters the way a player interacts with the instrument’s body. The unit employs sensors placed underneath the areas most commonly struck by players and uses an AI engine to determine which part of the hand is used for each percussive hit. The resulting MIDI output can then be used to integrate hardware or software instruments or samples into a performance, allowing guitarists to trigger drum and percussion samples or blend sample libraries and virtual instruments with their playing.
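As a rough illustration of the last step in that pipeline, here is a minimal Python sketch of how a classified hit could become a MIDI drum trigger. The hit labels, the note mapping, and the use of the mido library are illustrative assumptions; the post does not describe the HITar’s actual implementation.

    import mido

    # Hypothetical hit classes mapped to General MIDI percussion notes; the
    # real HITar classes and mapping are not described in this post.
    HIT_TO_NOTE = {
        "palm": 36,     # bass drum
        "fingers": 38,  # acoustic snare
        "knuckle": 42,  # closed hi-hat
    }

    def hit_to_midi(label: str, strength: float) -> mido.Message:
        """Turn one classified percussive hit into a MIDI note-on message.

        strength is assumed to be normalised to [0, 1].
        """
        velocity = max(1, min(127, round(strength * 127)))
        # Channel 10 (index 9) is the General MIDI percussion channel.
        return mido.Message("note_on", channel=9, note=HIT_TO_NOTE[label], velocity=velocity)

    # Example: a firm palm hit on the guitar body becomes a kick-drum trigger.
    print(hit_to_midi("palm", 0.8))

A message like this could then be sent through any MIDI output port to trigger hardware or software instruments.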

In the software prototype category, Max Graf won for Netz, a mixed reality (MR) software instrument that blends the real and virtual worlds by displaying 3D virtual objects within a real environment. It’s a self-contained instrument featuring an embedded sound engine, allowing users to produce sounds directly from the head-mounted display. The software’s interface appears as a network in which nodes represent notes, which can be mapped to a tangible surface for tactile feedback or positioned in the air. Performers’ hand poses and gestures are tracked in real time, allowing Netz to translate subtle hand movements into expressive musical controls; gestures such as the opening and closing of fingers, or movements such as wrist rotation, are recognised and interpreted by the system and can be assigned to specific instrument parameters and MIDI controls.
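To make the gesture-to-MIDI idea concrete, here is a minimal Python sketch mapping a wrist-rotation angle to a MIDI control change. The angle range, the CC assignment, and the use of the mido library are illustrative assumptions rather than Netz’s actual implementation.

    import math
    import mido

    def wrist_rotation_to_cc(angle_rad: float, cc_number: int = 1) -> mido.Message:
        """Map a wrist-rotation angle to a 0-127 MIDI control-change value.

        The angle is assumed to lie in [-pi/2, pi/2]; values outside that
        range are clamped. CC 1 (modulation wheel) is just an example.
        """
        clamped = max(-math.pi / 2, min(math.pi / 2, angle_rad))
        normalised = (clamped + math.pi / 2) / math.pi  # scale to [0, 1]
        return mido.Message("control_change", control=cc_number, value=round(normalised * 127))

    # Example: a slight clockwise rotation becomes a mid-range modulation value.
    print(wrist_rotation_to_cc(0.4))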

Many congratulations from everyone in AIM to Andrea, Max and their supervisors for this huge achievement!

More information: