Our Projects

AIM develops AI‑powered tools for music creation and performance, from automatic transcription and evaluation to expressive visuals and robot cello performance.

Automatic Music Transcription

Fall 2024 - Present

Automatic Music Transcription is a project focused on streamlining audio-to-MIDI transcription for musicians and educators, with applications such as isolating individual sounds in noisy environments. We are conducting a systematic review of AMT models, examining their strengths and limitations on complex, multi-instrument music. To spur innovation, we are hosting a competition in April 2025 that challenges participants to build accurate transcription models for classical music. Through this work, we aim to advance the state of AMT and refine current transcription methods.
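The last step of any audio-to-MIDI pipeline is mapping detected pitches onto MIDI note numbers. As a minimal illustration of that conversion (a generic sketch, not the project's actual model), the standard formula is midi = 69 + 12·log2(f/440):

```python
import math

def hz_to_midi(freq_hz: float) -> int:
    """Map a detected fundamental frequency (Hz) to the nearest
    MIDI note number, using the A4 = 440 Hz = MIDI 69 convention."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

# Middle C (~261.63 Hz) maps to MIDI note 60; A4 maps to 69.
print(hz_to_midi(261.63))  # 60
print(hz_to_midi(440.0))   # 69
```

Real AMT models must also decide note onsets, offsets, and which instrument produced each pitch, which is where multi-instrument music gets hard.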

Evaluator

Fall 2023 - Present

Evaluator is an app that aims to help musicians practice more effectively. It uses computer vision and YOLO localization techniques to help musicians track, analyze, and improve their posture, and applies spectrogram analysis and multi-modal transformers to help them identify and correct mistakes in their playing.
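Pose detectors such as YOLO output body keypoints as (x, y) coordinates, so posture metrics reduce to geometry on those points. A minimal sketch of one such metric (the keypoints here are hypothetical, and this is not the app's actual code) computes the angle at a joint from three keypoints:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at keypoint b, formed by keypoints a-b-c,
    each given as an (x, y) pixel coordinate from a pose detector."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))

# Hypothetical shoulder-elbow-wrist keypoints: a right angle at the elbow.
print(joint_angle((0, 0), (1, 0), (1, 1)))  # 90.0
```

An app could compare angles like this against a target range per instrument and flag frames where the player's posture drifts.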

Drawing by Cecilia Ines Sanchez.

Companion

Fall 2023 - Present

Companion is an app that not only plays along with a human player during a chamber music piece, but actively responds to their playing habits and voice commands the way a real human would. The project combines machine learning with filtering/DSP algorithms to analyze and edit sound quickly and accurately, and uses small NLP language models to interpret voice commands.
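Following a live player means estimating pitch from incoming audio with very low latency. One classic lightweight DSP approach (shown here as a sketch only, not necessarily the algorithm Companion uses) is autocorrelation: find the lag at which the signal best matches a shifted copy of itself, then read the fundamental period off that lag:

```python
import math

def estimate_pitch(samples, sample_rate, lag_min=10, lag_max=200):
    """Estimate fundamental frequency (Hz) by autocorrelation:
    pick the lag in [lag_min, lag_max) with maximum self-similarity."""
    best_lag, best_score = lag_min, float("-inf")
    for lag in range(lag_min, lag_max):
        score = sum(samples[i] * samples[i + lag]
                    for i in range(len(samples) - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return sample_rate / best_lag

# Synthesize one second of a 440 Hz sine at 8 kHz and recover its pitch.
sr = 8000
tone = [math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]
print(round(estimate_pitch(tone, sr), 1))  # ~444.4: 8000/18, quantized by integer lag
```

The integer-lag quantization visible in the output is one reason production systems refine the estimate with interpolation or frequency-domain methods.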

Drawing by Cecilia Ines Sanchez.

Mus2Vid

Spring 2022 - Present

Music is a powerful and universal language with a unique ability to unite people from diverse backgrounds and cultures. Yet for those who are hearing-impaired, this language remains largely inaccessible. Mus2Vid seeks to break down these barriers by visualizing music, making it not only accessible but also a source of immersive, interactive entertainment for everyone, regardless of hearing ability.

Robot Cello

Spring 2024 - Present

As the name suggests, Robot Cello is a project about using reinforcement learning to teach a robot arm to play the cello. The project is in its survey phase and is investigating the use of motion-capture technology to gather training data for an RL model.
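One common way to turn motion-capture recordings into an RL training signal is imitation-style reward shaping: at each timestep, reward the policy for matching the recorded human's joint angles. A minimal sketch of such a reward term (the joint values are hypothetical, and this is not the project's actual reward function):

```python
import math

def imitation_reward(robot_joints, mocap_joints, scale=5.0):
    """Reward in (0, 1]: exp(-scale * squared distance) between the
    robot's joint angles and the motion-capture reference pose."""
    sq_dist = sum((r - m) ** 2 for r, m in zip(robot_joints, mocap_joints))
    return math.exp(-scale * sq_dist)

# A perfect match scores 1.0; the reward decays as the poses diverge.
print(imitation_reward([0.1, 0.5, 1.2], [0.1, 0.5, 1.2]))        # 1.0
print(imitation_reward([0.3, 0.5, 1.2], [0.1, 0.5, 1.2]) < 1.0)  # True
```

A full system would typically blend a term like this with task rewards (bow contact, intonation) so the policy can eventually improve on the demonstration.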

We partner with the Purdue Envision Center to collect motion data for our robot arm to train on. On the left is a video of Prof. Yun playing cello while wearing a motion-capture rig.