Researchers from the Institute of Science Tokyo and Sony Computer Science Laboratories have published a study presenting an AI system that reconstructs fine hand muscle activity using only video footage of musicians playing the piano.

Prior to this study, comparable investigations into muscle movement required electrodes placed on the skin. That technique could only measure gross movements of large muscles, and anatomical variation between individuals made it difficult to draw general conclusions from the resulting datasets.

The researchers built a deep-learning framework for the task and trained it on a comprehensive dataset of recordings of professional pianists. The system offers a low-cost, non-invasive method for analyzing fine motor control, which could help optimize rehabilitation strategies, enhance performance training, and inform future developments in human-machine interaction.

Titled PianoKPM, the dataset captures how professional pianists move, press, and control their hands with precision. It includes 12.6 hours of synchronized data from 20 pianists performing seven distinct musical tasks. Each performance was recorded with multi-view videos at 60 frames per second, 3D hand poses, 1 kHz keystroke data, audio, and 2 kHz EMG signals from six small hand muscles. The dataset contains more than five million pose frames and 28 million EMG samples, creating the first detailed map linking visible motion with internal muscle activity.
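Because the modalities run at different rates (pose at 60 frames per second, EMG at 2 kHz), synchronizing them means mapping each pose frame to a window of roughly 33 EMG samples. A minimal sketch of that bookkeeping, using the rates reported in the article (the helper function itself is hypothetical, not part of the released dataset):

```python
# Rates taken from the PianoKPM dataset description.
POSE_FPS = 60    # pose frames per second
EMG_HZ = 2000    # EMG samples per second

def emg_window_for_pose_frame(frame_idx):
    """Return the [start, end) range of EMG sample indices covered by one pose frame."""
    start = round(frame_idx * EMG_HZ / POSE_FPS)
    end = round((frame_idx + 1) * EMG_HZ / POSE_FPS)
    return start, end

# Frame 0 covers EMG samples 0..33; one second of frames covers exactly 2000 samples.
print(emg_window_for_pose_frame(0))   # (0, 33)
print(emg_window_for_pose_frame(59))  # (1967, 2000)
```

Windowing like this is the simplest way to pair each low-rate video frame with its slice of high-rate muscle signal during training.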

"Leveraging this dataset, we propose PianoKPM Net to infer high-frequency EMG from pose data," said Professor Hideki Koike, who led the study.
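The quoted task, mapping 60 fps pose to 2 kHz EMG, implies a roughly 33x temporal upsampling inside the model. The study does not detail the PianoKPM Net architecture here, so the following is only a hypothetical numpy sketch of the input/output shapes involved, with linear interpolation standing in for learned upsampling and a random linear readout standing in for the network:

```python
import numpy as np

rng = np.random.default_rng(0)
POSE_FPS, EMG_HZ, N_MUSCLES = 60, 2000, 6  # rates and muscle count from the article

# One second of hypothetical hand pose: 21 joints x 3 coordinates = 63 features.
pose = rng.standard_normal((POSE_FPS, 63))
t_pose = np.arange(POSE_FPS) / POSE_FPS
t_emg = np.arange(EMG_HZ) / EMG_HZ

# Upsample each pose dimension to the EMG timeline (placeholder for learned upsampling).
pose_up = np.stack(
    [np.interp(t_emg, t_pose, pose[:, d]) for d in range(pose.shape[1])], axis=1
)

# Placeholder linear readout standing in for the actual PianoKPM Net.
W = rng.standard_normal((63, N_MUSCLES)) * 0.01
emg_pred = pose_up @ W  # shape (2000, 6): one channel per recorded hand muscle
```

The point of the sketch is the shape contract: one second of 60 Hz pose in, one second of six-channel 2 kHz EMG out.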

"Together, the PianoKPM Net and PianoKPM dataset create a foundation for affordable access to internal physiological and muscle activity signals, supporting progress in human augmentation and advanced human–machine interaction."