Using Motion Expressive Variations for Augmented Piano Performance


When performing a piece, a pianist’s interpretation is communicated both through the sound produced and through body gestures. We present PiaF (Piano Follower), a prototype for augmenting piano performance by measuring gesture variations. We survey other augmented piano projects, several of which focus on gesture recognition, and present our prototype, which uses machine learning techniques for gesture classification and estimation of gesture variations in real-time. Our implementation uses the Kinect depth sensor to track body motion in space, which serves as input data. During an initial learning phase, the system is taught a set of reference gestures, or templates. During performance, the live gesture is classified in real-time, and variations with respect to the recognized template are computed. These values can then be mapped to audio processing parameters to control digital effects applied to the acoustic output of the piano in real-time. We discuss initial tests of PiaF with a pianist, as well as potential applications beyond live performance, including pedagogy and embodiment of recorded performance.
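To make the two-phase workflow concrete, here is a minimal sketch in Python, assuming gestures arrive as sequences of 3D joint positions from the Kinect. This is not the gesture variation following algorithm used in PiaF; it illustrates the same learn-then-follow pipeline with a simplified nearest-template classifier and two crude variation estimates (relative speed and spatial scale). All names here (TemplateFollower, learn, follow, and the variation keys) are hypothetical.

    import numpy as np

    class TemplateFollower:
        """Minimal two-phase gesture follower: learn templates, then
        classify a live gesture and estimate its variations."""

        def __init__(self):
            self.templates = {}  # gesture name -> (T, 3) array of joint positions

        def learn(self, name, frames):
            # Learning phase: store one reference gesture (template).
            self.templates[name] = np.asarray(frames, dtype=float)

        def follow(self, frames):
            # Performance phase: find the closest template, then compute
            # simple variations of the live gesture relative to it.
            live = np.asarray(frames, dtype=float)
            best_name, best_dist = None, float("inf")
            for name, tpl in self.templates.items():
                # Resample both gestures to a common length before comparing.
                d = np.linalg.norm(self._resample(live, 64) - self._resample(tpl, 64))
                if d < best_dist:
                    best_name, best_dist = name, d
            tpl = self.templates[best_name]
            variations = {
                # < 1.0 means the live gesture is slower than the template.
                "speed": len(tpl) / len(live),
                # > 1.0 means the live gesture covers more space.
                "scale": self._extent(live) / self._extent(tpl),
            }
            return best_name, variations

        @staticmethod
        def _resample(gesture, n):
            # Uniformly resample a (T, 3) trajectory to n frames.
            idx = np.linspace(0, len(gesture) - 1, n).round().astype(int)
            return gesture[idx]

        @staticmethod
        def _extent(gesture):
            # Diagonal of the gesture's bounding box, as a scale proxy.
            return float(np.linalg.norm(gesture.max(axis=0) - gesture.min(axis=0)))

The estimated variations can then drive audio processing, for example (again purely illustrative):

    follower = TemplateFollower()
    follower.learn("arpeggio_sweep", recorded_frames)  # learning phase
    name, var = follower.follow(live_frames)           # performance phase
    reverb_wet = min(1.0, 0.5 * var["scale"])          # hypothetical mapping

Unlike this post-hoc sketch, PiaF classifies the gesture and updates the variation estimates continuously while the gesture unfolds, so the audio effects respond in real-time.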


A first article was presented at NIME 2014:

A. Van Zandt-Escobar, B. Caramiaux, and A. Tanaka. PiaF: A Tool for Augmented Piano Performance Using Gesture Variation Following. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pp. 167–170, 2014.