Friday, April 4, 2008

Gesture Recognition Using an Acceleration Sensor and Its Application to Musical Performance Control

Summary:

Sawada and Hashimoto use accelerometer data to extract gesture features and build a music tempo-following system.

The feature extraction is basic: projections of the acceleration onto certain planes, such as xy or yz, and the bounding box of the acceleration values. Changes in acceleration direction are measured using a fuzzy partition of radial angles.
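The features described above can be sketched roughly as follows. This is a minimal illustration, not the paper's exact procedure: the bin count, the triangular fuzzy membership, and the use of the xy plane for the angle histogram are all assumptions.

```python
import numpy as np

def extract_features(acc, n_bins=8):
    """Sketch of the feature extraction (details assumed, not the paper's exact method).

    acc: (T, 3) array of accelerometer samples (x, y, z).
    """
    xy = acc[:, [0, 1]]  # projection onto the xy plane
    yz = acc[:, [1, 2]]  # projection onto the yz plane

    # Bounding box of the acceleration values in each projection.
    bbox_xy = np.concatenate([xy.min(axis=0), xy.max(axis=0)])
    bbox_yz = np.concatenate([yz.min(axis=0), yz.max(axis=0)])

    # Fuzzy partition of radial angles: each sample's direction in the
    # xy plane contributes triangular membership to two adjacent bins.
    angles = np.arctan2(xy[:, 1], xy[:, 0]) % (2 * np.pi)
    width = 2 * np.pi / n_bins
    pos = angles / width
    lo = np.floor(pos).astype(int) % n_bins
    frac = pos - np.floor(pos)
    hist = np.zeros(n_bins)
    np.add.at(hist, lo, 1 - frac)
    np.add.at(hist, (lo + 1) % n_bins, frac)
    hist /= max(len(acc), 1)  # normalize so the memberships sum to 1

    return np.concatenate([bbox_xy, bbox_yz, hist])
```

The result is a fixed-length feature vector (two bounding boxes plus the angle histogram), which is what makes the simple squared-error comparison below possible.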

The authors classify gestures by squared error between feature vectors; the recognition itself is straightforward.
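Squared-error classification amounts to a nearest-template lookup. A minimal sketch, assuming the templates are feature vectors taken from trained example gestures:

```python
import numpy as np

def classify(feature, templates):
    """Return the name of the template with the smallest squared error.

    templates: dict mapping gesture name -> reference feature vector
    (assumed here to come from training examples).
    """
    f = np.asarray(feature, dtype=float)
    errors = {name: float(np.sum((f - np.asarray(t, dtype=float)) ** 2))
              for name, t in templates.items()}
    return min(errors, key=errors.get)
```

For example, `classify([0.9, 0.1], {"up": [1, 0], "down": [0, 1]})` returns `"up"`.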

The music tempo program is where the paper gets more interesting, since the system has to predict in real time where a beat falls. Systems already existed that track a marker placed on a baton, but the visual processing in those systems typically incurred a delay of about 0.1 s (given 1997 computing power). In the authors' system, gestures for up, down, and diagonal swings indicate tempo; other gestures can map to other elements of conducting.

A score is stored in the computer and the user conducts to it. Often the computer and the human drift slightly apart, and the two balance toward each other. A simple function for balancing the tempo is given.
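One plausible form of such a balancing function is a weighted blend of the machine's current beat interval toward the interval the conductor is beating. This is a sketch under assumptions: the linear form and the weight `alpha` are mine, not the paper's exact equation.

```python
def balance_tempo(machine_interval, human_interval, alpha=0.5):
    """Blend the computer's beat interval toward the conductor's.

    alpha in (0, 1] controls how strongly the system follows the human.
    Both the linear blend and the default alpha are assumptions, not
    the paper's published equation.
    """
    return (1 - alpha) * machine_interval + alpha * human_interval
```

For instance, with the machine at 0.5 s per beat and the conductor at 0.6 s per beat, the next interval splits the difference at 0.55 s, so neither side jumps abruptly to the other's tempo.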


Discussion:

The system isn't a true conducting system, since it relies on predefined (and trained) gestures, but the ideas behind the tempo system are good, and the simple execution and equations are appreciated.
