Gestural interaction has become commonplace in consumer electronics. Finger gestures captured on touch screens provide intuitive ways to carry out complex tasks, and some gestures, such as pinch-to-zoom, have become iconic. Most techniques for encoding gesture are based on forms of activity detection that recognise a gesture as a unitary whole. Meanwhile, users' casual perception of the potential of gestural input goes beyond simple recognition: people imagine gestural interaction to be intuitive and continuous, where aspects of a gesture organically map onto the response of an interactive system. In what ways might we capture gesture quality to enable such forms of continuous interaction?
To investigate this, we first studied how well people can control variations of a stroke gesture, such as its size, speed, and orientation. We defined a vocabulary of 2D shape gestures and asked participants to perform them with and without variations in size, speed, and orientation; the study thus relies on a purely motor task. We found that participants can control these characteristics, and control them fairly independently, provided the gesture is performed slowly, i.e., as a non-ballistic movement.
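To make these characteristics concrete, here is a minimal sketch of how size, average speed, and orientation might be computed from a recorded stroke given as timestamped 2D points. This is purely illustrative; it is not the exact feature extraction used in the study.

```python
import numpy as np

def stroke_features(points, timestamps):
    """Illustrative computation of size, speed, and orientation
    for a 2D stroke (not the study's actual feature definitions)."""
    points = np.asarray(points, dtype=float)        # shape (n, 2)
    timestamps = np.asarray(timestamps, dtype=float)

    # Size: diagonal of the stroke's bounding box.
    extent = points.max(axis=0) - points.min(axis=0)
    size = float(np.hypot(*extent))

    # Speed: total path length divided by total duration.
    segment_lengths = np.linalg.norm(np.diff(points, axis=0), axis=1)
    duration = timestamps[-1] - timestamps[0]
    speed = float(segment_lengths.sum() / duration)

    # Orientation: angle of the stroke's first principal axis.
    centred = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    orientation = float(np.arctan2(vt[0, 1], vt[0, 0]))

    return size, speed, orientation
```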
If people can deliberately control some gesture variations, these variations could serve as a vector for expressive interaction. The next step is to track these variations as the gesture unfolds. We developed a machine-learning-based algorithm, the Gesture Variation Follower (GVF), that estimates gesture variation while performing real-time gesture recognition.
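One way to track such variations incrementally is with a particle filter that aligns the incoming gesture to a recorded template while estimating its time-progression, speed, and scale. The sketch below is a much-simplified illustration of this idea, not the published GVF implementation; it assumes the template and incoming points are normalised to a common origin.

```python
import numpy as np

class VariationFollower:
    """Toy particle filter that aligns an incoming 2D gesture to a
    template while estimating time-progression, speed, and scale.
    A simplified illustration, not the published GVF code."""

    def __init__(self, template, n_particles=500, obs_noise=0.05):
        self.template = np.asarray(template, dtype=float)  # (m, 2)
        self.n = n_particles
        self.obs_noise = obs_noise
        # State per particle: [phase in 0..1, relative speed, scale].
        self.particles = np.column_stack([
            np.zeros(self.n),                    # phase starts at 0
            np.random.normal(1.0, 0.1, self.n),  # speed around nominal
            np.random.normal(1.0, 0.1, self.n),  # scale around nominal
        ])
        self.weights = np.full(self.n, 1.0 / self.n)

    def update(self, point, dt):
        p = self.particles
        # Propagate: advance phase by speed, diffuse speed and scale.
        p[:, 0] = np.clip(p[:, 0] + p[:, 1] * dt
                          + np.random.normal(0, 0.01, self.n), 0, 1)
        p[:, 1] += np.random.normal(0, 0.05, self.n)
        p[:, 2] += np.random.normal(0, 0.05, self.n)

        # Weight: compare the observed point with the template point
        # at each particle's phase, rescaled by the particle's scale.
        idx = (p[:, 0] * (len(self.template) - 1)).astype(int)
        predicted = self.template[idx] * p[:, 2, None]
        dist2 = np.sum((predicted - np.asarray(point)) ** 2, axis=1)
        self.weights *= np.exp(-dist2 / (2 * self.obs_noise ** 2))
        self.weights += 1e-12
        self.weights /= self.weights.sum()

        # Resample when the effective particle count drops too low.
        if 1.0 / np.sum(self.weights ** 2) < self.n / 2:
            choice = np.random.choice(self.n, self.n, p=self.weights)
            self.particles = p[choice].copy()
            self.weights = np.full(self.n, 1.0 / self.n)

        # Estimates: weighted means of phase, speed, and scale.
        return self.weights @ self.particles
```

In such a scheme, recognition could be handled by running one follower per vocabulary gesture and selecting the template whose particles best explain the input; the published algorithm handles recognition and variation estimation jointly.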
Using this technology, we finally studied how people rate the attractiveness of a real-world application built on gesture-based continuous interaction. The application emulates a Photo Booth-like app: our algorithm recognises the gesture and estimates its size and the time-progression within it, and these two characteristics drive image FX processing parameters. In other words, the gesture selects the effect, and its variation modulates the effect.
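As an illustration of that mapping, gesture identity could select an effect while the tracked size and time-progression drive its parameters. The effect names and ranges below are invented for the example; they are not those of the actual application.

```python
# Hypothetical mapping from (gesture, variation) to image-effect
# parameters; effect names and ranges are invented for illustration.
EFFECTS = {
    "circle": "blur",
    "zigzag": "pixelate",
    "wave":   "hue_shift",
}

def effect_parameters(gesture_label, size, progression):
    """Map the recognised gesture and its tracked variations to an
    effect and an intensity. `size` is the estimated scale relative
    to the template; `progression` is the phase within the gesture,
    both as produced by a follower like the one sketched above."""
    effect = EFFECTS[gesture_label]
    # Larger gestures -> stronger effect; progression sweeps it in.
    intensity = min(1.0, size * progression)
    return effect, intensity
```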
Publication
We presented the paper at CHI 2013 in Paris:
- B. Caramiaux, F. Bevilacqua, and A. Tanaka. Beyond Recognition: Using Gesture Variation for Continuous Interaction. In Extended Abstracts on Human Factors in Computing Systems (CHI EA, alt.chi), pp. 2109–2118, 2013.