Thanks to the support of the ERC, the Department of Computing and the Graduate School at Goldsmiths, I joined the rest of the EAVI team in Seoul, South Korea, to attend ACM SIGCHI 2015. Seven of us went to CHI this year to present two papers and one note, which presented EAVI's research methods, approaches and outcomes in sonic interaction design, machine learning for muscle sensing interfaces, and technology development for supporting musical creation by users with various forms of physical impairment (an overview here).
CHI 2015 was an impressive showcase of the state of the art in HCI research, covering a broad and expanding galaxy of practices and disciplines, reflected in the many diverse sessions running each day. The differences between the various approaches always found a way to manifest themselves, generating interesting points of debate, positive tension and critical interventions, often in very simple ways.
In the Speech & Auditory Interfaces paper session on 23rd April, Baptiste Caramiaux and I presented the paper Form Follows Sound: Designing Interaction From Sonic Memories, which we wrote with Atau Tanaka and Scott Pobiner (Parsons The New School for Design, New York). We presented a series of participatory Sonic Interaction Design workshops in which we explored how to generate scenarios for interaction with sound using embodied gestural interfaces, drawing upon participants' memories of everyday sounds and the situations connected with them.
We then presented the method used in the workshops, consisting of an ideation phase followed by realisation of working prototypes, and concluded the presentation discussing the results.
The following is a very short video we prepared for the ACM Digital Library. It shows the structure of the workshops, gives details about the Sonic Incident technique and the Gestural-Sound toolkit, and finally presents the three embodied sonic interaction models (Conducting, Manipulating, Substituting) we discuss in the paper.
In the short Q&A at the end of the session, members of the audience raised interesting questions on the applicability of the Sonic Incident technique to product sound design. Although there was no time during the presentation to show this, the paper describes an initial guideline that designers can follow step by step. The paper is available in the ACM Digital Library.
Here are some personal highlights of the conference:
- As Light as Your Footsteps: Altering Walking Sounds to Change Perceived Body Weight, Emotional State and Gait
- Affordance++: Allowing Objects to Communicate Dynamic Use
- Proprioceptive Interaction
- Understanding Gesture Expressivity through Muscle Sensing
- Empirical Evidence for a Diminished Sense of Agency in Speech Interfaces
- From User-Centered to Adoption-Centered Design: A Case Study of an HCI Research Innovation Becoming a Product
- Acoustruments: Passive, Acoustically-Driven, Interactive Controls for Handheld Devices
- Cruise Control for Pedestrians: Controlling Walking Direction using Electrical Muscle Stimulation
- LeviPath: Modular Acoustic Levitation for 3D Path Visualisations
Last but not least, congratulations to Goldsmiths BA Computing student Pedro Kirk for winning the CHI 2015 Student Research Competition! You can read his paper here.
Goodbye Seoul!