Sunday, October 23, 2011

Reading #23: User-defined Motion Gestures for Mobile Interaction

References
User-defined Motion Gestures for Mobile Interaction by Jaime Ruiz, Yang Li, and Edward Lank.

Published in CHI '11: Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems.


Author Bios:
Jaime Ruiz is a doctoral student at the University of Waterloo.
Yang Li is a research scientist at Google.
Edward Lank is an assistant professor at the University of Waterloo.


Summary:
Hypothesis:
That certain sets of motion gestures will be more natural and usable for a mobile user than others, and that drawing on user input is a good way to find the better ones.


Methods:
The subjects were first asked to design their own motion gestures for a set of common smartphone tasks. The researchers then analyzed the proposed gestures and selected several of them. The selected gestures were given back to the subjects, who were asked to perform a set of tasks with them while the results were recorded.


Results:
Many of the subjects designed the same, or at least very similar, gestures for the same task. These gestures usually mimicked some real-life motion associated with the task, and opposite tasks tended to be performed with gestures in opposite directions.


Contents:
The paper first has the subjects design gestures to complete tasks. The researchers then select several of these gestures, return them to the subjects, and ask them to perform tasks with them while recording the results.


Discussion:
This paper is basically why I am taking this class. Mobile devices are the current big thing, and work like this will greatly increase our productivity with them. Being able to create more user-friendly and intuitive gestures will help uncover many new uses for mobile devices and streamline the uses we already have for them.
