Thursday, November 10, 2011

Reading #28: Experimental Analysis of Touch-Screen Gesture Designs in Mobile Environments

References:
Experimental Analysis of Touch-Screen Gesture Designs in Mobile Environments by Andrew Bragdon, Eugene Nelson, Yang Li and Ken Hinckley


Paper presented at CHI 2011.


Author Bios:
Andrew Bragdon is a PhD computer science student at Brown University.
Eugene Nelson is a professor at Brown University.
Yang Li is a senior researcher at Google Research.
Ken Hinckley is a principal researcher at Microsoft Research.



Summary:
Hypothesis:
That gesture-based techniques can be more effective than the current soft-button paradigm when the user is engaged in distracting activities such as walking.


Methods:
The first tests they did were with activating gesture mode on the mobile device. They tried a series of methods, such as a rocker button on the side of the phone and a large soft button placed on the screen. Users found both of these difficult to operate while looking away from the phone. The final solution was to mount a hard button at the top of the phone; its distinct feel and location made it easy for users to locate and press without looking down. This button was needed only for free-drawn gestures, since bezel gestures activate gesture mode when the user begins a stroke at the edge of the screen. They also tested different kinds of gestures: paths and marks. To evaluate these gestures, they had users interact with a device while doing various activities under different levels of distraction. They tested sitting and walking with moderate distraction levels, and sitting with attention-saturating tasks.


Results:
The surprising result in this paper was that users not only preferred gestures when looking away from the phone, but half the users also preferred them when using the phone normally. Specifically, when looking away from the phone, users preferred bezel gestures, citing among their reasons the elimination of the button, which they saw as "an extra step." Under distracting conditions, bezel gestures significantly outperformed soft buttons.


Contents:
The researchers began by testing the various gesture methods in a controlled environment. They came to the conclusion that users preferred bezel gestures as their gesture method. They next tested the users' ability to perform tasks while looking away from the phone, under distracting conditions, and while walking and sitting. Bezel gestures were widely preferred in this instance as well. In the attention-saturating test, users were asked to sit and perform a task with the device while being actively distracted from that task. Bezel gestures, once again, outperformed soft buttons.


Discussion:
This paper was really interesting. Mobile devices are designed to be used on the move, so it makes sense to research methods for doing this more effectively. Being able to complete tasks on a mobile device without devoting your full attention to it could greatly increase a user's productivity.
