Thursday, September 22, 2011

Paper Reading #10: Sensing Foot Gestures from the Pocket

References:
"Sensing Foot Gestures from the Pocket" by Jeremy Scott, David Dearman, Koji Yatani, and Khai N. Truong.


Published in UIST '10: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology.
Author Bio:
Jeremy Scott studied Pharmacy and Toxicology at the University of Western Ontario.
David Dearman is currently pursuing a PhD in Computer Science at the University of Toronto.
Koji Yatani is a Computer Science PhD student at the University of Toronto focusing on human-computer interaction.
Khai Truong is an assistant professor in computer science at the University of Toronto.


Summary:
Hypothesis:
Foot gestures can be used as a form of input to a computer, and they can be sensed by a mobile device carried in the user's pocket.


Methods:
Cameras were used to capture the movements of the subjects' feet. The gestures tested included:
  • Dorsiflexion: four targets placed between 10° and 40° inclusive
  • Plantar flexion: six targets placed between 10° and 60° inclusive
  • Heel & toe rotation: 21 targets (each), with 9 internal rotation targets placed between -10° and -90° inclusive, and 12 external rotation targets placed between 10° and 120° inclusive
Participants used a mouse to respond to prompts and to mark the beginning and end of each foot motion.
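
To make the target layouts concrete, here is a minimal Python sketch of how evenly spaced targets in each range could be generated and how a measured foot angle might be snapped to the nearest one. The even spacing, function names, and values are my own assumptions for illustration, not the authors' code.

def evenly_spaced(start, stop, count):
    """Return `count` target angles spaced evenly from start to stop inclusive."""
    step = (stop - start) / (count - 1)
    return [start + i * step for i in range(count)]

# Hypothetical target layouts based on the ranges listed above.
TARGETS = {
    "dorsiflexion":    evenly_spaced(10, 40, 4),    # 4 targets, 10 to 40 degrees
    "plantar_flexion": evenly_spaced(10, 60, 6),    # 6 targets, 10 to 60 degrees
    # 21 rotation targets: 9 internal (-10 to -90) plus 12 external (10 to 120)
    "rotation": evenly_spaced(-10, -90, 9) + evenly_spaced(10, 120, 12),
}

def nearest_target(gesture, measured_angle):
    """Snap a measured angle (in degrees) to the closest target for that gesture."""
    return min(TARGETS[gesture], key=lambda t: abs(t - measured_angle))

# Example: a 22-degree dorsiflexion reading snaps to the 20-degree target.
print(nearest_target("dorsiflexion", 22.0))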

Results:
The side of the foot gave the greatest accuracy, followed by the front and finally the back. Targets located at the center of the foot's range of motion could be selected much more quickly than those at the edges, and users seemed to prefer motions involving the heel.

Contents:
The study used human subjects to explore ways to apply gesture recognition to the feet. The recognizer was then ported to a mobile application that users could carry with them on a mobile device. One of the bigger issues was that it was difficult to distinguish between motions that were meant as input and those made by accident.
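
To illustrate the accidental-motion problem, here is one naive way a phone in the pocket could gate foot input on accelerometer energy. This is purely a sketch under my own assumptions (the sampling rate, threshold, and function names are made up) and is not the classifier used in the paper.

import math

SAMPLE_RATE_HZ = 50        # assumed accelerometer sampling rate
WINDOW_SECONDS = 0.5
ENERGY_THRESHOLD = 1.5     # hand-tuned here; a real recognizer would learn this

def window_energy(samples):
    """Mean deviation of acceleration magnitude from gravity over a window."""
    mags = [abs(math.sqrt(x * x + y * y + z * z) - 9.81) for (x, y, z) in samples]
    return sum(mags) / len(mags)

def looks_intentional(samples):
    """Flag a window as a candidate gesture only if its energy clears the threshold."""
    return window_energy(samples) > ENERGY_THRESHOLD

# Example: a still foot (gravity only) is not flagged; a sharp burst of motion is.
still = [(0.0, 0.0, 9.81)] * int(SAMPLE_RATE_HZ * WINDOW_SECONDS)
burst = [(3.0, 0.0, 12.0)] * int(SAMPLE_RATE_HZ * WINDOW_SECONDS)
print(looks_intentional(still), looks_intentional(burst))   # False True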

Discussion:
This was a pretty neat article. We have read a lot about gesture recognition, but I personally would never have thought to apply it to feet. I think this could have neat applications for people who are handicapped, or who are without arms or hands and cannot perform hand gestures. Foot gesture recognition would be a great way for these users to still interact with the various gesture-driven devices. I would be interested to see how this work is applied in the future.
