
Multimodal communication for guiding a person following robot

Come meet us at Ro-Man 2017, where Dr. Vardit Sarne-Fleischmann and Shanee Honig will present our work on a gesture vocabulary for a person-following robot.

Abstract— Robots designed to support people in different tasks at home and in public areas need to be able to recognize users’ intentions and operate accordingly. To date, research has concentrated mostly on developing the technological capabilities of the robot and the recognition mechanism. Still, little is known about the navigational commands people would intuitively use to control a robot’s movement. A two-part exploratory study was conducted to evaluate how people naturally guide the motion of a robot and whether an existing gesture vocabulary used for human-human communication can be applied to human-robot interaction. Fourteen participants were first asked to demonstrate ten different navigational commands while interacting with a Pioneer robot using a Wizard-of-Oz (WoZ) technique. In the second part of the study, participants were asked to identify eight predefined commands from the U.S. Army vocabulary. Results show that simple commands yielded higher consistency among participants in the commands they demonstrated. Voice commands were also used more frequently than gestures, though a combination of both was sometimes dominant for certain commands. In the second part, identification rates were inconsistent for opposite commands. The results of this study could serve as a baseline for a future command vocabulary, promoting a more natural and intuitive human-robot interaction style.

Link to our poster.



Utilizing Hand Gesture Interaction in Standard PC-based Interfaces

  • This work was conducted by my former graduate student Jenny Grinberg. It focused on how a gesture vocabulary should be applied when gestures are used in standard window interfaces (windows, files, and folders). We are currently in the process of writing up the publication.
  • Interface technologies have only started to adopt hand gestures, and most human-computer controls still require physical devices such as a keyboard or mouse.
  • To evaluate the influence of keyboard interaction, gestures, and combined interaction on user experience, an existing hand gesture recognition system (developed by Stern & Efros, 2005) was integrated into a common Windows environment.
  • Two experiments varied in the way the Gesture Vocabulary (GV) was introduced: all at once (Experiment 1) or gradually (Experiment 2).
  • Results indicated that all gestures used in the GV were simple and could be executed within a relatively short learning period.
  • Nevertheless, keyboard interaction remained the most efficient, least demanding, and most preferred interaction method.
  • Performance and subjective ratings of gestures and combined interaction were significantly different from those of the keyboard, but not from each other.

Interesting differences among genders emerged:

  • Combined interaction was preferred over gestures alone among women.
  • With regard to the GV introduction, Experiment 1 revealed that performance time and error rate with gestures were significantly higher for females than for males. However, gradual introduction of gestures (Experiment 2) improved females’ subjective satisfaction, decreased their performance time, and did not worsen their error rate. For males, no such differences were found.
  • Men and women related differently to the gesture displays and women perceived textual labels as more useful.

Here is a screenshot of the application. It consists of a standard window that enables users to perform the most commonly used commands with folders and files (e.g., open a folder, move the cursor to the right folder, etc.) via hand gestures or via the keyboard. To the right is the gesture feedback window (which is part of the gesture recognition system developed by Stern & Efros, 2005).


  • To the right, the visual display as captured by the gesture recognition camera
  • To the left, the main task window containing files in folders
  • At the bottom of the screen, various parameters regarding the hand’s position and a label with the name of the current command

Gesture Vocabulary (GV) design.

Nine dynamic gestures were defined, with one of them serving as the start/end position. The other eight represented the most commonly used commands in file management navigation: moving right, left, up, and down; entering and exiting a folder; and copy/paste.
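The mapping above can be sketched as a simple lookup table. This is an illustrative sketch only: the gesture labels and the `interpret` helper are hypothetical and not taken from the Stern & Efros (2005) recognizer; the point is just the one-gesture-per-command structure of the vocabulary.

```python
from enum import Enum


class Command(Enum):
    """The start/end gesture plus the eight file-management commands."""
    START_STOP = "start/stop"
    MOVE_RIGHT = "move right"
    MOVE_LEFT = "move left"
    MOVE_UP = "move up"
    MOVE_DOWN = "move down"
    ENTER_FOLDER = "enter folder"
    EXIT_FOLDER = "exit folder"
    COPY = "copy"
    PASTE = "paste"


# Nine dynamic gestures: one start/end gesture and eight command gestures.
# Gesture names here are made up for illustration.
GESTURE_VOCABULARY = {
    "wave": Command.START_STOP,
    "swipe_right": Command.MOVE_RIGHT,
    "swipe_left": Command.MOVE_LEFT,
    "swipe_up": Command.MOVE_UP,
    "swipe_down": Command.MOVE_DOWN,
    "push_forward": Command.ENTER_FOLDER,
    "pull_back": Command.EXIT_FOLDER,
    "grab": Command.COPY,
    "release": Command.PASTE,
}


def interpret(gesture):
    """Map a recognized gesture label to a command, or None if unrecognized."""
    return GESTURE_VOCABULARY.get(gesture)
```

A table like this keeps the vocabulary easy to extend or swap out, which matters when comparing bulk versus gradual introduction of the gesture set.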


Here is a video demo of the various gestures used.

Gesture Vocabulary demo


Initial findings were reported in Grinberg J. and Oron-Gilad T., Utilizing Hand-Gesture Interaction in Standard PC Based Interfaces, Proceedings of the International Ergonomics Association (IEA) 2009, Beijing, China.