Archive for category Human-Robot Interaction
Not many posts in 2019, but that does not mean we have not conducted some really interesting research in our lab. On the contrary!
So, over the next few weeks I will begin posting some of our most recent accomplishments.
Here is just one:
Closing the feedback loop – the relationship between input and output modalities in HRI, presentation at the Human Friendly Robotics workshop in Rome 2019
Here is a new publication from our lab: a literature review of person-following in robotics from the perspective of the user, published in IEEE THMS.
Come meet us at Ro-Man 2017, where Dr. Vardit Sarne-Fleischmann and Shanee Honig will present our work on a gesture vocabulary for a person-following robot.
Abstract— Robots that are designed to support people in different tasks at home and in public areas need to be able to recognize users’ intentions and operate accordingly. To date, research has mostly concentrated on developing the technological capabilities of the robot and the mechanism of recognition. Still, little is known about the navigational commands that people could intuitively communicate in order to control a robot’s movement. A two-part exploratory study was conducted to evaluate how people naturally guide the motion of a robot and whether an existing gesture vocabulary used for human-human communication can be applied to human-robot interaction. Fourteen participants were first asked to demonstrate ten different navigational commands while interacting with a Pioneer robot using a Wizard-of-Oz (WoZ) technique. In the second part of the study, participants were asked to identify eight predefined commands from the U.S. Army vocabulary. Results show that simple commands yielded higher consistency among participants in the commands they demonstrated. Voice commands were also more frequent than gestures, though a combination of both was sometimes dominant for certain commands. In the second part, inconsistent identification rates for opposite commands were observed. The results of this study could serve as a baseline for a future command vocabulary, promoting a more natural and intuitive human-robot interaction style.