Archive for category robotics

Understanding and Resolving Failures in Human-Robot Interaction

Shanee Honig and I have just finished a literature review on resolving failures in HRI. The full publication can be found in Frontiers.

We mapped a taxonomy of failures, separating technical failures from interaction failures (see the taxonomy figure below).

A human-robot failure taxonomy

After reviewing the cognitive considerations that influence people’s ability to detect and solve robot failures, as well as the literature on failure handling in human-robot interaction, we developed an information-processing model called the Robot Failure Human Information Processing (RF-HIP) model. It is modeled after Wogalter’s C-HIP (itself an elaboration of Shannon and Weaver’s 1948 model of communication) and describes the way people perceive, process, and act on failures in human-robot interactions.

  • RF-HIP can be used as a tool to systematize the assessment of why a particular failure-handling approach succeeds or fails, in order to facilitate better design.

 

The RF-HIP (Robot Failure – Human Information Processing) model

Abstract

While substantial effort has been invested in making robots more reliable, experience demonstrates that robots operating in unstructured environments are often challenged by frequent failures. Despite this, robots have not yet reached a level of design that allows effective management of faulty or unexpected behavior by untrained users. To understand why this may be the case, an in-depth literature review was done to explore how people perceive and resolve robot failures, how robots communicate failure, how failures influence people’s perceptions and feelings towards robots, and how these effects can be mitigated. Fifty-two studies were identified relating to communicating failures and their causes, the influence of failures on human-robot interaction, and mitigating failures. Since little research has been done on these topics within the Human-Robot Interaction (HRI) community, insights from the fields of human-computer interaction (HCI), human factors engineering, cognitive engineering, and experimental psychology are presented and discussed. Based on the literature, we developed a model of information processing for robotic failures, the Robot Failure Human Information Processing (RF-HIP) model, that guides the discussion of our findings. The model describes the way people perceive, process, and act on failures in human-robot interaction. The model includes three main parts: (1) communicating failures, (2) perception and comprehension of failures, and (3) solving failures. Each part contains several stages, all influenced by contextual considerations and mitigation strategies. Several gaps in the literature have become evident as a result of this evaluation. More focus has been given to technical failures than interaction failures. Few studies focused on human errors, on communicating failures, or on the cognitive, psychological, and social determinants that impact the design of mitigation strategies.
By providing the stages of human information processing, RF-HIP can be used as a tool to promote the development of user-centered failure-handling strategies for human-robot interactions.
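The three RF-HIP parts can be pictured as a simple pipeline that a failure event either completes or stalls in. Here is a minimal, illustrative Python sketch (the stage names and boolean inputs are our own shorthand for this post, not terms from the paper):

```python
from enum import Enum, auto

# Illustrative stages loosely following the three RF-HIP parts:
# communicating failures -> perceiving/comprehending failures -> solving failures.
class Stage(Enum):
    COMMUNICATION = auto()   # the failure cue is (or is not) communicated saliently
    ATTENTION = auto()       # the user notices the failure cue
    COMPREHENSION = auto()   # the user understands what went wrong
    SOLUTION = auto()        # the user selects and applies a fix

def process_failure(cue_salient: bool, cause_clear: bool, fix_known: bool) -> Stage:
    """Return the furthest stage a user reaches for a given failure event."""
    if not cue_salient:
        return Stage.COMMUNICATION  # failure never perceived by the user
    if not cause_clear:
        return Stage.ATTENTION      # noticed, but not understood
    if not fix_known:
        return Stage.COMPREHENSION  # understood, but left unresolved
    return Stage.SOLUTION           # failure resolved

print(process_failure(True, True, False).name)  # COMPREHENSION
```

The point of the sketch is that a failure-handling design can break down at any stage: a mitigation strategy that improves comprehension does not help if the failure cue is never noticed in the first place.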

 


Towards Socially Aware Person-Following Robots

Here is a new publication from our lab: a literature review focused on person-following in robotics from the perspective of the user. Published in IEEE THMS.

 

Abstract:

Significant R&D has been invested in technical issues related to person following. However, a systematic approach for designing robotic person-following behavior that maintains appropriate social conventions across contexts has not yet been developed. To understand why this may be the case, an in-depth literature review of 221 articles on person-following robots was performed, from which 107 are referenced. From these papers, six relevant topics were identified that shed light on the types of social interactions that have been studied in person-following scenarios: a) applications; b) robotic systems; c) environments; d) following strategies; e) human-robot communication; and f) evaluation methods. Gaps in the existing research on person-following robots were identified, mainly in addressing social interaction and user needs, noting that only 25 articles reported proper user studies. Human-related, robot-related, task-related, and environment-related factors that are likely to influence people’s spatial preferences and expectations of a robot’s person-following behavior are then discussed. To guide the design of socially aware person-following robots, a user-needs layered design framework that combines the four factor categories is proposed. The framework provides a systematic way to incorporate social considerations in the design of person-following robots. Finally, framework limitations and future challenges in the field are presented and discussed.

Leave a comment

Multimodal communication for guiding a person following robot

Come meet us at RO-MAN 2017, where Dr. Vardit Sarne-Fleischmann and Shanee Honig will present our work on a gesture vocabulary for a person-following robot.

Abstract— Robots that are designed to support people in different tasks at home and in public areas need to be able to recognize users’ intentions and operate accordingly. To date, research has mostly concentrated on developing the technological capabilities of the robot and the mechanism of recognition. Still, little is known about the navigational commands that people would intuitively use to control a robot’s movement. A two-part exploratory study was conducted to evaluate how people naturally guide the motion of a robot and whether an existing gesture vocabulary used for human-human communication can be applied to human-robot interaction. Fourteen participants were first asked to demonstrate ten different navigational commands while interacting with a Pioneer robot using a WoZ technique. In the second part of the study, participants were asked to identify eight predefined commands from the U.S. Army vocabulary. Results show that simpler commands yielded higher consistency among participants in the commands they demonstrated. Voice commands were also more frequent than gestures, though a combination of both was dominant for certain commands. In the second part, inconsistent identification rates for opposite commands were observed. The results of this study could serve as a baseline for a future command vocabulary promoting a more natural and intuitive human-robot interaction style.

Link to our poster.

 


IEEE RO-MAN 2016 presentations

Two of our works have been accepted as full papers for presentation and publication in the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2016).

“Postures of a Robot Arm – Window to Robot Intentions?”, authored by my doctoral student Sridatta Chaterjee and co-authored by my colleagues Drs. Oren Shriki and Idit Shalev.

Abstract— The body language of robot arms has rarely been explored as a medium for conveying robot intentions. An exploratory study was done focusing on two questions: one, whether robot arm postures can convey robot intentions, and two, whether participants coming in contact with this robot arm for the first time can associate any meaning with the postures without watching the robot in action or working with it. Thirty-five participants of a wide age range (25-70) took part in this exploratory study. Results show that participants could interpret some postures. Four distinct types of postures were assigned to four separate categories by the majority of participants, irrespective of their age. In addition, postures selected for categories such as ‘Robot giving object in a friendly manner’, ‘Robot is saying Hi!’, and ‘Robot has been told not to disturb’ show similarity to the body language exhibited by humans and animals while communicating such messages.


Posture 8, what is the robot doing?

“The Influence of Following Angle on Performance Metrics of a Human-Following Robot”, co-authored by our graduate students Shanee Honig and Dror Katz, and my colleague Prof. Yael Edan.

Abstract— Robots that operate alongside people need to be able to move in socially acceptable ways. As a step toward this goal, we study how and under which circumstances the angle at which a robot follows a person may affect the human experience and robot tracking performance. In this paper, we aimed to assess three following angles (0°, 30°, and 60°) under two conditions: when the robot was carrying a valuable personal item and when it was not. Objective and subjective indicators of the quality of following and participants’ perceptions and preferences were collected. Results indicated that the personal item manipulation increased awareness of the quality of the following and the following angles. Without the manipulation, participants were indifferent to the behavior of the robot. Our following algorithm was successful for tracking at 0° and 30° angles, yet it must be improved for wider angles. Further research is required to obtain a better understanding of following-angle preferences for varying environment and task conditions.


Following angles of a person-following robot: straight from behind or wider angles?

See you in NY – looking forward to two great presentations!



Following Angle of a Human-Following Robot

Human-following capabilities of robots may become important in assistive robotic applications to facilitate many daily tasks (e.g., carrying personal items or groceries). A robot’s following distance, following angle, and acceleration influence the quality of the interaction between the human and the robot by impacting walking efficiency (e.g., pace, flow, and unwanted stops), user comfort, and robot likability.

Our team gave a presentation at the ICR 2016 conference focusing on subjective preferences regarding human-following robots: preliminary evidence from laboratory experiments.

The Influence of Following Angle on Performance Metrics

Following Angles of a human-following Pioneer LX Robot (Honig, Katz, Edan & Oron-Gilad)

  • This research effort is led by our graduate student Shanee Honig.
  • For the person-tracking and following algorithm (Dror Katz & Yael Edan, work in progress) we use the Pioneer LX Robot’s built-in camera and a Microsoft Kinect.
  • Currently we focus on three following angles: following from directly behind (0°), a 30° angle, and a 60° angle.
  • We use a personal item manipulation (e.g., a wallet) to examine how participants engage with the robot. Naturally, when participants place a personal item on the robot, they become more engaged with it.
  • Come see us at HCII 2016, where we will present a poster on the sensitivity of older users (68 and above) to the quality of the interaction, depending on the robot’s following distance and acceleration and the context of the walk – Follow Me: Proxemics and Responsiveness Preferences of Older Users in a Human-Following Robot.
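To make the following-angle idea concrete, here is a minimal geometric sketch in Python. This is our own illustration, not the lab’s tracking algorithm, and the 1.2 m default following distance is an assumed value: it computes the robot’s target point at a given angular offset from directly behind the walking person.

```python
import math

def follow_target(px: float, py: float, heading: float,
                  angle_deg: float, dist: float = 1.2) -> tuple[float, float]:
    """Target point for the follower robot: `dist` metres from the person,
    offset `angle_deg` from directly behind (0 = back-following).
    `heading` is the person's walking direction in radians.
    The 1.2 m default distance is an assumption for illustration only."""
    offset = math.radians(angle_deg)
    # Direction from the person back toward the robot: opposite the heading,
    # rotated by the desired following angle.
    back = heading + math.pi + offset
    return (px + dist * math.cos(back), py + dist * math.sin(back))

# Person at the origin walking along +x; back-following puts the robot behind.
x, y = follow_target(0.0, 0.0, 0.0, 0)
print(round(x, 2), round(y, 2))  # -1.2 0.0
```

A wider angle (30° or 60°) moves the target point sideways out of the person’s wake, which is exactly what makes tracking harder: the person can drift toward the edge of the sensor’s field of view.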

 


What do we think we are doing: principles of coupled self-regulation in human-robot interaction (…

The use of domestic service robots is becoming widespread. While in industrial settings robots are often used for specified tasks, the challenge in the case of robots put to domestic use is to affo…

via What do we think we are doing: principles of coupled self-regulation in human-robot interaction (….


BGU is seeking PhD and postdoctoral students for advanced research in multidisciplinary robotics

ABC Robotics Center (Agricultural, Biological and Cognitive Robotics) at BGU is seeking outstanding students for advanced research in multidisciplinary robotics.

All applicants must be skilled in both oral and written communication in English and be able to work independently as well as in collaboration with others.

PhD applicants must have completed an MSc degree in Engineering, Natural Sciences, Computer Science or Psychology with a thesis. Experience in artificial intelligence, robotics, cognitive science and programming is an advantage. The application should include a CV, a list of academic grades, a copy of the degree project report, a list of publications, three personal references (one from the MSc thesis advisor) and one A4 page describing the personal motivation for applying for this position. PhD candidates must submit a research proposal and pass a qualification exam on their research proposal within the first year of the PhD studies. The PhD thesis should be completed within a 4-year timeframe. The ABC Robotics PhD Scholarship covers tuition fees and a monthly stipend; the candidate will receive a minimum of 6,930 NIS per month for a duration of 4 years.

The ABC Robotics Postdoc Scholarship is 10,116 NIS per month for a duration of 2 years.

Additional requirements and details may be found at: http://in.bgu.ac.il/en/kreitman_school/Pages/admission.aspx

Applicants should send all necessary registration information to Ms. Sima Koram, email: simagel@exchange.bgu.ac.il as indicated in

http://aristo4bgu.bgu.ac.il/PhdEnglishApplication/PhdApplicationForm/

and send a copy of their application to: abc-robotics@bgu.ac.il

Specific research topics are proposed at: www.bgu.ac.il/abc-robotics

Closing date for applications: 30 May 2014, or until all positions are filled. Candidates applying by the above closing date will be informed by July 2014.

Starting date: 1 October 2014 or earlier
