Posts Tagged follow me
Eldercare will change
Posted by Tal Oron-Gilad in HRI, Human-Robot Interaction, News, Older adults, robotics on April 12, 2020
Here is a link to a short video summary of our work for the SOCRATES EU project. The overarching focus of this project is robotics in eldercare. The use-cases have become extremely relevant with the coronavirus outbreak. We had often assumed that the lack of sufficient professional personnel would be the main reason for implementing and distributing social robots for the older population. Now we also see that robots are necessary for maintaining the safety of older adults and avoiding the spread of the virus among those who are most vulnerable.
In SOCRATES we (Samuel Olatunji, our doctoral student; Yael Edan, my colleague; and myself) look at the necessary balance between the robot’s level of autonomy (LOA) and the amount and pace of information it should provide (LOT – level of transparency), so that people get just the right amount of feedback from the robot (too much may distract them; too little may cause confusion, distrust, and abandonment of this technology).
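To make the LOA/LOT trade-off concrete, here is a minimal, hypothetical Python sketch of a feedback policy that adjusts how much the robot says based on its autonomy and transparency settings. The enums, levels, and messages are illustrative assumptions for this post only, not the SOCRATES project’s actual implementation.

```python
# Hypothetical sketch only: names, levels, and messages are illustrative
# assumptions, not the SOCRATES project's actual feedback policy.
from enum import Enum
from typing import Optional


class LOA(Enum):      # level of autonomy
    LOW = 1           # user drives most decisions
    HIGH = 2          # robot acts mostly on its own


class LOT(Enum):      # level of transparency (amount/pace of feedback)
    MINIMAL = 1       # report only errors
    MODERATE = 2      # short updates at key events
    DETAILED = 3      # continuous explanation of intent


def feedback(loa: LOA, lot: LOT, event: str) -> Optional[str]:
    """Return a feedback message for an event, or None to stay silent."""
    if lot is LOT.MINIMAL and event != "error":
        return None  # too much feedback may distract the user
    if loa is LOA.HIGH and lot is LOT.DETAILED:
        return f"I am handling '{event}' myself and will tell you when I am done."
    return f"Update: {event}. Let me know if you want to change anything."


print(feedback(LOA.HIGH, LOT.MODERATE, "approaching the kitchen"))
```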
Our participants are active older adults who were willing to come to the lab and help us develop our algorithms and applications. We wish them all well and hope they stay healthy. We hope to see them in the lab again when the time comes and it is possible.
The robot that you see in the film is not teleoperated; it moves autonomously, following the user’s path and pace. This is the YouTube link: https://youtu.be/3ruDAcTzPIg
To read more about this work and about Samuel
IEEE RO-MAN 2016 presentations
Posted by Tal Oron-Gilad in HRI, News, robotics on May 28, 2016
Two of our works have been accepted as full papers for presentation and publication in the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2016).
“Postures of a Robot Arm – window to robot intentions?” authored by my doctoral student Sridatta Chaterjee and co-authored by my colleagues Drs. Oren Shriki and Idit Shalev.
Abstract— The body language of robot arms has rarely been explored as a medium for conveying robot intentions. An exploratory study was conducted focusing on two questions: first, whether robot arm postures can convey robot intentions, and second, whether participants coming into contact with this robot arm for the first time can associate any meaning with the postures without watching the robot in action or working with it. Thirty-five participants of a wide age range (25-70) took part in this exploratory study. Results show that participants could interpret some postures. Four distinct postures were assigned to four separate categories by the majority of participants, irrespective of their age. In addition, postures selected for categories such as ‘Robot giving object in a friendly manner’, ‘Robot is saying Hi!’, and ‘Robot has been told not to disturb’ show similarity to body language exhibited by humans and animals when communicating such messages.

Posture 8, what is the robot doing?
“The Influence of Following Angle on Performance Metrics of a Human-Following Robot” co-authored by our graduate students Shanee Honig and Dror Katz, and my colleague Prof. Yael Edan.
Abstract— Robots that operate alongside people need to be able to move in socially acceptable ways. As a step toward this goal, we study how and under which circumstances the angle at which a robot follows a person may affect the human experience and robot tracking performance. In this paper, we aimed to assess three following angles (0°, 30°, and 60°) under two conditions: when the robot was carrying a valuable personal item and when it was not. Objective and subjective indicators of the quality of following and participants’ perceptions and preferences were collected. Results indicated that the personal item manipulation increased awareness of the quality of following and of the following angles. Without the manipulation, participants were indifferent to the behavior of the robot. Our following algorithm was successful for tracking at the 0° and 30° angles, yet it must be improved for wider angles. Further research is required to obtain a better understanding of following-angle preferences under varying environment and task conditions.

Following angles of a person-following robot: straight from behind or wider angles?
See you in NY; looking forward to two great presentations!
Following Angle of a Human-Following Robot
Posted by Tal Oron-Gilad in HRI, News, robotics on April 26, 2016
Human-following capabilities may become important in assistive robotic applications to facilitate many daily tasks (e.g., carrying personal items or groceries). The robot’s following distance, following angle, and acceleration influence the quality of the interaction between the human and the robot by affecting walking efficiency (e.g., pace, flow, and unwanted stops), user comfort, and robot likability.
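As a rough illustration of what “walking efficiency” can mean in practice, the sketch below computes average pace and the number of stop episodes from a timestamped trajectory of the user. This is not the study’s analysis code; the data format and the thresholds are assumptions made for the example.

```python
# Illustrative sketch only: data format and thresholds are assumptions,
# not the study's actual analysis pipeline.
import math


def walking_metrics(trajectory, stop_speed=0.1, min_stop_duration=1.0):
    """trajectory: list of (t, x, y) samples in seconds and meters.
    Returns (average pace in m/s, number of stop episodes)."""
    total_dist = 0.0
    stops = 0
    stop_start = None
    for (t0, x0, y0), (t1, x1, y1) in zip(trajectory, trajectory[1:]):
        dt = t1 - t0
        d = math.hypot(x1 - x0, y1 - y0)
        total_dist += d
        speed = d / dt if dt > 0 else 0.0
        if speed < stop_speed:                 # user is (nearly) standing still
            if stop_start is None:
                stop_start = t0
        else:
            if stop_start is not None and t0 - stop_start >= min_stop_duration:
                stops += 1                     # count one completed stop episode
            stop_start = None
    duration = trajectory[-1][0] - trajectory[0][0]
    pace = total_dist / duration if duration > 0 else 0.0
    return pace, stops


# Example: a 1 m/s walk with a pause in the middle (synthetic data).
traj = [(0, 0, 0), (1, 1, 0), (2, 2, 0), (4, 2, 0), (5, 3, 0), (6, 4, 0)]
print(walking_metrics(traj))
```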
Our team gave a presentation at the ICR 2016 conference focusing on “Subjective preferences regarding human-following robots: preliminary evidence from laboratory experiments.”

Following Angles of a human-following Pioneer LX Robot (Honig, Katz, Edan & Oron-Gilad)
- This research effort is led by our graduate student Shanee Honig.
- For the person-tracking and following algorithm (Dror Katz & Yael Edan, work in progress), we use the Pioneer LX Robot’s built-in camera and a Microsoft Kinect.
- Currently we focus on three following angles: back following (a 0° angle), a 30° angle, and a 60° angle; a minimal geometric sketch of the following-angle idea appears after this list.
- We use a personal item manipulation (e.g., a wallet) to examine how participants engage with the robot. Naturally, when participants place a personal item on the robot, they become more engaged with it.
- Come see us at HCII 2016, where we will present a poster on the sensitivity of older users (68 and above) to the quality of the interaction, depending on the robot’s following distance and acceleration and the context of the walk: “Follow Me: Proxemics and Responsiveness Preferences of Older Users in a Human-Following Robot.”
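For readers curious how a following angle translates into a motion goal, here is a minimal geometric sketch in Python: given the person’s planar position and heading from a tracker, it places the robot’s goal at a fixed distance behind the person, offset sideways by the chosen angle. The parameters are assumed for illustration; this is a sketch of the concept, not the Pioneer LX controller used in our experiments.

```python
# Conceptual sketch only: assumes a tracker provides the person's planar
# position and heading; not the Pioneer LX controller used in the study.
import math


def following_goal(px, py, heading, distance=1.5, follow_angle_deg=30.0):
    """Return the (x, y) goal for a robot following at `distance` meters
    behind the person, offset by `follow_angle_deg` from straight behind.
    heading: person's walking direction in radians (world frame)."""
    offset = heading + math.pi + math.radians(follow_angle_deg)  # behind + side offset
    gx = px + distance * math.cos(offset)
    gy = py + distance * math.sin(offset)
    return gx, gy


# Example: person at the origin walking along +x; 0° keeps the robot
# directly behind, while 30° and 60° shift it to one side.
for angle in (0.0, 30.0, 60.0):
    print(angle, following_goal(0.0, 0.0, 0.0, 1.5, angle))
```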