Posts Tagged HRI
BGU is seeking excellent candidates for senior or junior faculty positions in the Dept. of Industrial Engineering and Management. Candidates will join the Human Factors Engineering team.
Relevant topics include HCI, HRI, usability, HFE, and affiliated fields.
For more information please contact: Prof. Tal Oron-Gilad at firstname.lastname@example.org
Ben-Gurion University of the Negev
Department of Brain and Cognitive Sciences
Department of Industrial Engineering and Management
Department of Computer Science
ABC (Agricultural, Biological and Cognitive) Robotics Center
Doctoral/Post-doctoral Position in Human-Robot Interaction
Promoting intent- and context-based interaction and collaboration between humans and robots will be increasingly important in the near future, as the ‘things’ around us gain more intelligence.
We are looking for a highly motivated PhD student or post-doctoral fellow to lead research aimed at promoting intent- and context-based interaction and collaboration between humans and robots. Such interaction requires the development of novel intent-based interfaces (e.g., brain-computer interaction) and the generation of a mental model (e.g., via augmented reality) shared by both human and robot. It also requires investigating how collaboration is built over time and how context may affect it. The current effort takes a multidisciplinary perspective on human-robot relations and focuses on integrating multiple paradigms.
We are looking for candidates with a strong computational background interested in setting up and leading new and exciting research directions.
Closing date for applications: 30 May 2014 or until all positions are filled.
Candidates applying by the above closing date will be informed by July 2014.
Starting date: 1 October 2014 or earlier
For more information, please contact:
Prof. Tal Oron-Gilad, Human Factors Engineering – email@example.com
Dr. Oren Shriki, Cognitive and Brain Sciences – firstname.lastname@example.org
Dr. Idit Shalev, Cognitive and Brain Sciences – email@example.com
Dr. Jihad El-Sana, Computer Science and augmented reality – firstname.lastname@example.org
For those of you who are interested in the role of Human-Robot Interaction (HRI) in future military operations, Mike Barnes and Florian Jentsch have recently edited a handbook titled “Human-Robot Interactions in Future Military Operations“. The book is a collection of chapters written by well-recognized researchers in the area. It covers a wide range of topics, from operators interacting with small ground robots and aerial vehicles to supervising large, near-autonomous vehicles capable of intelligent battlefield behaviors.
I was honored to contribute a chapter to this book. Together with my colleague and former student Yaniv Minkov, I discuss the issue of “Remotely Operated Vehicles (ROVs) from the bottom-up operational perspective“.
Here is the abstract of one of my latest studies. It appears in a special issue of the Journal of Cognitive Engineering and Decision Making (JCEDM), “Improving Human-Robot Interaction in Complex Operational Environments: Translating Theory into Practice”.
* Oron-Gilad, T., Redden, E.S., and Minkov, Y. (2011). Robotic displays for dismounted warfighter situation awareness of remote locations: A field study. Journal of Cognitive Engineering and Decision Making, 5(1), 29–54. Accepted November 2010.
This study investigated the scalability of unmanned vehicle displays for dismounted warfighters. Task performance, workload, and preferences for three display devices were examined in two operational settings: tele-operation of an unmanned ground vehicle and intelligence gathering from a remote unmanned vehicle. Previous research has demonstrated variability in operational needs with regard to active tele-operation versus passive intelligence gathering. Thus, it was important to identify whether there was actually a dichotomy between the two in terms of screen space requirements, and whether this difference stems from task differences or other factors. Thirty-one soldiers participated in a field study at Ft. Benning, GA. They were required to perform tele-operation and intelligence gathering tasks. Results reconfirmed our hypothesis that display type influences performance in intelligence-related tasks that require the use of video feed and a digital map. No significant differences among display types were found in the UGV tele-operation task. In conclusion, dismounted warfighters can adequately perform both active and passive duties with a handheld device where the video window is as small as 4.3 inches in diameter. However, monocular HMDs for robotic displays can be problematic and should be carefully assessed before use in dismounted warfighter missions.