by the guest editors Serena Ivaldi (Inria) and Maria Pateraki (ICS-FORTH)

This special theme addresses the state of the art of human-robot interaction (HRI), discussing the challenges currently faced by the research community in integrating both physical and social interaction skills into current and future collaborative robots.
Recent years have seen a proliferation of applications for robots interacting physically with humans in manufacturing and industry, from bimanual cooperation in assembly with cobots (i.e., industrial manipulators designed for collaboration) to physical assistance with exoskeletons. These applications have driven research in many fundamental topics for collaboration, such as shared task allocation, synchronisation and coordination, control of contacts and physical interaction, role estimation and adaptive role allocation during collaboration, learning by demonstration, and safe control. Developments in all these areas contribute to the success of “Industry 4.0”, in which cobots and exoskeletons are flagship platforms.

by Luca Buoncompagni, Alessio Capitanelli, Alessandro Carfì, Fulvio Mastrogiovanni (University of Genoa)

The introduction of collaborative robots in next-generation factories is expected to spark debates about ethical, social, and even legal matters. Research centres, universities and manufacturing companies, as well as technology providers, must help society understand how it can benefit from this transition, and possibly accelerate it. We frame the problem in the context of the trade-off between Artificial Intelligence and Intelligence Augmentation, and we pose four questions that research in human-robot cooperation must address.

by Serena Ivaldi (Inria)

Collaborative robots need to safely control their physical interaction with humans. However, in order to provide physical assistance to humans, robots need to be able to predict their intent, future behaviour and movements. We are currently tackling these questions in our research within the European H2020 Project AnDy [L1].

by Eneko Agirre (UPV/EHU), Sarah Marchand (Synapse Développement), Sophie Rosset (LIMSI), Anselmo Peñas (UNED) and Mark Cieliebak (ZHAW)

Dialogue systems are a crucial component when robots have to interact with humans in natural language. In order to improve these interactions over time, the system needs to be able to learn from its experience, its mistakes and the user’s feedback. This process – fittingly called lifelong learning – is the focus of LIHLITH, an EU project funded by CHIST-ERA.

by Alexander Schindler and Sven Schlarb (AIT Austrian Institute of Technology)

Conversational systems allow us to interact with computational and robotic systems. Such approaches are often deliberately limited to the context of a given task. We apply audio analysis either to broaden this context or to set it adaptively, based on identified surrounding acoustic scenes or events.

by Gergely Horváth, Csaba Kardos, Zsolt Kemény, András Kovács, Balázs E. Pataki and József Váncza (MTA SZTAKI)

Human–Robot Collaboration (HRC) in production, especially in assembly, offers flexibility and a means of maintaining competitiveness. However, numerous challenges still have to be addressed before HRC can be realised. Beyond the essential problems of safety, the efficient sharing of work and workspace between human and robot also requires new interfaces for communication. As part of the SYMBIO-TIC H2020 project, a dynamic, context-aware, bi-directional, multi-modal communication system has been designed and implemented to support human operators in collaborative assembly.

by Marcus Kaiser (IMK-Automotive)

The planning of assembly workplaces with direct human-robot collaboration (HRC) is a complex task owing to the variety of target criteria that must be considered. The lack of a digital simulation tool for the holistic planning and safeguarding of HRC scenarios, as well as a lack of adequate training and qualification concepts for companies, are currently inhibiting the implementation of HRC. We are developing a new way to digitally design collaborative assembly systems to help companies embrace HRC.

by Amedeo Cesta, Gabriella Cortellessa, Andrea Orlandini and Alessandro Umbrico (ISTI-CNR)

Effective human-robot interaction in real-world environments requires robotic agents to be endowed with advanced cognitive features and more flexible behaviours than classical robot programming approaches allow. Artificial intelligence can play a key role by enabling suitable reasoning abilities and adaptable solutions. This article presents a research initiative that pursues a hybrid control approach, integrating semantic technologies with automated planning and execution techniques. The main objective is to allow a generic assistive robotic agent (for elderly people) to dynamically infer knowledge about the status of a user and the environment, and to provide personalised supporting actions accordingly.

by Koen V. Hindriks (Delft University of Technology), Roel Boumans (Delft University of Technology and Radboud university medical center), Fokke van Meulen (Radboud university medical center), Mark Neerincx (Delft University of Technology), Marcel Olde Rikkert (Radboud university medical center)

We are designing a social robot to collect patient data in hospitals by interviewing patients. This task is crucial for improving and providing value-based care. Currently, professional caretakers administer self-reported outcome questionnaires, called patient-reported outcome measures (PROMs), to collect this data. Delegating this task to a robot significantly reduces the time spent on administration.

by Eleni Efthimiou and Stavroula-Evita Fotinea (Athena RC)

ComBox incorporates a multimodal user-centred intelligent human-robot interaction (HRI) framework that uses different technologies and user modalities to create à-la-carte HRI solutions. Appropriate HRI approaches are likely to encourage user trust and acceptance of assistive robotic devices.

by Elef Schellen, Jairo Pérez-Osorio and Agnieszka Wykowska (Istituto Italiano di Tecnologia)

The Social Cognition in Human-Robot Interaction (S4HRI) research line at the Istituto Italiano di Tecnologia (IIT) applies methods from experimental psychology and cognitive neuroscience to human-robot interaction studies. With this approach, we maintain excellent experimental control without losing ecological validity and generalisability, and can thus provide reliable results on which robot designs best evoke mechanisms of social cognition in the human interaction partner.
A major goal in the field of Human-Robot Interaction (HRI) is determining the factors required for social attunement between a human and a robot agent. When socially attuned with others, humans employ specialised cognitive mechanisms leading to effective communication and cooperation. Eliciting attunement in interaction with artificial agents (robots, in this case) will therefore allow these mechanisms to be brought to bear on HRI, improving cooperation between humans and robots. 

by Vanessa Evers (University of Twente)

Since 2011, the Human Media Interaction Group at the University of Twente has been working on robots with social intelligence. This has led to the development of robots that can recognise human behaviour, interpret this behaviour and respond in a socially appropriate way. We have developed robots that can serve as guides at zoos or airports, and that help children with autism understand emotional expressions in faces.
Work started with the European FP7 project FROG [L1], the Fun Robotic Outdoor Guide. The FROG robot was an instantiation of a robot service in outdoor public places. We envisioned robotic information or other services in outdoor public places such as city squares, car parks at shopping malls and airports and leisure areas such as parks and zoos. The FROG robot was developed specifically to offer augmented reality information in places such as zoos or cultural heritage sites such as the Royal Alcazar in Seville, Spain.

by Parmenion Mokios and Michail Maniadakis (ICS-FORTH)

Synergetic performance within human-robot teams might be significantly enhanced by consideration of the temporal aspects of multi-agent interaction. For a number of years, FORTH has been equipping robots with human-like artificial time perception, contributing a unique robotic cognitive skill that drastically improves fluency in human-robot interaction (HRI). We present an overview of the relevant technologies, which are constantly being improved and tested in naturalistic multi-agent scenarios.

by Patrizia Ribino and Carmelo Lodato (ICAR-CNR)

Human interactions are fundamentally based on normative principles. Particularly in social contexts, human behaviours are affected by social norms. Individuals expect certain behaviours from other people, who are perceived to have an obligation to act according to the expected behaviour. Giving robots the ability to interact with humans, on human terms, is an open challenge. People are more willing to accept robotic systems in daily life when the robots engage in socially desirable behaviours with benevolent interaction styles. Furthermore, allowing robots to reason in social situations, which involve a set of social norms generating expectations, may improve the dynamics of human-robot interactions and the self-evaluation processes of a robot's behaviour.

Next issue (July 2021) special theme: "Privacy-Preserving Computation"