The 1st Workshop on “Behavior, Emotion and Representation: Building Blocks of Interaction” will be held in Bielefeld, Germany, on October 17, 2017, as a satellite event of the HAI 2017 conference. This full-day workshop lies at the crossroads of the human perception, affective computing, psychology and cognitive representation research domains. Its first objective is to stimulate multidisciplinary exchange between these research communities on the specific topic of underlying tools and models for new building blocks of natural and intuitive interaction. The second objective is to discuss how users’ cognitive abilities and physiological parameters can be used to provide assistance in human-machine interaction.
Natural behavior skills grounded in cognitive abilities are a key challenge for robots, virtual agents and intelligent machines interacting with humans. This becomes particularly evident given the expected increase in the use of intelligent interaction partners designed to support humans in everyday situations within the coming decade (such as virtual coaches, companion robots, assistive systems and autonomous cars). These systems will need to act autonomously while eliciting social interaction and social synchrony. To achieve these goals, their perception of humans, as well as their own behavior, must build on richer inputs about emotion, mental states and models of the human partner than the mainly low-level approaches currently in use. Recent advances in multidisciplinary research on behavior, emotional states, visual behavior, neurofeedback, physiological parameters and mental memory representations help to explain the cognitive background of action and interaction in everyday settings, and thereby pave the way for the design of new building blocks for more natural and intuitive human-machine interaction.
Collecting and analyzing multi-modal data from different measurements also makes it possible to construct solid computational models. These building blocks of interaction will serve as a basis for artificial cognitive systems that can interact with humans intuitively and acquire new skills by learning from the user. This will result in new forms of human-computer interaction, such as individualized, adaptive assistance systems for scaffolding cognitive, emotional and attentive learning processes. In this context, it is clearly advantageous for intelligent robots and virtual agents to know how these cognitive representations and physiological parameters are formed, stabilized and adapted during different phases of daily actions. This knowledge enables a technical system to assess an individual’s current level of learning and performance, and thus to shape the interaction accordingly. These interactions must be (socially) appropriate, not excessive. Such systems can assist users in developing (interaction) skills in a variety of domains and during different phases of daily-life actions. At the same time, interactive systems must meet constraints of usability, acceptability and ethics.