One of the objectives of the SPRING project is to develop and implement methodologies enabling the robot to automatically:
1. Explore the environment;
2. Move towards one or several persons;
3. Attract the attention of the selected persons in order to facilitate face-to-face communication;
4. Manage multi-party conversations.
To oversee this, we developed the robot non-verbal behaviour system, which decides the optimal non-verbal actions to take based on the dialogue state and the overall plan. In a newly published deliverable, we present the final software design and implementation of the robot non-verbal behaviour system for the target environment, including the interface between the non-verbal behaviour manager, the task planner, and the conversational system.

ARI non-verbal behaviour system

The SPRING robot non-verbal behaviour system synthesises the learnt robot behaviour and chooses the appropriate non-verbal actions for the robot to take during interactions. It consists of two modules:

  • The non-verbal behaviour manager, which interfaces with the high-level planner and the conversational system to choose appropriate actions for the robot to take to manage the interactions.
  • The robot behaviour generation module, which interfaces with the non-verbal behaviour manager to synthesise the optimal robot behaviour and controls the execution of the robot's (non-verbal) actions during interactions.
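
The split between the two modules can be illustrated with a minimal sketch. All class names, method names, and action labels below are hypothetical placeholders, not taken from the SPRING codebase; the sketch only shows the shape of the interface, where the manager picks an action from the dialogue state and the plan, and the generation module executes it.

```python
from dataclasses import dataclass

# Hypothetical sketch of the two-module split described above;
# names and action labels are illustrative, not SPRING APIs.

@dataclass
class DialogueState:
    n_participants: int
    robot_is_speaking: bool

class NonVerbalBehaviourManager:
    """Chooses a non-verbal action from the dialogue state and the plan goal."""
    def choose_action(self, state: DialogueState, plan_goal: str) -> str:
        if plan_goal == "attract_attention" and not state.robot_is_speaking:
            return "wave_and_gaze"
        if state.n_participants > 1:
            return "gaze_shift_between_speakers"
        return "gaze_at_addressee"

class BehaviourGenerationModule:
    """Synthesises and executes the chosen non-verbal action."""
    def execute(self, action: str) -> str:
        # In the real system this would drive the robot's actuators;
        # here we only report which behaviour would be run.
        return f"executing: {action}"

manager = NonVerbalBehaviourManager()
generator = BehaviourGenerationModule()
state = DialogueState(n_participants=2, robot_is_speaking=False)
action = manager.choose_action(state, plan_goal="attract_attention")
print(generator.execute(action))  # → executing: wave_and_gaze
```

Keeping decision-making (the manager) separate from execution (the generation module) in this way lets each side evolve independently, which matches the interface-based design the deliverable describes.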

Deliverable D6.6 (access link) describes the overall architecture of the robot non-verbal behaviour system, the robot behaviour generation module, and the non-verbal behaviour manager. The software will be released in the git code repositories for WP5 (link) and WP6 (link).