Track C: Intelligent Interaction
The aim is to develop innovative and effective robot behaviours, to understand and interpret human user behaviours, and to study and design human-robot interaction in actual contexts of use. Throughout the project you will be guided by the Social Robotics group of HMI.
A social touch library
A social touch can be characterised by many different parameters. Touches come in many different forms (e.g., a hug, a fist-bump, a slap, or a stroke), they can be rough or subtle, and the body part that makes the touch (oftentimes the hand) can feel warm or cold, for instance. When applied carefully, technology makes it possible to simulate a human stroke and to manipulate, for instance, its speed, intensity, and/or temperature. Earlier investigations suggest that people perceive different variations of such strokes differently in terms of, for instance, pleasantness and emotional associations. We are planning to build a library of different 'simulated social touches' that carry, for instance, specific emotional meanings, pleasantness ratings, etc. The idea is that when we require a specific type of touch for research purposes (e.g., 'a stroke that is capable of calming people'), we can find a suitable one in the 'touch library'. However, to be able to do so, a thorough evaluation of several types of simulated touches is required. Along with that, an investigation of suitable measures (i.e., evaluation criteria) is necessary to arrive at a coherent and comprehensive evaluation. There are different starting points within this 'touch library' project, and we can discuss whether the emphasis will be on the evaluation of the touch parameters or on the evaluation of the measures.
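To make the idea of the touch library concrete, the sketch below stores touches as records of their physical parameters plus evaluation results, and retrieves them by requested emotional meaning. All parameter names, values, and ratings here are illustrative placeholders, not measured data.

```python
from dataclasses import dataclass

@dataclass
class SimulatedTouch:
    name: str
    speed_cm_s: float      # stroke speed
    intensity: float       # normalised pressure, 0..1
    temperature_c: float   # surface temperature of the actuator
    pleasantness: float    # mean rating from a (hypothetical) evaluation, 1..7
    emotion: str           # dominant emotional association

# a tiny illustrative library; real entries would come from the evaluation study
LIBRARY = [
    SimulatedTouch("slow_warm_stroke", 3.0, 0.3, 32.0, 5.8, "calming"),
    SimulatedTouch("fast_stroke", 25.0, 0.5, 30.0, 3.9, "alerting"),
    SimulatedTouch("firm_pat", 10.0, 0.8, 31.0, 4.2, "encouraging"),
]

def find_touch(emotion: str, min_pleasantness: float = 0.0):
    """Return library touches matching a requested emotional meaning."""
    return [t for t in LIBRARY
            if t.emotion == emotion and t.pleasantness >= min_pleasantness]

# e.g. request 'a stroke that is capable of calming people'
calming = find_touch("calming", min_pleasantness=5.0)
print([t.name for t in calming])  # → ['slow_warm_stroke']
```

A query-by-criteria interface like this is one way the evaluation results (pleasantness ratings, emotional meanings) could feed back into touch selection for later experiments.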
The attribution of a social touch
As described, a social touch between humans can have several social effects, such as an increased willingness to comply with a request. Initial investigations in the field of mediated touch suggest that a touch over a distance can result in similar effects. However, it is rather unclear whether these results are caused by the actual physical stimulation or by the attribution of the touch to another person (i.e., having the idea that someone else touched you over a distance, rather than just 'feeling something'). The aim of this specific project is to gain insight into whether people perceive a simulated touch differently when they attribute it to another person rather than to a computer. Whether this evaluation of a touch focuses on the emotional perception of the touch or on, for instance, pro-social behaviour, is up to the student.
Automatic recognition of social touch gestures
In order for machines such as robots to behave in a socially intelligent way in their interaction with humans, the system needs to understand the social meaning of touch. The aim of this project is to use machine learning techniques to classify data from a corpus of social touch. The dataset consists of pressure sensor data from 31 subjects who performed 14 different social touch gestures such as grab, hit, stroke, and tickle. In this project you can explore different features and classification techniques to differentiate between touch gesture classes. Some experience with machine learning techniques would be beneficial for this specific project.
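As a minimal sketch of the feature-extraction-plus-classification pipeline described above, the code below generates synthetic pressure traces standing in for the real 31-subject corpus (a 'grab' as sustained high pressure, a 'tickle' as brief low-pressure bursts), extracts two simple features per trace, and classifies with a nearest-centroid rule. The gesture models, features, and data are all illustrative assumptions, not the actual corpus.

```python
import random
from statistics import mean, pstdev

random.seed(0)

def synth_trace(gesture, n=50):
    """Synthetic stand-in for one pressure-sensor recording (illustrative only)."""
    if gesture == "grab":
        # sustained high pressure with small noise
        return [0.8 + random.gauss(0, 0.05) for _ in range(n)]
    else:  # "tickle": light, irregular contact
        return [max(0.0, random.gauss(0.1, 0.1)) for _ in range(n)]

def features(trace):
    # two simple features: average pressure and its variability
    return (mean(trace), pstdev(trace))

# small labelled training set, then one centroid per gesture class
train = [(features(synth_trace(g)), g)
         for g in ("grab", "tickle") for _ in range(20)]

def centroid(label):
    fs = [f for f, g in train if g == label]
    return tuple(mean(dim) for dim in zip(*fs))

centroids = {g: centroid(g) for g in ("grab", "tickle")}

def classify(trace):
    """Assign the class whose feature centroid is nearest (squared distance)."""
    f = features(trace)
    return min(centroids,
               key=lambda g: sum((a - b) ** 2 for a, b in zip(f, centroids[g])))

print(classify(synth_trace("grab")))    # → grab
print(classify(synth_trace("tickle")))  # → tickle
```

In the actual project, the same structure applies with richer features (e.g., temporal and spatial pressure statistics) and stronger classifiers in place of the nearest-centroid rule, and with the full 14-gesture corpus instead of two synthetic classes.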