Comparing an android head with its digital twin regarding the dynamic expression of emotions
We are happy to share that our Master's student Amelie Kassner has had her recent work on the dynamic expression of emotions by androids and their digital twins accepted at the Affective Human-Robot Interaction workshop (AHRI), which is part of the Affective Computing + Intelligent Interaction Conference (ACII). The paper will be presented at the MIT Media Lab in Cambridge, USA, on 10th September 2023.
Emotions, an important component of social interaction, can be studied with the help of android robots, whose appearance is as human-like as possible. Because the production and customization of android robots is expensive and time-consuming, it may be practical to use a digital replica instead. To investigate whether this difference in appearance leads to perceptual differences in terms of emotions, a robot head was digitally replicated. In an experiment, the basic emotions evaluated in a preliminary study were compared across three conditions and then statistically analyzed. It was found that, apart from fear, all emotions were recognized on the real robot head. The digital head with “ideal” emotions outperformed the real head in all cases except the anger representation, which indicates optimization potential for the real head. Contrary to expectations, significant differences between the real head and its digital replica displaying the same emotions were found only for the representation of surprise.
Authors: Amelie Kassner, Christian Becker-Asano