SATO Wataru Laboratory
An android for emotional interaction: Spatiotemporal validation of its facial expressions
(Sato, Namba, Yang, Nishida, Ishi, & Minato: Front Psychol)
Android robots capable of emotional interactions with humans have considerable potential for application to research.
Although several studies have developed androids that can exhibit human-like emotional facial expressions, few have empirically validated the androids' facial expressions.
To investigate this issue, we developed an android head called Nikola based on human psychology and conducted three studies to test the validity of its facial expressions.
In Study 1, Nikola produced single facial actions, which were evaluated in accordance with the Facial Action Coding System.
The results showed that 17 action units were appropriately produced.
In Study 2, Nikola produced the prototypical facial expressions for six basic emotions (anger, disgust, fear, happiness, sadness, and surprise), and naïve participants labeled photographs of the expressions.
The recognition accuracy for all six emotions exceeded chance level.
In Study 3, Nikola produced dynamic facial expressions for the six basic emotions at four different speeds, and naïve participants rated the naturalness of the speed of each expression.
The effect of speed differed across emotions, consistent with previous findings for human facial expressions.
These data validate the spatial and temporal patterns of Nikola's emotional facial expressions, and suggest that it may be useful for future psychological studies and real-life applications.