Human-Robot Interaction: An Introduction. Chapter 8: Emotion
https://www.human-robot-interaction.org
Emotion
How are you feeling right now? Happy? Bored? A bit self-conscious? Whatever the case may be, it's unlikely that you are feeling absolutely nothing. Various feeling states, and related emotions, are a key aspect of our day-to-day experience and of our interactions with other people. Emotions can motivate and modulate behavior and are a necessary component of human cognition and behavior. They can be spread through vicarious experience, such as watching a tense movie, and through direct social interaction, such as seeing your best friend happy. Because emotions are such an integral part of human social cognition, they are also an important topic in human–robot interaction (HRI). Social robots are often designed to interpret human emotion, to express emotions, and, at times, even to have some form of synthetic emotion driving their behavior. Although emotions are not implemented in each and every social robot, taking emotions into account in the design of a robot can help improve the intuitiveness of the human–robot interaction.
This material has been published by Cambridge University Press as Human-Robot Interaction: An Introduction by Christoph Bartneck, Tony Belpaeme, Friederike Eyssel, Takayuki Kanda, Merel Keijsers, and Selma Šabanović. ISBN: 9781108735407 (http://www.cambridge.org/9781108735407). This pre-publication version is free to view and download for personal use only. Not for re-distribution, re-sale or use in derivative works. © copyright by the authors, 2019. https://www.human-robot-interaction.org
Next to visual cues, human speech is perhaps the second most important channel from which to extract emotion. In particular, prosody, the patterns of stress and intonation in spoken language, can be used to read the emotional state of the speaker. For instance, when people are happy, they tend to talk with a higher pitch; when sad, they tend to speak slowly and with a lower pitch. Researchers have developed pattern-recognition techniques (i.e., machine learning) to infer human emotions from speech (El Ayadi et al., 2011; Han et al., 2014).
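As a toy illustration of the prosody idea, the sketch below estimates the pitch of an audio frame by autocorrelation and applies a crude high/low-arousal rule. It is not taken from the cited work; the function names, the 180 Hz threshold, and the synthetic test tone are all illustrative assumptions.

```python
import math

def estimate_pitch(samples, sample_rate, fmin=80.0, fmax=400.0):
    """Estimate fundamental frequency (Hz) of one frame via autocorrelation."""
    n = len(samples)
    lag_min = int(sample_rate / fmax)          # shortest period considered
    lag_max = int(sample_rate / fmin)          # longest period considered
    best_lag, best_corr = 0, 0.0
    for lag in range(lag_min, min(lag_max, n - 1)):
        corr = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag if best_lag else 0.0

def label_arousal(pitch_hz, threshold_hz=180.0):
    # Hypothetical rule of thumb: higher pitch suggests higher arousal.
    return "high-arousal" if pitch_hz > threshold_hz else "low-arousal"

# A synthetic 220 Hz tone stands in for a voiced speech frame.
sr = 8000
tone = [math.sin(2 * math.pi * 220.0 * t / sr) for t in range(1024)]
print(label_arousal(estimate_pitch(tone, sr)))  # → high-arousal
```

Real systems use many more features than pitch (energy, speaking rate, spectral shape) and learn the mapping from labeled data rather than hand-set thresholds.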
Finally, a robot can sense human affect from other modalities. For instance, human skin conductance changes in response to an individual's affective state; a prominent example of its use as a measure is the polygraph, or lie detector. Skin-conductance sensors have also been tried in HRI, with only limited success (Bethel et al., 2007).
Figure 8.1: Emotions expressed through mechanical facial expressions. Left: eMuu (2001). Middle: iCat (2005–2012). Right: Flobi (2010). (Source: Christoph Bartneck and University of Bielefeld)
Figure 8.3: The OCC model of emotions. (Diagram: valenced reactions are classified by their focus, such as consequences of events for the self or for another agent, actions of the self or of another agent, and liking or disliking of objects.)
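As the figure suggests, the OCC model selects an emotion label by appraising how an event relates to the appraiser. The sketch below covers just one branch (consequences of events); the emotion labels follow OCC terminology, but the function and its boolean inputs are our own heavily simplified illustration, not an implementation from the book.

```python
def occ_event_emotion(desirable: bool, for_self: bool, like_other: bool = True) -> str:
    """Toy OCC-style appraisal of an event's consequences: who is
    affected, and whether the outcome is desirable, picks the label."""
    if for_self:
        # Consequences for the appraiser itself.
        return "joy" if desirable else "distress"
    if like_other:
        # Consequences for a liked other agent.
        return "happy-for" if desirable else "pity"
    # Consequences for a disliked other agent.
    return "resentment" if desirable else "gloating"

print(occ_event_emotion(desirable=True, for_self=False))  # → happy-for
```

A full OCC implementation would also appraise actions of agents (pride, admiration, shame, reproach) and aspects of objects (love, hate), and combine them with intensity variables.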
Figure: The circumplex model of affect. Emotion terms (afraid, tense, angry, frustrated, delighted, glad, happy, pleased, satisfied, content, serene, calm, relaxed, miserable, sad, depressed, gloomy, bored, tired, sleepy, etc.) are arranged along a horizontal valence axis and a vertical arousal axis.
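A common computational reading of the circumplex is that any affective state is a point in a two-dimensional valence–arousal plane, and the emotion terms are simply regions of that plane. The sketch below maps a point to its nearest labeled term; the coordinates are rough illustrative placements on a -1..1 scale, not values from a validated dataset.

```python
import math

# Approximate (valence, arousal) positions for a few circumplex terms.
# These placements are illustrative assumptions, not measured data.
CIRCUMPLEX = {
    "afraid":    (-0.6,  0.8),
    "angry":     (-0.7,  0.7),
    "delighted": ( 0.8,  0.6),
    "happy":     ( 0.8,  0.4),
    "calm":      ( 0.5, -0.6),
    "sad":       (-0.7, -0.4),
    "tired":     (-0.1, -0.8),
}

def nearest_emotion(valence: float, arousal: float) -> str:
    """Map a point in valence-arousal space to the closest labeled term."""
    return min(CIRCUMPLEX,
               key=lambda e: math.dist((valence, arousal), CIRCUMPLEX[e]))

print(nearest_emotion(0.85, 0.45))  # → happy
```

This nearest-neighbor view also shows the model's limits: emotions such as pride or guilt have no obvious home in a plane defined only by valence and arousal.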
People often cannot recognize emotions from facial information alone (see Figure 8.6). Given that people struggle to correctly read emotions from still facial images, robots will certainly have trouble with this as well. The addition of more information—such as the context of the interaction, animated rather than still expressions of emotion, and body language—allows us to increase the recognition rate, both by people and by algorithms.

Figure 8.6: Can you tell if the tennis player just scored or lost a point? A study showed that people struggled to correctly read strong emotions from the static faces alone but could do so when seeing only the body posture (Aviezer et al., 2012). (Source: Steven Pisano)

Another problem in emotion recognition by computers is that almost all algorithms are trained on emotions that have been acted out by actors. As such, these emotions are exaggerated and bear little resemblance to the emotions we experience and express in daily life. This also means that most emotion-recognition software can only correctly recognize emotions that are displayed with a certain exaggerated intensity. Because of this, its use in real-world applications is still limited (Pantic et al., 2007), and the recognition accuracy of subtle emotional expressions drops dramatically (Bartneck and Reichenbach, 2005). Another problem is that most emotion-recognition software returns probabilities for only the six basic emotions proposed by Ekman, or a point in a 2D or 3D emotion space. This is a rather limited view of emotion and misses many of the emotions we experience in real life, such as pride, embarrassment, guilt, or annoyance.
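The typical output format just described can be sketched as a probability distribution over Ekman's six basic emotions. The raw classifier scores below are made up purely for illustration; only the softmax normalization and the six labels reflect how such software commonly reports results.

```python
import math

EKMAN = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def softmax(scores):
    """Turn raw classifier scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for one face image (not real model output).
scores = [0.2, -1.0, 0.1, 2.5, -0.5, 0.3]
probs = dict(zip(EKMAN, softmax(scores)))
print(max(probs, key=probs.get))  # → happiness
```

Note what such an output cannot express: there is no slot for pride, embarrassment, guilt, or annoyance, however strongly the person displays them.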
Another aspect of emotion recognition that poses difficulty for robots is recognizing emotions across a wide variety of people. Although we may all express a number of universal emotions, we do not all do so with the same intensity, in the same type of context, or with the same meaning. Interpreting the emotional state of a person therefore requires a sensitivity to his or her individual affective quirks. Humans become adept at this through long years of interacting with each other but also through long-term experience with individuals. That is why you might be able to tell that your partner is laughing out of annoyance rather than happiness, whereas new acquaintances may not be able to do so. Robots still decode emotions largely based on momentary snapshots of a person's countenance, and they do not develop longer-term models of affect, emotion, and mood for their interaction partners.
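One simple way a robot could move beyond momentary snapshots is to maintain a slowly varying mood estimate alongside per-frame emotion readings. The exponential moving average below is an illustrative sketch of that idea, not a model from the literature cited in this chapter; the class name and parameter values are our own assumptions.

```python
class MoodModel:
    """Toy long-term mood estimate: an exponential moving average of
    per-frame valence readings in [-1, 1]."""

    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha   # how quickly mood tracks new observations
        self.mood = 0.0      # neutral starting valence

    def update(self, frame_valence: float) -> float:
        # Move the mood a fraction alpha toward the latest reading.
        self.mood += self.alpha * (frame_valence - self.mood)
        return self.mood

m = MoodModel(alpha=0.2)
for v in [0.8, 0.9, 0.7]:    # a run of positive momentary readings
    m.update(v)
print(round(m.mood, 3))      # → 0.386
```

Because the estimate lags the instantaneous readings, a single ambiguous frame (say, a laugh of annoyance) would perturb the mood only slightly, which is exactly the kind of smoothing a per-person model would need.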
Finally, a robot's emotional responsiveness can fool potential end users into thinking that the robot actually experiences genuine emotions. A robot merely expressing a certain emotion does not replace the actual, visceral experience of an emotional state; the robot merely displays emotional states in response to a computational model. Affective cognition, in which a full socioemotional repertoire is expressed and recognized for different users and contexts, remains elusive.
Questions for you to think about:
• Come up with a list of 10 emotions, and then try to display them
nonverbally to a friend. Can your friend guess which emotion you
are showing?
Future reading:
• Christoph Bartneck and Michael J. Lyons. Facial expression analysis, modeling and synthesis: Overcoming the limitations of artificial intelligence with the art of the soluble. In Jordi Vallverdú and David Casacuberta, editors, Handbook of research on synthetic emotions and sociable robotics: New applications in affective computing and artificial intelligence, pages 33–53. IGI Global, 2009. URL http://www.bartneck.de/publications/2009/facialExpressionAnalysisModelingSynthesisAI/bartneckLyonsEmotionBook2009.pdf
• Cynthia Breazeal. Social interactions in HRI: The robot view. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 34(2):181–186, 2004. doi: 10.1109/TSMCC.2004.826268. URL https://doi.org/10.1109/TSMCC.2004.826268
• Rafael A. Calvo, Sidney D'Mello, Jonathan Gratch, and Arvid Kappas, editors. The Oxford handbook of affective computing. Oxford Library of Psychology, Oxford, UK, 2015. ISBN 978-0199942237. URL http://www.worldcat.org/oclc/1008985555
• Rosalind W. Picard. Affective computing. MIT Press, Cambridge, MA, 1997. ISBN 978-0262661157. URL https://mitpress.mit.edu/books/affective-computing
• Robert Trappl, Paolo Petta, and Sabine Payr, editors. Emotions in humans and artifacts. MIT Press, Cambridge, MA, 2003. ISBN 978-0262201421. URL https://mitpress.mit.edu/books/emotions-humans-and-artifacts