Emotion AI: a presentation by Seth Grimes at AI for Human Language, March 5, 2020, in Tel Aviv.
Emotion AI refers to a set of technologies -- natural language processing, voice tech, facial coding, neuroscience, and behavioral analytics -- applied to interactions to extract, convey, and induce emotion.
4. “If we want computers to be genuinely intelligent, to adapt to us and to interact naturally with us, then they will need the ability to recognize and express emotions, and to have what has come to be called ‘emotional intelligence.’”
-- Rosalind Picard, Affective Computing, 1997
Affective Computing
10. http://sentic.net
Plutchik via SenticNet
“As a framework, SenticNet consists of a set of tools and techniques for sentiment analysis combining commonsense reasoning, psychology, linguistics, and machine learning.”
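The lexicon-lookup step at the core of such a framework can be sketched in a few lines. This is a hedged illustration only: the concept names and polarity scores below are invented for the example, and this is not the actual SenticNet API or its real commonsense-reasoning pipeline.

```python
# Hypothetical mini-lexicon of multiword concepts with polarity in [-1, 1].
# These entries and scores are invented; real SenticNet entries differ.
POLARITY = {
    "celebrate_birthday": 0.85,
    "lose_job": -0.72,
    "small_room": -0.31,
}

def concept_polarity(text: str) -> float:
    """Average the polarity of known concepts found in the text."""
    tokens = text.lower().split()
    scores = []
    i = 0
    while i < len(tokens):
        # Greedily try a two-word concept first, then a single token.
        bigram = "_".join(tokens[i:i + 2])
        if bigram in POLARITY:
            scores.append(POLARITY[bigram])
            i += 2
        else:
            if tokens[i] in POLARITY:
                scores.append(POLARITY[tokens[i]])
            i += 1
    return sum(scores) / len(scores) if scores else 0.0

print(round(concept_polarity("they celebrate birthday in a small room"), 2))
```

Matching on multiword concepts rather than single words is what distinguishes concept-level resources like SenticNet from plain word-polarity lists.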
11. What about Sentiment Analysis? Text sourced…
https://dl.acm.org/doi/10.1145/945645.945658, October 2003
https://dl.acm.org/doi/10.1561/1500000011, January 2008
13. Sentiment -> Emotion points
• Both sentiment and emotion are subjective.
• Emotion models may be fine-grained and hierarchical.
• Emotion categories aren’t orthogonal; emotion is compositional.
• Sentiment is situational and not simply a one-dimensional projection of emotion.
Consider…
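The last two points can be made concrete with a toy calculation. A sketch, assuming an invented valence weighting (not a standard mapping): collapsing a compositional emotion profile to one sentiment number loses information, since very different profiles can land on the same score.

```python
# Hypothetical valence weights per emotion category; illustrative only.
VALENCE = {"joy": 1.0, "trust": 0.5, "fear": -0.6,
           "surprise": 0.0, "sadness": -1.0, "anger": -0.8}

def to_sentiment(emotions: dict) -> float:
    """Collapse an emotion profile to a single sentiment score."""
    return sum(VALENCE.get(e, 0.0) * strength
               for e, strength in emotions.items())

# Two very different emotional states...
bittersweet = {"joy": 0.5, "sadness": 0.5}  # mixed, compositional
surprised = {"surprise": 1.0}               # pure surprise
# ...project to the same one-dimensional sentiment value:
print(to_sentiment(bittersweet), to_sentiment(surprised))  # 0.0 0.0
```

The bittersweet profile and the pure-surprise profile both score 0.0, even though they describe unmistakably different feelings: sentiment is not simply a projection of emotion.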
15. Disclaimer
I use A LOT of commercial product images in the slides
that follow. These are illustrations and not meant as
recommendations.
17. Emotion AI
“Emotion mining is the science of detecting, analyzing, and evaluating humans’ feelings towards different events, issues, services, or any other interest.”
Emotion synthesis enhances the ability of a machine to provide meaningful, contextual responses by conveying an appropriate emotional state through words, voice, and expression.
Emotion induction aims to evoke a certain emotional response or affective state.
Examples …
21. Emotion in Text: Parsing, Stats
“Parsing Text for Emotion Terms: Analysis & Visualization Using R”
https://datascienceplus.com/parsing-text-for-emotion-terms-analysis-visualization-using-r/
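The core of that tutorial is counting lexicon-tagged emotion terms in text. A minimal sketch of the same step in Python, assuming a tiny hand-built lexicon (the linked tutorial uses the NRC emotion lexicon via R's tidytext; the entries below are invented for illustration):

```python
import re
from collections import Counter

# Hypothetical word -> emotion-category lexicon; invented for this sketch.
EMOTION_LEXICON = {
    "happy": "joy", "delighted": "joy",
    "afraid": "fear", "worried": "fear",
    "furious": "anger",
}

def emotion_counts(text: str) -> Counter:
    """Tokenize, look each token up in the lexicon, tally the categories."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(EMOTION_LEXICON[t] for t in tokens if t in EMOTION_LEXICON)

print(emotion_counts("I was happy, then worried, then afraid."))
# Counter({'fear': 2, 'joy': 1})
```

The resulting counts are what the tutorial then normalizes and visualizes as per-emotion frequencies.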
26. Speech
https://www.phon.ucl.ac.uk/courses/spsci/iss/week9.php
“Voice cues are commonly divided into those related to: (a) fundamental frequency (F0, a correlate of the perceived pitch), (b) vocal perturbation (short-term variability in sound production), (c) voice quality (a correlate of the perceived ‘timbre’), (d) intensity (a correlate of the perceived loudness), and (e) temporal aspects of speech (e.g., speech rate), as well as various combinations of these aspects (e.g., prosodic features).”
http://www.scholarpedia.org/article/Speech_emotion_analysis
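Two of the cues listed above, F0 and intensity, can be roughed out with elementary signal measures. A toy sketch on a synthetic tone, assuming a zero-crossing pitch estimate and RMS loudness; real emotion-from-speech systems use far more robust pitch trackers and frame-level features:

```python
import math

SAMPLE_RATE = 8000  # samples per second for the synthetic signal

def rms_intensity(samples):
    """Root-mean-square amplitude, a rough correlate of perceived loudness."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def f0_zero_crossing(samples, sample_rate=SAMPLE_RATE):
    """Estimate F0 as the rate of negative-to-positive zero crossings."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings / (len(samples) / sample_rate)

# One second of a 200 Hz tone at half amplitude.
tone = [0.5 * math.sin(2 * math.pi * 200 * n / SAMPLE_RATE)
        for n in range(SAMPLE_RATE)]
print(round(f0_zero_crossing(tone)), round(rms_intensity(tone), 3))
```

On this clean tone the estimates land near 200 Hz and 0.354 (0.5/√2); on real voiced speech, zero-crossing counts are badly confounded by harmonics and noise, which is why production systems use autocorrelation or cepstral pitch trackers instead.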
32. “Conversational Intelligence & Behavioral Prediction Insights from Voice: Our Oliver engine offers ASR+ with a sophisticated layer of emotion recognition metrics & behavioral KPIs, not only from what is being said but also from how it is said.”
33. https://www.ipsoft.com/2019/08/29/emotional-intelligence-in-conversational-ai-why-amelia-is-a-leader/
“When Amelia is working with customers or coworkers, she may sense frustration, anger, or even sadness. In these moments, she’s able to calibrate her tone and phraseology in order to be considerate of how her counterpart is feeling. With this emotional intelligence, she can build closer connections and relationships between a company and their customers and employees.
“The language Amelia uses to interact with customers can vary from system to system and even from project to project. She can use slang, humor and even sarcasm, or she can speak in direct professional terms.”
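The "calibrate her tone" idea reduces, at its simplest, to mapping a detected emotion to a response register. A hedged sketch of that pattern, with invented labels and templates that are in no way IPsoft's actual logic:

```python
# Hypothetical emotion -> response-style mapping; illustrative only.
STYLE_BY_EMOTION = {
    "frustration": "apologetic",
    "anger": "calm-direct",
    "sadness": "empathetic",
}

PREFIX_BY_STYLE = {
    "apologetic": "Sorry for the trouble. ",
    "calm-direct": "Let's fix this right away. ",
    "empathetic": "I understand this is difficult. ",
    "professional": "",  # default register when no emotion is detected
}

def calibrate_reply(detected_emotion: str, answer: str) -> str:
    """Prepend a tone-setting phrase chosen by the detected emotion."""
    style = STYLE_BY_EMOTION.get(detected_emotion, "professional")
    return PREFIX_BY_STYLE[style] + answer

print(calibrate_reply("frustration", "Your refund was issued today."))
# Sorry for the trouble. Your refund was issued today.
```

Production systems condition far more than a prefix (word choice, pacing, escalation to a human), but the detect-then-adapt loop is the same shape.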
38. Counterpoint 2
Affect recognition, a subset of facial recognition that claims to “read” our inner emotions by interpreting the micro-expressions on our face, has been a particular focus of growing concern in 2019—not only because it can encode biases, but because it lacks any solid scientific foundation to ensure accurate or even valid results.