This document discusses a class on human perspective in artificial intelligence. It covers several topics:
1. Emotions in decision making research and how emotions can influence decisions in both helpful and biased ways.
2. How emotions could be incorporated into artificial intelligence systems, including recognizing emotions in others, expressing emotions, and studying neuroscience.
3. A framework is proposed for AI emotion research focused on recognition, expression, and neuroscience-inspired computing without direct access to internal experiences.
4. The class discusses modeling perspective and the self through a society of mind approach with distributed, semi-autonomous agents rather than a single centralized self.
HPAI Class 25 – Emotions in AI and Self – 05/18/20
1. CIIC 5995-100 / ICOM 5995-100
Human Perspective in Artificial Intelligence
(HPAI)
Professor José Meléndez, PhD
“I think, therefore I am.” - René Descartes (1596-1650)
“I doubt, therefore I think, therefore I am.”
– Antoine Léonard Thomas (1732-1785)
“I feel, therefore I think, therefore I am.”
2. Today
• Emotions in Decision Research (continued)
• Emotions for Artificial Intelligence Systems
• Self and Mind
Scan QR Code to Verify your Class Attendance
https://forms.gle/newZj7do8D6KVPwz8
3. Required Reading – Keep up the Pace
• Influence Tactics by Dr. George Simon Jr. (on Moodle)
• Excerpt of Chapter 6 of Character Disturbance: The
Phenomenon of Our Age
• The kinds of things we want AI to help us with.
• How Emotions are Made: The Secret Life of the Brain
• Chapter 6: How the Brain Makes Emotions
• Chapter 7: Emotions as Social Reality
• Chapter 8: A New View of Human Nature
• Chapter 9: Mastering Your Emotions
• Chapter 13: From Brain to Mind: The New Frontier
• The brain integrates “so much information from multiple sources
so efficiently that it can support consciousness.”
4. Emotions – Decision Research Themes
https://scholar.harvard.edu/files/jenniferlerner/files/emotion-and-decision-making.pdf?m=1450899163
• Integral Emotions Influence Decision Making
• A Beneficial Guide
• Bias
• Incidental Emotions Influence Decision Making
• Unrelated Bias
• Moderating Factors
• Valence as Only One of Many Dimensions
• Differences of Emotions of Same Valence
• Appraisal Tendencies (Implicit Goals)
• Emotions Shape Decisions via Content of Thought
5. Emotions – Decision Research Themes
https://scholar.harvard.edu/files/jenniferlerner/files/emotion-and-decision-making.pdf?m=1450899163
• Emotions Shape Decisions via Depth of Thought
• Systematic vs. Heuristic (Automatic) Processing
• Role of Certainty
• Emotions Shape Decisions via Goal Activation
• Action Tendencies
• Motivations
• Emotions Influence Interpersonal Decisions
• Navigation of Social Decisions
• Emotional Communication and Expectation
• How to Reduce Unwanted Effects of Emotion
• Time Delay
• Suppression
• Reappraisal
• “Dual-Emotion Solution”
6. Next Up
• Emotions for Artificial Intelligence Systems
• Quantifying Self
7. Emotions - AI Research Themes
• “Recognition” – Emotions of Others
• Facial expressions (not unique)
• Physiological (not fingerprints)
• Behavioral (posture, eye movements)
• Verbal (sentiment analysis, vocal tones)
• “Expression” – Synthesis of Emotion or Emulation
• Synthesized vocal tones
• Flubber micro facial expressions
• Behavioral (posture and gestures)
• Neuroscience – Study of Brain Itself
• Methodological Problems
• Dead brains (Neurostatics)
• Constrained study environments (e.g. MRI)
• Limited external signals (e.g. EEG)
• No specific neural sources
• No innate, neural signatures
• No direct access to thoughts inclusive of emotional experience
• Dynamic and not inherently replicable – we grow and change
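The recognition cues above (facial, physiological, behavioral, verbal) are flagged as "not unique" and "not fingerprints," which suggests treating them as evidence to be weighed in context rather than labels to be looked up. A minimal sketch of that idea, assuming invented modality names and weights (nothing here comes from an actual emotion-recognition system):

```python
# Hypothetical sketch: multimodal emotion "recognition" as contextual
# evidence combination, not fingerprint matching. Modality names,
# scores, and weights are illustrative assumptions.

def combine_cues(cues, context_weights):
    """Combine per-modality valence scores in [-1, 1] using
    context-dependent weights; no single cue is decisive."""
    total_w = sum(context_weights.get(m, 0.0) for m in cues)
    if total_w == 0:
        return 0.0  # no usable evidence in this context
    return sum(score * context_weights.get(m, 0.0)
               for m, score in cues.items()) / total_w

cues = {"facial": 0.6, "vocal": -0.2, "posture": 0.4}
# In a noisy room, the vocal channel is weighted down.
weights = {"facial": 0.5, "vocal": 0.1, "posture": 0.4}
estimate = combine_cues(cues, weights)
```

The point of the sketch is the context dictionary: the same cues yield a different estimate under different weightings, mirroring the "in context" qualifiers on this slide.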
8. Emotion “Classification” – Towards AI
• One Dimensional Continuum (ODC)
• Emotion is thought - its “classification” is meaningless
• Continuous characterization of emotion in thought (experience)
• Intensity (absent good or bad)
• Time dependent and continuous
• Limited by working-memory capacity for thought
• ODC model avoids classification altogether given that
emotions comprise a form of thinking and are thus time and
contextually dependent:
• Experience is different depending on initial state (e.g. your mood)
• Experience is different depending on context (e.g. who is around)
• Experience evolves as situation evolves
• Function is time varying – dependent upon all of thought
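One way to picture the ODC view above: a single continuous intensity that varies over time, with no category labels, whose trajectory depends on the initial state (mood). The decay constant and stimulus values below are assumptions chosen purely for illustration:

```python
# Illustrative sketch of a one-dimensional, time-varying emotion
# intensity: it drifts toward a mood baseline and responds to
# stimuli. Parameters (decay=0.8, stimuli) are assumptions.

def step_intensity(intensity, baseline, stimulus, decay=0.8):
    """One time step: drift toward baseline, then add stimulus."""
    return baseline + decay * (intensity - baseline) + stimulus

# Same stimulus stream, different initial state (mood)
# -> different experience, as the slide claims.
trace_calm = [0.0]
trace_tense = [0.5]
for stim in [0.3, 0.0, 0.0, 0.0]:
    trace_calm.append(step_intensity(trace_calm[-1], 0.0, stim))
    trace_tense.append(step_intensity(trace_tense[-1], 0.0, stim))
```

Both traces receive the identical stimulus, yet they never coincide within the window: the experience is a function of the whole time-varying state, not of the stimulus alone.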
10. Science of Emotion – Towards AI
• Emotions characterized by attributes:
• Something that “happens to” you as you think
• Intensities; “flavors”: Positive, Negative, Neutral
• Occurrences: eliciting or intentional object (aboutness)
• Evaluations: enable pursuit of goals (serve a function)
• Influence decisions; can also inhibit pursuit of goals
• Multi-component response experience:
• Subjective (what it feels like)
• Bodily aspects (physiological, including the brain)
• Verbal/Nonverbal communication: outward display of behavior
• Thought/Cognition – probabilistic predictive computing with
uncertainty
11. Emotions in AI Systems
• Expectations: Predict, Simulate, Compare, Resolve Errors
• Attention:
• Must be used to signal processors regarding occurrences that may
indicate a situation of importance for self-maintenance (Body Budget)
or self-preservation.
• Must be used to signal processors regarding situations (or potential
situations) of interest or importance to itself (Affective Niche).
• Motivation:
• Must form part of the computations your mind makes as it constantly
predicts the world around it and what it should do within it (e.g.
motivation).
• Individual:
• Hundreds (or thousands) of semi-autonomous ”capable” systems with
inter-signaling, capable of learning and some adaptation.
• Non-Linear (Chaotic and Non-Deterministic):
• The processes and concepts used in the resulting computations
may not be completely accessible or explicitly identifiable
(affective realism).
• The processes and concepts used in the resulting computations may
not all be changeable.
• Mortal:
• Critical systems loss = death
• Self-Preserving (caring to not die)
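The "predict, simulate, compare, resolve errors" loop and the body-budget alarm above can be caricatured in a few lines. This is a hypothetical sketch, not the course's actual model; the update rate (0.5) and alarm threshold (0.4) are invented parameters:

```python
# Toy predictive loop with a body-budget alarm: a large prediction
# error or a budget deviation raises an undifferentiated "something
# isn't right" signal that demands attention. All constants are
# illustrative assumptions.

def tick(prediction, observation, budget, norm=1.0, alarm=0.4):
    error = observation - prediction
    new_prediction = prediction + 0.5 * error   # resolve the error
    deviation = abs(budget - norm)              # body-budget check
    affect = abs(error) + deviation             # "not right", cause unknown
    return new_prediction, affect > alarm

pred, attend = 0.0, False
for obs in [0.1, 0.1, 0.9]:                     # a surprise at the end
    pred, attend = tick(pred, obs, budget=1.0)
```

While observations match predictions, nothing fires; the surprising final observation trips the alarm even though the loop has no idea *what* went wrong, echoing the "Affect" bullet on the next slides.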
12. AI Emotion Research Framework
• “Recognition”/Interpretation – Emotional Communications of
Others
• Facial expressions (in context; not unique)
• Physiological (in context; not fingerprints)
• Behavioral (in context; posture, eye movements)
• Verbal (in context; sentiment analysis, vocal tones)
• “Expression” – Synthesis of Emotion or Emulation; External
Communication with Others
• Flubber micro facial expressions (learned, variable, and in context)
• Behavioral (learned, variable, and in context; posture and gestures)
• Synthesized vocal tones (learned, variable, and in context)
• Neuroscience – Study of Brain Itself and Artificial Implementation
• Neuro-inspired computing paradigms
• All-inclusive thought (conscious/subconscious) operating systems
• No direct access to computational meaning (probabilistic) inclusive of
“emotional experience”
• Dynamic and not inherently replicable – grows and changes
• Semi-Autonomous (“independent”)
14. Model Models – Towards Perspective
What about time?
What about emotional thinking?
15. Towards Perspective (Adapted Student Model)
[Diagram: an INDIVIDUAL WITH MOTIVATIONS, with EXPECTATIONS and ATTENTION, processes Occurrences (outside) and Interoception (INSIDE), each arriving with Emotions]
17. How an AI System Knows it Exists
• Concepts (formed from experiences)
• It makes regular predictions about reality (what is
happening) that are mostly correct.
• Semi-Autonomous agents validate each other
• It can move to touch itself and sense it. (proprioception)
• It can talk, hear itself, and understand what it heard.
• It chooses its experiences and learns from them
• “Physiology” - ”feels” when it deviates from its “norm”
• Affect: It knows things aren’t “right” even when it doesn’t
know what or why
• Body Budget (e.g. overall energy, imbalance)
19. Where are we?
• We understand that we continually create our own virtual realities
• An experience that can be similar to or completely different from a ”real world”
• We simulate based upon quantitative and qualitative inputs
• We understand that we are comprised of a collection of processes
• A brain is a thing
• A mind is a collection of processes (Society of Mind)
• Thoughts are processes not states
• Emotions are experiences and not states
• Emotions are just another way of thinking
• Statics are an obstacle in understanding mind
20. The Agents of the Mind
• Brains are built from mindless stuff
• Brains consist of components that do not think or feel on their own
• Minds are comprised of agents “doing” things – forget statics
Society of Mind, pp. 19-20
21. The Society of Mind
• Minds are comprised of a society of agents (mini-modules)
• More so, You are comprised of a society of embodied agents.
• The inner workings of your agents are mostly hidden from awareness.
Society of Mind, p. 21, 56
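Minsky's point that minds are societies of mindless agents can be illustrated with a toy: each agent is a one-line rule that neither thinks nor feels, there is no central controller, and what "the mind" does is simply whichever signals the agents emit. The agent names and rules below are entirely hypothetical:

```python
# Society-of-mind toy: mindless agents, no ruling Self. Each agent
# is a trivial rule over shared state; behavior emerges from the
# collection. Agent names/rules are illustrative assumptions.

agents = {
    "hunger":  lambda s: ["find-food"] if s["energy"] < 0.3 else [],
    "fatigue": lambda s: ["rest"] if s["awake_hours"] > 16 else [],
    "curious": lambda s: ["explore"] if not s["busy"] else [],
}

def society_step(state):
    """No central decider: just collect every agent's signals."""
    signals = []
    for rule in agents.values():
        signals.extend(rule(state))
    return sorted(signals)

state = {"energy": 0.2, "awake_hours": 10, "busy": False}
signals = society_step(state)
```

Two independent agents fire here without any of them, or any supervisor, "deciding" what the whole system wants, which is the sense in which the components "do not think or feel on their own."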
22. Self
Society of Mind, p. 39
• Your Self is all of you.
• Your Self “is” what all of your You does.
• Physically (tangibly)
• Mentally (intangibly)
23. One Self or Many
Society of Mind, p. 40
• There is no singular, central, ruling Self inside the mind.
• We are comprised of distributed functionality.
• We are not inside ourselves; we are Our Selves.
24. Who is in Control of Self?
Society of Mind, pp. 42-44
• Who controls our minds?
• Our minds don’t control our minds.
• “Self-Control” – Scheming?
• Is the Self’s function to keep us from changing too rapidly?
25. Thought Experiments
Society of Mind, p. 58
• Thinking about thinking affects our thoughts.
• Consciousness is connected with our most immediate memories.
• Thinking also makes use of our most immediate memories
• How can we use our unique access to our Self to understand Selves.
• Special Scenarios:
• The time I lost my intermediate term memory
• Try to follow the track of a dream when I wake up during it
• ??
26. Thinking Without Thinking
Society of Mind, p. 63
• I am aware – hence I’m aware.
• We don’t know most of what is happening in our own minds.
• We interact and learn about our own minds similarly to how we interact
and learn about other people.
27. Momentary Mental Time
Society of Mind, p. 61
• It takes time for changes in one part of a mind to affect the other parts.
• Each different agent of the mind lives in a slightly different world of time.
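The claim that each agent "lives in a slightly different world of time" can be sketched as per-link propagation delay: a change sent at step t arrives only some steps later, so agents connected by slower links are always seeing older states. A hypothetical sketch, not anything from Minsky's text:

```python
# Per-link delay between mind-agents: what arrives "now" on a slow
# link is an older message, so different agents disagree about the
# present. Delay values are illustrative assumptions.
from collections import deque

class Link:
    def __init__(self, delay):
        self.queue = deque([None] * delay)  # in-flight messages

    def send(self, msg):
        """Send msg; return whatever arrives at this instant."""
        self.queue.append(msg)
        return self.queue.popleft()

fast = Link(delay=1)
slow = Link(delay=3)
# Broadcast the step number over both links each tick.
arrivals = [(fast.send(t), slow.send(t)) for t in range(4)]
```

By step 3 the fast link delivers step 2 while the slow link is only now delivering step 0: each receiving agent inhabits its own "world of time."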
28. Contemplation
• Your Self is all of you
• the functions underlying all of Layers 1-7
• You are comprised of distributed functionality within and about you.
• You “are” what all of your you does.
• Common sense implies a readily predictable result
• Is thinking implementation specific?
• How necessary is the real world?
• Transistors and Neurons don’t think…
• Thinking and Internal Communication