3. LIREC is a 5-year EU-funded
project that explored the
implications of living with
interactive and robotic
companions.
www.lirec.eu
4. Two things:
•What we are doing with
robots now.
•What we are doing to
ourselves.
7. A robot is a mechanical or
virtual intelligent agent
that can perform tasks
automatically or with
guidance.
8. Humans have a highly
developed brain and are
capable of abstract
reasoning, language,
introspection, problem
solving, self-awareness,
rationality, and sapience.
9. Robots & humans
share a form of intelligence
because it’s the easiest
thing to replicate.
15. “If the sum of positive emotions
is bigger than the sum of the
negative emotions over the last
n time periods, then the agent is
in a good mood; otherwise it is
in a bad mood.”
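The mood rule quoted above can be sketched in a few lines of Python. This is a minimal illustration of the stated rule only; the function name, the signed-intensity representation of emotions, and the window parameter n are my assumptions, not LIREC's actual implementation.

```python
# Sketch of the quoted mood rule: sum the emotion values over the last
# n time periods and compare positives against negatives. Here each
# emotion is a signed intensity (positive emotions > 0, negative < 0);
# this representation is an assumption for illustration.

def mood(emotions, n):
    """Return 'good' if the last n emotion values sum to a positive
    number, 'bad' if negative, and 'neutral' at exactly zero."""
    total = sum(emotions[-n:])
    if total > 0:
        return "good"
    if total < 0:
        return "bad"
    return "neutral"

# Example: the positive spikes outweigh the single negative one.
print(mood([0.4, -0.2, 0.5, 0.3], n=4))  # "good"
```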
31. Humans have a highly
developed brain and are
capable of abstract
reasoning, language,
introspection, problem
solving, self-awareness,
rationality, and sapience.
32. Sapience: wisdom, or the
ability of an organism or
entity to act with
appropriate judgment.
34. We have to decide who we
are designing for: us or them.
Not that it matters.
Hi. I’m Alex, and I’m a designer. Right now I wear three hats: one running an internet-of-things design consultancy called Designswarm, a second as founder of a startup called the Good Night Lamp, and a third as part of a design partnership, RIG.
One of the areas I was involved in for the past two years is robotics, through a project called LIREC. This 5-year project, finishing this
(http://vimeo.com/10799224) The work of Paul Ekman in creating the industry-standard Facial Action Coding System (http://en.wikipedia.org/wiki/Facial_Action_Coding_System) becomes useful for designing the instantiation of our emotions into physical facial reactions. It is a common standard for systematically categorizing the physical expression of emotions, and it has proven useful to psychologists and to animators. Ekman showed that facial expressions of emotion are not culturally determined but universal across human cultures, and thus biological in origin. Expressions he found to be universal included those indicating anger, disgust, fear, joy, sadness, and surprise.
This is how phonemes (the smallest segmental units of sound employed to form meaningful contrasts between utterances) are divided up by animators. It’s hard to think about speech without thinking of the whole face, though, and the emotions that are communicated at the same time. Only text-to-speech bots really manage to sever the two, which is also why they feel so robotic.
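The way animators divide phonemes up can be sketched as a simple mapping from phonemes to a much smaller set of mouth shapes (often called visemes). The grouping below is a rough illustrative convention, not the specific chart from the talk; studios use their own variants, and the names here are my assumptions.

```python
# Illustrative phoneme-to-viseme mapping: many phonemes collapse onto a
# handful of mouth shapes, which is why animators key visemes rather
# than individual phonemes. The groups and labels are assumptions.

PHONEME_TO_VISEME = {
    # bilabials share one closed-lips shape
    "p": "MBP", "b": "MBP", "m": "MBP",
    # labiodentals share a lip-on-teeth shape
    "f": "FV", "v": "FV",
    # rounded vowels
    "o": "O", "u": "O",
    # open/spread vowels
    "a": "AI", "i": "AI",
}

def visemes(phonemes):
    """Map a phoneme sequence to the mouth shapes an animator would key."""
    return [PHONEME_TO_VISEME.get(p, "REST") for p in phonemes]

print(visemes(["m", "a", "p"]))  # ['MBP', 'AI', 'MBP']
```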
Little Mozart, a product of the industrial partnership at LIREC, is what is called an Affective Tutoring System. Sometimes emotions play a positive role in the reasoning process, other times they don't, but one thing is sure: emotions influence the reasoning process and also influence learning capabilities. Emotions like anxiety, depression or hunger may reduce or even block learning, so, as teachers already know, emotional upsets interfere with learning. Some teachers adapt their behaviour, within their possibilities, to improve their students' learning; Little Mozart tries to do the same. The robotic scenario had a more immersive user experience, improved feedback and a more believable social interaction. Little Mozart helps children compose music and improve their knowledge of melodic composition and the basics of musical language. It's intended for children aged between 4 and 10 years old.
However, for many laymen, if a machine appears to be able to control its arms or limbs, and especially if it appears anthropomorphic or zoomorphic (e.g. ASIMO or Aibo), it would be called a robot. http://en.wikipedia.org/wiki/List_of_fictional_robots_and_androids#1970s If you’re of a particular disposition, you will see robots everywhere in nature and in the objects around you. These are pictures taken without permission from the photo-sharing site Flickr’s “Hello Little Fella” group, where people have been aggregating photos of everyday objects that look like faces for years now.