PAMELA PAVLISCAK: We anthropomorphize anything with a face, with irregular movement, with voice, with gestures. We have this emotional bond with technology, even if it's not really technology. Yeah, right? So we can't help ourselves.
So Eric Pickersgill, you probably saw this photography exhibit from a while back. He took pictures of people and removed their cell phones. And that's why these guys look a little bit lost, because they lost part of their body. And we channel our emotions through technology. And we have all kinds of feelings about the internet and our experiences with technology. How about Instagram?
PAMELA PAVLISCAK: Fun.
AUDIENCE: Distracting.
PAMELA PAVLISCAK: Distracting. You know what I mean, like when you go deep back into somebody's profile. You shouldn't be there. And then you accidentally like something.
Design for emotion-- we have a way of doing it. But we're kind of bumping up against the limits of it. I'll call it emotion 2.0: we're getting this confluence of machine learning and affective computing, which is, simply put, going to be able to get some kind of read on our emotions.
The classical theory of emotion says that emotion starts in our bodies, and then we immediately, instinctively know what that emotion is. And we have a universal expression on our face. It turns out our brains are not so neat and tidy, and emotion and cognition are mixed up together.
And so part of emotional intelligence is going to be registering lots of different intensities, lots of different kinds of emotions. What I came out of this with is that emotions are really super complicated. We just have to be aware that we're making a lot of assumptions about what the right theory of emotion is. And so I'm going to talk about just a few ways that I think we can broaden our perspective on emotional design.
So this is an analysis of a photo after a terrorist attack. He's feeling mixed emotions, and the expression is ambiguous. And how we feel and interpret those emotions inside is ambiguous and conflicted. And we need to leave room for that and be OK with that.

I think we want to think about emotion not just as a place that we want to take people to, but as something in the background that we can know, and have more empathy, and build that into our experiences, because there are going to be situations where we do.
We talk a lot about humanizing technology, giving it human characteristics almost without fully understanding what those human characteristics are. I think we want to talk about giving humans agency first. And a big way we do that is through emotional intelligence. Finally, I think respecting human emotion and not trivializing it is what came out as very important to me out of all this work.
This is the Reading the Mind in the Eyes test, developed by Simon Baron-Cohen, Ali G's cousin. All the women's eyes were pretty heavily made up and marked as flirtatious. And then it was a bunch of much older men. And they're all white. And this is somebody who's very respected in the field. We need to be really careful, and we need to be really inclusive when we do this.
When we think about human intelligence, it's not very intelligent without this emotional component. And so I really want to think about emotion in a kind of more complex, nuanced, and graduated way, because I think this is where we're going: from something that's just simply a point in time to kind of nurturing or developing a relationship over time.

And so at this point, because I do this for every talk too, I want you to raise your right hands and repeat after me.
I do solemnly swear.
AUDIENCE: I do solemnly swear.
PAMELA PAVLISCAK: To everyone in this room.
AUDIENCE: To everyone in this room.
PAMELA PAVLISCAK: To friendly robots everywhere.
AUDIENCE: To friendly robots everywhere.
PAMELA PAVLISCAK: And to Pamela.
AUDIENCE: And to Pamela.
PAMELA PAVLISCAK: That I will always.
AUDIENCE: That I will always.
PAMELA PAVLISCAK: Always. Respect human emotion in design.
AUDIENCE: Respect human emotion in design.
PAMELA PAVLISCAK: And with the power vested in me as speaker, I now declare you an amazing audience.