As most of you who are regulars know, this series is focused on creating a space to explore the role of design in shaping the future, so we're really excited to have this topic here today. We have a group of researchers and designers who are experts in the field of autonomous vehicles, and I'm really excited to have them here. They're going to share what they're learning as they work to make autonomous vehicles a valuable part of our future.

I'm going to introduce our three panelists. Ryan leads the UX design team at Waymo. Before Waymo, he led design teams at Google, Samsung, Xbox, and Motorola. He has a master's degree in design from the IIT Institute of Design, and his passion for designing for people, and his belief that technology can improve our daily lives, have been the driving forces behind his career, so he's very invested in this space. Wendy Ju, right here, currently teaches at Cornell in the information science program. She has a PhD in mechanical engineering and a master's in Media Arts and Sciences from MIT. She focuses on human-robot interaction, and she published a monograph on the design of implicit interactions in 2015. We're really excited to have her here. And Melissa Cefkin is a principal scientist and design anthropologist at Nissan Research. She has a PhD in cultural anthropology from Rice, and she previously worked at IBM Research, Sapient Corporation, and the Institute for Research on Learning, so that gives you a sense of her background. She explores togetherness and autonomy as they come together in advancing self-driving cars.

So we've got some really great folks. They're each going to give a short presentation, and then we'll jump into our conversation. With that, I'm going to hand it over to Ryan, who's going to come up first.
Thanks for coming out tonight. I'm excited about the time I get to spend with you, because this is the first time I've actually shared any of the work we're up to at Waymo with a design audience, so it's exciting, and it's a little bit nerve-wracking as well. I appreciate everybody coming out. As was mentioned, I lead the UX design team at Waymo. I've been there for a couple of years, and I guess what's notable about the stops in my career is that I have focused mostly on consumer products as a designer, and as Waymo transitions into actually becoming a service this year, we're very much focused on the consumer experience: how does the experience work, from being picked up in a car, to riding in the car, to reaching your destination? That's what I'm going to talk about tonight.

Okay, so we like to say at Waymo that we're building the world's most experienced driver. I think that's a really good way to think about the technology that exists in the cars today.
In Phoenix, we've announced that, since last year, we actually have people using our service on a daily basis, taking our cars to work, to home, to the mall. So what I'm going to do is share a little bit of what that experience is like when they're inside the car, and what they see inside the car. When you're a passenger, a rider, in the car, we think it's important to think about the communication that happens today between you and a human driver. When you're riding along with a human driver, there are two types of communication that happen. It can be direct communication: you could ask the driver, why are you taking this route, or why aren't you going when the light is green? But it can also be indirect, in the sense that you might notice the driver repositioning their hands on the steering wheel, so you get a cue that you're about to make a left turn, or you might see that the driver is watching a set of pedestrians across the street, and that's why you're waiting and not moving forward. As designers, when we think about being a passenger in a car that doesn't have a human driver, we think a lot about how to recreate that communication, because we think that's really the crux of building trust between passengers and our technology.
So when you get into one of our Pacificas today, this is what you would actually see. This is the interface that exists in the car today, and again, the goal of this interface is to build trust between our technology and our riders. I'm going to walk through a little bit of the interface design we've been working on and talk about some of the different elements, some of the different decisions we've made as designers in thinking about how this interface works. You may have seen pictures like this before. Internally, we call this X-view, and it's basically an image of what our cars can see, based on all of the sensors they have. Our engineers use it for development, and I usually like to joke that it was not designed by a designer; it's very much an engineering tool. As designers at Waymo, our focus is taking all of the information that the car can see and sense and translating it into something that's more usable by a passenger, or a rider, who just wants to go from point A to point B. I think it's obvious, thinking of ourselves as designers, that this view is way too complex; it's way too dense in terms of the information it displays.
So when we think about the interface you see in the car as a passenger, the first thing we do is think about this 3D scene we're looking at, and what we're doing here is trying to convey what the car can see around it. You can see there's a green path that shows the car's trajectory, where it plans to go. There's a difference between the road surface and the non-road surface. There are the crosswalks that we've chosen to render in the UI. There are pedestrians in this frame, and there are also cyclists. There's a subtle difference when you look at the cyclist, for example: there's a sort of blue pill, a blue shadow, underneath the cyclist, versus a slightly brighter, wider one underneath the pedestrians. We've chosen, for example, to render pedestrians using the laser points that we actually have access to, and what's cool about that (I'll show a couple of videos here in a second) is that you can actually see their arms and legs move. The same goes for cyclists. When it comes to traffic, though, the nearby vehicles, we render those as simpler shapes. What we're also doing in this 3D scene, to provide some context about where you are on your journey when you're in the car, is bringing in buildings, for example. This is data that we bring in to give the rider a sense of context, again, of where they are as they go from point A to point B.
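To make the idea concrete, here's a minimal sketch of that kind of per-class rendering policy. The class names, marker styles, and the `scene.draw` call are my own illustration, not Waymo's actual code; the point is only that each category of tracked object gets a deliberately different visual treatment.

```python
from dataclasses import dataclass

# Hypothetical rendering policy: each tracked-object class gets its own
# deliberately chosen visual treatment, as described in the talk.
@dataclass
class RenderStyle:
    geometry: str       # how the object is drawn
    ground_marker: str  # the "pill"/shadow drawn underneath
    animated: bool      # raw laser points let arms and legs visibly move

RENDER_STYLES = {
    "pedestrian": RenderStyle("laser_points", "bright_wide_pill", animated=True),
    "cyclist":    RenderStyle("laser_points", "blue_pill",        animated=True),
    "vehicle":    RenderStyle("simple_shape", "none",             animated=False),
    "building":   RenderStyle("simple_shape", "none",             animated=False),
}

def render_object(obj_class: str, points, scene):
    """Draw one tracked object with the style assigned to its class."""
    style = RENDER_STYLES.get(obj_class, RENDER_STYLES["vehicle"])
    scene.draw(points, geometry=style.geometry,   # `scene.draw` is assumed
               marker=style.ground_marker, animate=style.animated)
```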
One of the things that surprised me the most when I joined the team is how our car handles changing road conditions. On a certain day, we might approach a street where there's now a construction zone, and what we've chosen to do in the UI here is render those traffic cones. What's really interesting is that when we get people in the cars and they see this, it's one of those magic moments: they're really surprised that the car can see the 12 cones that are actually there. The reason we do this is that a construction zone can be a situation with the potential for a little bit of high stress, in terms of the rider making sure they understand that the car actually understands there's a construction zone here, that these aren't the usual road conditions. So that's one way we acknowledge that. Again, going back to that earlier example of the human driver and the rider, it's an example of the indirect communication that might happen: if you happen to glance up at the screen, you notice that the reason we might be taking a little bit longer, slowly navigating the situation, is that we're in this construction zone. The same thing happens with emergency vehicles. That's another situation where, if you hear an emergency vehicle approaching the car you're in, there's the potential, again, for a high-stress moment. So it's a subtle example of how we can show that the car knows the difference between an emergency vehicle and a regular vehicle; in the interface we're working on, we've chosen to make that distinction as well.
What we also spend a lot of our time doing is trying to figure out the right time and the right place to bring your attention to different elements in the UI. When we approach an intersection, for example, we highlight the crosswalks that are active, and what I mean by active is that those crosswalks have pedestrians in them, so they're the ones that matter. In this scene right here, where the car is trying to make a right turn, it's actually yielding to the pedestrians in the crosswalk closest to it. So it's a subtle way that we, again, try to draw attention to different elements in the UI without it being overwhelming. I'll show some examples here where the scene is actually moving.
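As a sketch of the rule being described (my own minimal reconstruction with an invented geometry API, not Waymo's code): a crosswalk is highlighted only when it lies along the car's planned path and currently has pedestrians in it.

```python
def active_crosswalks(crosswalks, pedestrians, planned_path):
    """Highlight only the crosswalks that matter right now: those on the
    planned route that currently contain pedestrians."""
    active = []
    for cw in crosswalks:
        on_route = planned_path.intersects(cw.polygon)  # assumed geometry API
        occupied = any(cw.polygon.contains(p.position) for p in pedestrians)
        if on_route and occupied:
            active.append(cw)
    return active
```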
Another example: when we are approaching a right turn, what we do is change the camera angle, again based on the information we see. What we're trying to do here is mimic what happens with a human driver: when you're making that right turn, you turn to your left to see what's coming from the left-hand side. So it's a bit of a signal there to emphasize what the car is doing; it's making that right turn, but it's paying attention to the objects coming from the left. You can also see in that example that there's a bit of a pulse, where we reveal some of the laser points every few seconds. That's just an indication to the user that there's a lot of data we can see and sense, but the reason we choose to do it periodically is so that it doesn't overwhelm the user with a lot of information on the screen.
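Here's a toy version of those two behaviors, under my own assumptions about the numbers and the API (the talk gives neither): the virtual camera swings toward where cross traffic comes from during a turn, and the full point cloud is revealed only on a fixed period.

```python
PULSE_PERIOD_S = 4.0  # assumed cadence; the talk only says "every few seconds"

def camera_yaw_for(maneuver: str) -> float:
    """Bias the virtual camera toward oncoming cross traffic,
    mimicking a human driver's head check during a turn."""
    if maneuver == "right_turn":
        return -35.0   # look left (degrees); illustrative value
    if maneuver == "left_turn":
        return +35.0   # look right
    return 0.0         # straight ahead otherwise

def should_pulse_laser_points(now_s: float, last_pulse_s: float) -> bool:
    """Reveal the raw laser points periodically rather than constantly,
    so the scene stays legible."""
    return (now_s - last_pulse_s) >= PULSE_PERIOD_S
```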
So if we think of that 3D scene as representative of what the car can see around itself, where as a rider or passenger you might glance up and make sure the car can see different pedestrians, or a cyclist, or nearby traffic, we also have what we call the status layer, a 2D layer that we place on top of our interface. The way I like to think about this is that it's our more direct way to communicate with you. It could be the fact that we recognize the traffic light here is green, that you're on this street, and that we're also in a school zone. This could be a moment where, most of the time, a rider might be heads-down on their phone, but they might sense the vehicle slowing a little bit and ask, what's the car doing? Why is it going so slow? This is a way we can provide a little bit more information: hey, we're in a school zone, and that's why we're actually going at this speed.
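In spirit, the status layer is a small set of context-triggered, plainly worded messages. A minimal sketch, with the trigger flags and message strings invented for illustration:

```python
# Hypothetical context flags mapped to direct rider-facing status messages.
STATUS_RULES = [
    ("in_school_zone",         "Driving slowly: school zone"),
    ("yielding_to_peds",       "Waiting for pedestrians to cross"),
    ("emergency_vehicle_near", "Yielding to an emergency vehicle"),
    ("pulling_over",           "Finding a spot to pull over"),
]

def status_message(context: dict) -> str | None:
    """Return the first applicable direct message, or None to stay quiet."""
    for flag, message in STATUS_RULES:
        if context.get(flag):
            return message
    return None  # most of the time, no banner: the 3D scene carries the load
```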
Here's another example, where we're approaching an intersection. Again, the car has qualities that a human driver doesn't have; for example, it's able to see in 360 degrees. What it's doing here is yielding to the pedestrians in this crosswalk, but if you were in the car, you might wonder why the car isn't just turning right. So this is an example of this type of messaging, where we're being a little bit more direct about what the car is thinking or what the car is doing; it's our way, again, to signal to the rider what the car is doing.
We do the same thing when you're getting close to your destination. You could imagine the car navigating into a busy shopping center, with a parking lot, a lot of traffic, a lot of pedestrians around, and what happens here is that we give some indication of what the car is doing: it's making its pull-over trajectory, but it hasn't quite come to a stop yet. Now, what you're not seeing here, or rather hearing, is that we also incorporate a lot of sound. Connor O'Sullivan is our partner when it comes to sound design; he's a sound designer at Google, and so we're thinking a lot about the audio experience as well. So that, in a nutshell, is some of the work the team is doing. And I wanted to get a quick plug in here, because I am doing some hiring: you can check out waymo.com, and we have a couple of positions up there. So, thanks.

Well, thank you for having me, and thanks for that; I loved some of what you've been saying, and I'm going to echo almost the same points about communication. I'm Melissa Cefkin. I lead a user experience group at the Renault-Nissan-Mitsubishi Alliance Innovation Lab in Silicon Valley, and we are taking on exploring essentially all aspects of autonomous vehicle development. Also in the lab is work that goes on with connected cars and mobility services, so we have a large scope of work. My team so far has been focused on some particular paths. I've been there three years, and it's a relatively new lab that's been in a bit of a start-up mode, but there are many other kinds of problems we could explore, and hopefully some of that will come up in our question and answer. I gave a little title to my talk because I wanted to point out what we're working on and what kind of space it is, to give you a little sense of some of the kinds of projects we've worked on, but also to already start the conversation a little bit about what it is that, as designers and people in this space, I think we need to be thinking about, and how we do that. I need some help in thinking about some of these problems, and this is what worries me, so we'll end with a worry.

Our UX space has been focused on some particular questions that we're asking.
How should autonomous vehicles behave? If you start to think about it, this question is rather different from Ryan's, which is maybe a little less concerned with what the vehicle is doing, and starts instead from: once the vehicle is doing what it's doing, how do you keep the customers, the people inside, engaged? We look at it from the outside and say: if people are interacting with this vehicle from the outside, as other road users (I think of these as passive users who are constantly encountering vehicles), what kind of experience is that, and what messages are we sending? That's where we also use a framework of direct and indirect communications that happen on the road, and I think we'll look at some of that. We also have the question of how autonomous vehicles should interact: one thing is the way they move about on the road; the other is, when they do encounter people, what kind of interaction is that? And then finally, the area we'll start to do more work on (there are other people in our lab who have been doing it, but we've done a little less from the UX side) is what they might do for people: what kinds of services, and what service-design opportunities, are there?

Hopefully I'm going to get this right. That translates into our work in a few ways. We do work that helps AV development, with the artificial intelligence team, the core technology team, as they develop their algorithms for the vehicle. We also do HMI work, both inside the vehicle and, as I'm going to talk about here, for the external part of the vehicle. And then there's work on robo-service vehicles and what those would look like.
So let me talk a little bit about this question of how autonomous vehicles might behave and interact with people. I'm an anthropologist, as you heard, and we're mostly social scientists; we have a strong favoring of field studies and empirical work, to see what happens today. It's not because we want to recreate exactly what happens today, but we're going to have to get people from how it is now to how it will be in the future, and so we also want to understand the environments these systems are moving into. So let's just take a look at what you might see if you go to Campbell, California, just down the road: a four-way stop, which is kind of a crazy American thing. People moving about, a family out on their bicycle trip, cars moving through the intersection following the rules as best we can tell: whoever arrives first goes, then the people to the right. Whoops, we had a little bit of drama there over who was going to go first, that pedestrian or that car. People carrying on. Another little moment here where people have to stop, figure it out, and negotiate.

If we look at this closely (I'll show the same video again, now sort of broken down), you'll see it's just filled with these interaction moments. So the question we're asking is: what happens if you remove the driver, and, over time, as people become acclimated, they're maybe not paying particular attention to this autonomous vehicle in those kinds of moments? Let's look at it again, now broken down. You see a pedestrian highlighted over there on the right; that pedestrian and this vehicle, the white SUV, have this little conflict over who's going first, and they both go, and they both seem to think the other is stopping or isn't paying attention. Meanwhile, these two vehicles arrive at just about the same time, and it's arguable who was really there first, so they have to figure it out. And then you have these pedestrians, where she quickly stops the gentleman from going, thinking that it's the car's turn; the driver, meanwhile, waves them on, so there's a very explicit kind of interaction there. What this tells us is that when people are on the road, they're constantly reading and interpreting what's happening, and it's a beautiful dance of micro-coordination that people are engaged in. There are also moments of sociality: we wave at each other, we smile at each other, we flip each other off. It's not always beautiful sociality, but we have a very human experience in those moments. So what does the future hold as that begins to shift?
The thing is also that it's not always quite like that. If you start to look around at the conventions of movement practices, they can be quite varied. So here, let's look at a different setting. This is a large boulevard; there are very large roundabouts in Tehran, and this is how you move in Tehran, or in large cities in Iran: basically, everybody keeps going, and you weave in and out of each other. It's a different approach, but it works. Not always, but it does work. It can be a little daunting; this was actually not a very busy day, so there's a lot of weaving, but again, there's a communicative practice that people are engaged in, and there are expectations and conventions about what to expect and how to move. Now, we are very good as humans: when we arrive somewhere, we can adapt. We can all travel around the world and walk across the street; we can even often rent cars and drive around. That's because we have all sorts of filters for distinguishing whether something is different or not, and if it's different, do I want to do the thing that's different, or am I going to sit back and do it how I want to do it? At least I can see what's different, and we can make that judgment immediately; we don't need reams of machine-learning data first to go through and then redirect us to tell us what's different. So those are the kinds of things that we look at to ask the question.
Oh, and I have one more; this one is really fun to watch. Especially watch the guy up in the top corner who's coming in. This is another Iranian city, again with people kind of everywhere: a guy walking down the middle of the roundabout; here we have a young man who's running to get somewhere; people walking arm-in-arm through the traffic. Keep your eye on him, because there's an interesting moment coming. What's interesting in this case is that this particular city, it's called Mashhad, is in the northeast corner of Iran, and it's actually a pilgrimage site. So chances are a lot of those people actually aren't from there; they're not locals, because the city gets visitors from around the world who come for this pilgrimage. So who knows who those people in the road are, and what that kind of adaptation is. So, what do we do with this?
With two minutes left, I'll very quickly say: we work with the system development team. System architecture, you all know these kinds of pictures. We think about the decision-making practices and whether and how they can or should be adapted to become what we call socially acceptable. Are there different social practices and expectations that you can detect, in part through the speed and velocity and density of everything around you? There are still differences, though: expectations on a college campus are somewhat different from downtown San Francisco, even if you can see similar densities of people and stuff. So we work with them on motion behavior.
Then the other area is that we've been exploring whether there's an opportunity, or a requirement, to add new signaling to the outside of the vehicle. We've been focusing on the problem of communicating intent: how do people do that in traffic, and would we want to add something we would call an intent indicator? So we've been developing concepts and testing those concepts for intent indication, and this is a really, really hard thing to test. We can talk about this more in the Q&A, because there are all sorts of assumptions about practices. If we go test it with people today, nobody knows to look for this thing; we all have expectations, and we're not encountering autonomous vehicles at scale, just as one-offs. So what will it mean to understand whether this new signal could become a resource that we weave into the fabric of how things work?
So those are some of the projects we're working on. I just wanted to say the worry for a moment. As you've been seeing, we're really concerned with these kinds of public environments that vehicles are moving into, and we have lots of history suggesting it hasn't always been the happiest history, in terms of who rules and dominates in these different environments. So the question is: how do we account for public environments in designing for these socio-technical, multi-agent, multi-object spaces, where we don't control all the components, but what we put out in the world is going to have an effect on people's everyday sense of pleasantness, interaction, safety, and confidence in what's happening on the road? That's where the worry comes in, and any ideas people have on how best to get there, we welcome. So, thank you.
So, who am I? I'm Wendy Ju. I'm an assistant professor at Cornell Tech; until very recently I worked at Stanford and at the California College of the Arts, so I still think of myself as a Bay Area person. I'm actually a human-robot interaction designer, and I kind of got into cars because of this statement: a car is basically a robot that you sit inside of. That's increasingly the case. It's not even just the Waymo cars; there's so much additional automation that the car is no longer just a mechanical system you control with your hands, but in fact something you actually have to negotiate and cooperate with in order to just drive down the road. My background in human-robot interaction has given me a lot of skills and techniques that we bring to looking at how we interact with cars, both current-day cars on the road and future autonomous cars. So I'm going to show you some of those techniques, and that's the main thing I'm going to cover in the next several minutes. I'm going to start by showing you some of the work I do in human-robot interaction.
One technique that's really prevalent in human-robot interaction is called Wizard of Oz. For those of you who don't know, that's where you actually have a person behind the curtain controlling a device or interface or robot or car, so you can pretend to be the automation. You're pretending to be the machine, and by performing the machine, you're able to conduct contingent interaction. This actually started when people were doing natural-language understanding: they were making systems, and if you tried to write a conversation from what you thought people would talk about, it would break down a couple of exchanges in. So it was important to have a person on the other side actually talking to people while pretending to be the machine; that way you could develop corpora of what you actually need to recognize and talk about. In more physical, mechanical terms, we actually puppeteer robots and then see how people interact with them. So here's a video of some of the work we're doing with a robotic garbage can.
[Video plays: passersby interact with the trash-barrel robot. "You okay? Are you okay?"]
So, I actually know Melissa pretty well, and I always love speaking with ethnographers, because the sort of research we do is a lot like ethnography: we like to go to the field, we like to see all the different reactions people have that you really can't anticipate. But as designers, we often do funny things to change the environment and experiment with how people are going to behave in the future. That's part of the technique, and it's a technique we're bringing to looking at how people interact with automated cars.

We have a system that we call an on-the-road driving simulator. That sounds funny, but what we do is have people sit in what feels like an autonomous driving scenario, by having them sit in the passenger seat of a car. We tape a steering wheel on in front of them, and there's a nine-dollar piece of foam core next to them so they can't see the actual driver, and we just tell people: pretend you're in an autonomous car, you're going to your friend's house. We actually give them something to do, so we can see when they look up, when they're nervous, whether they get motion sick, things like that. Behind the scenes, there's actually a person driving the car, and that person is driving as smoothly and robotically as we can manage. We actually spent some time tailing one of the (formerly Google X) self-driving cars, just to pick up what their style was, and we would mimic that style. And in the back seat we have an interaction wizard; that person is responsible for running the experiment, signaling things that are coming up on the road. We did some experiments to see whether pre-cueing the different things going on in the road helps people with motion sickness or improves things, but like with most field experiments, the things that end up mattering are not the things you went in there to test; it's all these other things that matter to people. So here's some video showing this.
[Participant reaction clips play.]

"I guess the computer is pretty cautious, which is pretty awesome. There were a number of obstacles that I could have imagined being pretty troublesome for a computer car, and it evaded them all: a lot of people on the side of the road, that small street, the construction, and traffic, all these different situations where it just basically did the right thing. It was a much better driver than Austin."

"So how did the ride go?" "I was impressed." "Did you feel safe in it?" "I did. There was a construction site that we went past; there was a construction vehicle, and the guy was waving the car back up, and then the guy made more hand signals [unintelligible]. I just don't fully trust that car to drive on its own. Even though I had no bad experiences with this car, it still seems strange to me that a car can drive itself [unintelligible]."

"Did you notice the car talking to you?" "I did." "[unintelligible]" "No, I thought it was very friendly." "Was it what you had imagined?" "It was what I imagined: a smooth, cautious, but normal driving experience, which is kind of what makes it cool. I think I was surprised how much I trusted it, even from the beginning when it said hello. That one thing gave it enough personality for me to trust it. It made me feel like, even though it wasn't a human, it wasn't of malicious intent; it just was driving carefully, slowly. Actually, it reminded me of a driving-school instructor, that pattern of driving." "Could you see all of these driving around in ten years?" "I can definitely."
Okay, so I'm going to focus on this last one a little bit. This guy was asked, do you trust the vehicle, and he said, yeah, I mean, it said hello, it didn't have malicious intent. When people who work on human factors talk about trust in automation, they mean: do you think the car is competent at what it's supposed to do? Do you feel comfortable, when you're overseeing it, that it's doing what it's supposed to do? What they don't mean is: it likes me, it's not trying to hurt me. But that is the way people interpret the word trust, and not just because they don't understand the questionnaire item; that's literally the kind of thing they think about. So if you have the car say hello, people like it more, but they might also be less likely to intervene, because, you know, it's their friend. So there are some really interesting things that come out of this kind of research that I think you wouldn't find out if you didn't actually run field experiments and open-ended experiments.

We also do some work looking at existing automation. This is some work we did with Renault and Nissan, and the setup is kind of cool. In the previous study we actually had two people in the car with the participants; in this setup, we're just using the fact that you can now do broadband video and audio really well over cellular. So we have people in an office in Sunnyvale watching video of people driving, watching the road conditions around them, looking at what's happening in the cabin, and then using a text-to-speech system to ask people about the experience in the car. We actually used this to interview people about their experience with the existing automation in the Q50, because even though these cars are on the road, the controls engineers don't actually totally understand what the user experience of using those systems is. This is an existing commercial product, and the designers don't have a good handle on that feeling, so it's really useful for them to be able to watch people as they're driving. Here's an example. The screen we're seeing is what the wizard is seeing; a lot of the things at the bottom are the telematics the car is spitting out; and at the very bottom is the text-to-speech interface, which is what the wizard can actually ask through.
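As a rough illustration of that kind of rig (my own sketch, not their actual software; pyttsx3 stands in for whatever speech stack they used, and the logging format is invented), the wizard types a probe and the car-side machine speaks it aloud:

```python
import time
import pyttsx3  # stand-in speech engine; the actual system is unspecified

def speak_probe(text: str) -> None:
    """Car-side: voice a question typed by the remote wizard."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

def wizard_console(log_path: str = "probes.log") -> None:
    """Wizard-side loop: watch the video feed, type probes at the moments
    that matter, and log them against a timestamp for later analysis."""
    with open(log_path, "a") as log:
        while True:
            probe = input("probe> ")  # e.g. "Do you feel the car is driving smoothly?"
            if probe == "quit":
                break
            log.write(f"{time.time():.1f}\t{probe}\n")
            speak_probe(probe)        # in the real rig this goes over cellular
```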
"Do you feel that the car changes lanes smoothly while engaged?" I mean, this is something that, as designers, we can actually use to understand the experience of being on the road better. Everyone is really excited right now about data science, about instrumenting everything and getting all the data and having all the video, but still, even in this day and age, the best way to find out how people feel about something like automation is to ask them. And it's incredibly powerful to be able to ask people how they feel right in the moment, not to ask them to try to remember the drive after the fact.

This last piece also uses Wizard of Oz, but in a slightly different way. We're looking at how pedestrians and autonomous cars can interact, and one of the things we're particularly interested in is what's going to happen in the future when you can't have safety drivers, when you don't have safety drivers behind the wheel. One of the things we've felt is that very often, when people are crossing in front of autonomous cars, they're interacting with the safety drivers, even though the safety drivers aren't actually controlling the car, and that's a little bit problematic. So we were interested in what happens in the future when you don't even have anything to look at: what will people then resort to looking at and negotiating with?
So in this study, which is also kind of a field study, one of the things that was interesting is that it took a while to get to a place where we could actually ask people about their interactions with the autonomous car. Because despite what we all claim we do ("we always make eye contact with the driver and establish their intent, and then we cross"), that actually very rarely happens. We would chase people down after they crossed the road and ask about the car, and they'd be like, was there a car? We'd say, the autonomous car, and they'd say, was there an autonomous car? And we're like, you looked at it, you aimed at it, we saw you. Basically, walking down the street is such an automatic behavior that if nothing exceptional is happening, people just perform it automatically and cross the street. They have a way of dealing with it, but they're not even thinking about it. In some of the videos you might have noticed that once people entered the crosswalk, we eased in a little bit and then stopped, and that made people look up. Those are the people we were actually able to interview about the autonomous car, because that causes a breakdown: there's some expectation of what the behavior is, and when that breaks down, the automatic behavior goes away, and people are actively deciding what to do, negotiating what's going on, and looking to see if there's a driver. So that's what's happening in some of these videos, but it's not what's happening all the time.

I think one of the really cool things about inventing this protocol is that we're in the Bay Area, and we can't find out what people everywhere else are doing. We noticed a lot of people would say things like, oh, I thought it was a bad driver, but then, an autonomous car, it did a good job. And we're like, oh, right, we're in Silicon Valley; people are pro-technology, they're glad we're trying stuff. But the really great thing is that by inventing this protocol and publishing it, all sorts of people have been taking it and trying these experiments in other places, some places where people are less excited and optimistic about seeing autonomous cars coming in. So that wraps it up for me. I will just add, as a little aside, that I am looking for graduate students, if anyone wants to leave their nice-paying job and join me.

It was really great to get to see the context and the space you're all working in, and the different types of research and design that you're doing. I'm curious: you shared a couple of specific examples of complex interactions, the four-way stop, the "driver" with the driverless car. What has really been the most challenging one? Can you think of one that's been the most difficult to research or to look at?
For me, from the standpoint of what the vehicle should do and where it's interacting with others, I'm sure there are many, but very large roundabouts are super interesting, because of the etiquette and expectations about moving all the way in, being on the inside, moving back out, cars coming from multiple angles, and then having to get back out and merge into traffic. I'm not going to say that we've fully designed for that yet, but that's the kind of thing that we explore. And merges, or sorry, exits: if you have roads where some traffic is going straight and some is exiting, the signaling and what happens on the road are a little bit different. The people going straight are going to get a clearer signal, but what the people turning are going to do is, again, very interpretive. So that's from a roadway standpoint. I think there are a lot of really interesting dimensions. Interestingly, when we tested our intention indicator, we also put together a vehicle (we didn't do quite the level of disguise that Wendy's group has done, but we treated it as if it were an autonomous vehicle and told people there was a safety driver) in a very crowded environment with a lot going on, and the pedestrian part is actually not very complicated, because that's an easy pattern of go, stop, go. It's all fairly predictable what people are doing, and it's easy to do that micro-negotiation and recognize, oh, that person's parking, I'll just wait, that kind of thing.

For me, the great white whale is actually some of the stuff that Melissa was showing: really pinning down the differences that we know are there in driving styles in different places. When you rent a car, you know how to drive the car, and the car is the same, but the moment you pull out of the airport garage, all bets are off. We have a feeling that the headway distance, the aggressiveness of the merge, all these things are different, but actually trying to instrument things enough that you can say exactly how they're different is really hard. We have this cool system, originally designed to study platooning, where we have multiple people driving in the same virtual world, and we wanted to take the whole thing to different places and study how people drive differently. But even setting up comparable situations, for example in the U.S. and in Japan, is incredibly hard, because there are so many contextual things that are different: the sign heights are in different places, the buildings, everything is spaced differently. And when you put people in an environment they're not used to driving in, you can't count on the behavior being naturalistic. So that's the thing that's hardest, that we're really interested in doing but are still trying to figure out how to do.

One example that comes to mind for us is the pick-up and drop-off experience. You can imagine, with a live driver, when you tell the service where to pick you up and where you want to go; it's very different when you're telling a robot where to pick you up and where you want to go. There are all these little micro-interactions that happen, like at the pickup point, where you might signal to the driver that you're the passenger, or you might text or call the driver. That's a big area for us. We've done a lot of work and seen a lot of these situations in Phoenix, and we're working on some features to really help smooth out that experience. An example would be: if you're giving a lat/long point as the spot for the robot to pick you up, your natural tendency is to want to get into the car and not have it continue on to its final destination. It gets even trickier when you imagine, say, a parking lot full of pedestrians and other traffic, and how the car navigates that. So we're working on some ideas to make that a better experience, but it's definitely a tricky one.
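A tiny sketch of the kind of choice involved, entirely my own illustration rather than Waymo's planner: given the requested lat/long, score nearby legal pull-over spots by the rider's walking distance and by how disruptive stopping there would be.

```python
import math

def haversine_m(a, b):
    """Approximate distance in meters between two (lat, lon) points."""
    R = 6371000.0
    dlat = math.radians(b[0] - a[0])
    dlon = math.radians(b[1] - a[1])
    h = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(a[0])) * math.cos(math.radians(b[0]))
         * math.sin(dlon / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

def choose_pickup_spot(requested, candidate_spots, walk_weight=1.0, block_weight=25.0):
    """Pick a legal pull-over spot near the requested point, trading off the
    rider's walk against how much the stop would block other traffic.
    `candidate_spots` is assumed to be (lat, lon, blockage_score) tuples."""
    def cost(spot):
        lat, lon, blockage = spot
        return walk_weight * haversine_m(requested, (lat, lon)) + block_weight * blockage
    return min(candidate_spots, key=cost)
```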
Yeah, it sounds like it. I think it's really interesting how you've talked about all these social norms and how we interact with people, with vehicles, and with robots. So I'm very curious: how do you study and understand those norms, and how do they actually shift when you're interacting not with a person but with a robot?

One of my favorite things is when I hear someone say, well, we can't study that, and we're like, oh yeah? Let's figure out how. A lot of the techniques we use are from stagecraft; that's how we do what we do.

For me, I would draw from the comparative method of ethnography, of original anthropology. Part of the reason we go to other places and get video is that we want comparative cases to work from, to think about what's going to happen over time in the future. The importance of doing that now, and the reason to, is to identify and clarify things that might be of more fundamental significance, in terms of architecture, sensors, capability, that will need to be adaptable in the future. That comparative ability to defamiliarize yourself and look in different kinds of ways is, I think, a really important one. There was something else I was going to say, but it's slipped my mind for now, so I'll let you go, Ryan.

Yeah, for us, I think we're fortunate, because we have the program up and running in Phoenix, where we have people using the service, so we're getting a lot of our insights from those interactions on a daily basis.
One of the reasons I like to include that historical clip of San Francisco from 1906 is that we also have to remember why I think it's challenging to know what the future holds: the whole set of conventions and patterns is likely to change in ways that we can't really know. If you have an unfortunate incident happen in Tempe, Arizona, people's sense of what they're looking for next maybe begins to adapt. And again, right now we don't have the scale of autonomous vehicles running around to know. What we see today, and very much what you experience, is that people don't give them that much regard; it's a very natural thing on the road. But I don't know that we can be certain it's always going to be a very natural thing, with those minute differences in how a robot might move, or where we have to pay attention. Once you have a lot of that out there, and people have learned that it moves a little differently, the whole thing kind of re-adapts. So there's even the question of how human-like it should be. It has been the goal at Nissan to develop human-like AI, but how human-like should it really be? Or is it more important, ultimately, for people to truthfully understand they're interacting with a robot, one that won't be precisely like what humans would be, hopefully in the good ways?

Yeah, that's touching on the sort of communication mechanisms that you're looking at, and understanding how we interact with these cars and what our expectations are. I'm curious: you've tried some of them; what has been successful in that realm?
For us, on the external side, we've been trying a few things. One thing is that you wouldn't use text, because people don't have time to read a sentence. It can also confuse: if you had "stopping" and "waiting" and "going," and there's some occlusion where people only get the "-ing," they don't know which "-ing" it is. So you'd have to be careful with text. Icons, of course, we know from design can be very powerful in some ways, but they're not always mutually understood either. So we've been playing with very abstract and very simple kinds of indicators, like a lighting system. We've learned that it would need to be visible from the front and the sides, and that it would need to operate in discrete states, because these are fast-moving, dynamic situations, even the very mundane ones. It's very different from designing for inside the car, where people can give it a sustained view; on the street, people can't afford to focus in one direction too long if there's a lot going on. So we're playing with that. But I'll also say that without that practice already being present, it's not like the whole thing stops without it. Waymo, as we know, does not do these sorts of things, and it's not a runaway success to say we would need additional signaling. So I think the question is whether it's a resource that will enter into the communicative practice or not, but it'll need to be simple, visible, and easy to parse.
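To make "discrete states" concrete, here is a toy sketch of what such an indicator's state machine might look like. This is my own illustration, not Nissan's concept; the state names, lamp patterns, and the `planner` flags are all assumptions.

```python
from enum import Enum

class Intent(Enum):
    """Hypothetical discrete states for an external intent indicator:
    few, unambiguous, and legible from the front and sides at a glance."""
    CRUISING = "steady"        # proceeding; no interaction expected
    ABOUT_TO_STOP = "pulsing"  # decelerating for someone or something
    WAITING = "sweeping"       # stopped, yielding; safe to cross
    ABOUT_TO_GO = "flashing"   # done yielding; about to move

def indicator_state(planner) -> Intent:
    # `planner` is an assumed object exposing simple yielding/motion flags.
    if planner.is_yielding and planner.speed == 0:
        return Intent.WAITING
    if planner.is_yielding:
        return Intent.ABOUT_TO_STOP
    if planner.starting_to_move:
        return Intent.ABOUT_TO_GO
    return Intent.CRUISING
```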
In that space, what it looks like to us, though this is colored by what we've been able to experiment with, is that people are really attuned to how the car moves, and they're reading that from really far away. For example, if you walk up to an intersection and you see a car, you're looking for the car to slow down a little bit, to notice that it sees you. The slowing down is the signal that you've been seen, and that's when people feel comfortable crossing. If the car doesn't slow down like that, but kind of coasts to a stop, people will just stand there for a really long time, even if there was time to cross the road, because they're just not sure what's going on. And even if you do signal something, if the motion doesn't correspond with the thing being signaled, people just go with the motion: "I'm going to cross in front of that." So it's really interesting, because a lot of the things you pick up from motion happen at a much greater distance than any of the things you can do with explicit signaling. In our ghost-driver experiment, it was actually decently difficult to get people to show up at the crosswalk at the same time as the car, because they would speed up or slow down so they could totally avoid the interaction. We were kind of surprised how much interaction avoidance people actually engage in, but they do that all the time. So that's a really interesting thing you have to think about: how you can design around that, or work with it.
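A minimal sketch of that reading of motion, under my own assumptions: a toy heuristic in which early, visible deceleration reads as "I've been seen," while a gentle coast reads as ambiguous. The thresholds are illustrative, not measured values from the studies.

```python
def pedestrian_read(car_speed_mps: float, car_decel_mps2: float,
                    distance_m: float) -> str:
    """Toy heuristic for how a pedestrian might read an approaching car's
    motion, per the observation that deceleration is the signal of being
    seen. Thresholds are illustrative, not measured values."""
    if car_speed_mps < 0.1:
        return "stopped: negotiate or cross"
    if car_decel_mps2 > 1.0 and distance_m > 20:
        return "braking early: I've been seen, comfortable crossing"
    if car_decel_mps2 > 0.2:
        return "coasting down: ambiguous, wait and watch"
    return "not slowing: stay on the curb"
```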
Fascinating. And for inside the car, what about the communication mechanisms between the car and the rider?

Yeah, we do a lot of experimentation today, trying to understand those moments where, again, you're a passenger, and you're maybe heads-down in your phone, or you might be enjoying the ride and looking out the window, but there might be moments where you have a question, or you sort of wonder what the car is doing. Through the research we're doing, we try to uncover those moments so that we can be proactive in the communication we provide. At that point it becomes a question of whether it's something a little more ambient in its form, there if the rider needs it, or a bit more direct, where we might actually want to interrupt what you're doing because we want to tell you something. Again, that's where we're working with Connor, the sound designer, and thinking about those types of things. It's always that balance between how much we want to interrupt you versus just being there if you do have a question.

That's interesting, and that's also related to this idea of trust, right? What helps to build trust in that experience? What are people looking for? What are the specific things that help them gain trust?
Well, I think we try to be transparent in terms of showing what the car can see, because the technology has what I often describe as superhuman qualities. An example would be that the car can see 360 degrees around it, unlike a human driver, and the car can also see three football fields away. With that vision of the world, the car is of course going to behave in a certain way, so through the interface in the cars we're trying to reveal a little bit of that magic, to show what the car can actually see, again not to the point where we overwhelm, but to show how intelligent and how sophisticated the system really is. The other thing in terms of trust (and we think the interface is a means to that end, the end being building trust) is what I think is also interesting: a sense of control as a passenger in the car. You start the ride when you're ready to go; the car doesn't automatically do that. You can pull over at any time; there's a pull-over button. And you can also tap a button and speak to a live rider-support person. That sense of control is what we've found has really helped ease, or build, that trust between the technology and being a passenger.
I'll say, early in my time working on autonomous cars, I got a ride in a Bosch autonomous car, and at some point the automation failed: it basically just stopped tracking the lane properly, on 280, where the sun was at a funny angle. The really interesting thing was that we were basically veering out of the lane (you could hear the bump-a-bump-a-bump) and I just felt totally calm. I really trusted the car; I really thought the car would take over. And the engineers were like, well, you know, hey... The thing that was interesting to me is that the way the car is going to fail is different from the way that people fail. What I was really struck by was that the feeling that everything is trustworthy and safe is maybe something dangerous, something I don't want to promote. I want to be really careful, because the things we look at as people to profile the driving of other people might not be the things you actually need to look out for. In some ways, promoting a sense of vigilance about the things that actually are the shortcomings of the machine will be the hardest thing to do, and it's difficult because the machines fail so seldom, and the way they fail is so unlike how we fail, that it's actually really hard to do that.
54:18Vehicle motion is the most important thing; it's the most communicative, so that's where you're going to get a lot of that trust, or not. And I would argue that some of it is built by the car behaving as we expect at a given moment: if a car is too hesitant and pauses too much, you're like, well, what's it doing? I'm not really sure. So moving in a way that feels right is a big part of it.
54:44The tech world is often accused of starting with the technology and then looking for the problem it solves. I'm curious: there seems to be a lot of opportunity here to apply this technology in ways that would really improve people's lives. What are the different ways you've thought about that, where it could actually solve some big problems?
55:06I think that, with enough development, easing the experience of being on the road and relieving some of the anxieties and pressures could be significant. I think we've normalized those pressures a lot; we all just accept that's how things are, so we wouldn't even think to voice them as a concern. But as you get relief from that micro-attention of constantly having to pay attention, once these systems get really that good and are out there, the benefit to the mobility experience could be very significant.
55:45There are also the benefits you can get ecologically from smoother driving practices, and the opportunity to shift to fleets, and all sorts of other ways that promise could play out. But there are a lot of countervailing pressures that will also intrude on some of those things.
56:05What I'm really in this for is to understand interaction with automation better, and I think the car is just a good starting place. If you're interested in automation, it's something everyone understands, and the driver is always sitting in a fixed location; it's kind of a dream for doing research.
56:19But the larger issue is that the way we've always dealt with interacting with automation is by training the humans who interact with it to accommodate all of the automation's shortcomings. Everyone is always asking me, we do this with fighter pilots, we do this with commercial pilots, surely we know how to interact with automation. We don't; those pilots have just been retraining every two years, all this time, to deal with these systems.
56:40So this is really the vanguard: we're at the moment in time when we're dealing with the problems of automation through design, actually fixing the automation and making it behave properly. And I'm really excited to be here, because I want to do...
56:54What comes to mind for me is some of the changing ownership models around vehicles. When I was younger, at fifteen and a half I would get my driver's permit, at sixteen I would start driving, and I had a car at sixteen. Nowadays that's really shifting and changing.
57:16There's a statistic out there that basically says your car is sitting idle most of the time. I have a car right now that's parked on the street here in San Francisco; it's not in use. So one of the things I get excited about is that you can think of self-driving technology as a service, really on-demand: when I need to get somewhere, I can use it.
57:38The other thing that comes to mind is people with disabilities, and being able to bring them mobility, or to make their day-to-day lives a lot easier, with a service like the one we're starting up in Phoenix.
57:55Great. Before I hand it over to the audience for questions, I'm curious whether you have any questions you might want to pose to each other.
58:08Well, Ryan, I'm really interested in how prototyping practice happens for the sorts of things you're doing, where the design isn't set and there's a kind of sense of play. What's the venue in which you get to experiment and try things out?
58:23I feel pretty lucky, because it's a fairly easy process. For stuff that we're doing in the car, we sit with our engineering team; the design team, the engineering team, and the researchers are a very close group. We can come up with an idea or a concept, work through it and build it out, get a car, and side-load it into the car. The easiest way is really just to test it on a closed course that we have in Mountain View: we're at a building that used to be a shopping mall, so there are a couple of parking garages there that are reserved for us, and we can go out in one of those areas and take a look.
59:06Now, if it's something that, on the spectrum of safety, isn't really going to affect the driving behavior, but is more of an experiment with the interface and the information we're showing, that's something we can do as a quick build; we can get out in the car, drive around Mountain View, and try to trigger some of the moments we're trying to capture or communicate on the interface. We spend a lot of time doing that as well.
59:32Then we have Castle, which is what we call our closed-course testing facility a few hours outside of San Francisco, where we're constantly running and testing different sorts of things. But I feel lucky as a designer: we get a chance to try things on a weekly basis, just in Mountain View, pretty much just outside the door.
01:00:02So I was also going to ask you, Ryan, about some of the interfaces you showed from the inside. The question for the HMI is, as you said, how much of the information to surface for people, and I'm curious what the dimensions of making that decision are. In the images we saw, you had other cars, pedestrians, and bicycles. Were we seeing all the cars, or do you leave some out, like the cross-traffic cars, whose implications are different for a vehicle that might be waiting? And have you found it important to engage images of different types of vehicles? It looked like there was one type; do you render them all the same? And a bit more of the thinking behind it: as you play with those sorts of thresholds, what are the factors that lead you to make those decisions?
01:00:56Yeah, we don't do a lot of filtering in terms of what you're actually seeing in that scene; you're pretty much seeing what the car is sensing. But when I joined the team, for example, we used to render the nearby traffic using the actual laser points, and because of the way the lidar works you'd see cars that looked half rendered in the UI, since you're only seeing the near half of each vehicle.
01:01:26To me, as a new designer coming onto the team, it felt a little weird, like I was looking at a UI that was half rendered and half not. My concern was that as we go out to a larger consumer audience, that UI might read as an advanced UI, or something you might see in the gaming space. Radiohead, I think, did a video years ago that they shot entirely with lidar, and it looks really cool, but for, say, my mom it might look a little strange.
01:01:57That's the reason why, for the traffic, we've gone with very simple shapes. For now we don't differentiate between a truck and a car and a big bus other than by size; we do use size, but we keep the shapes pretty standard across different types of traffic.
01:02:19The other example is that when it comes to people, the laser points work really well, because there's a little bit of a magic moment where you can see arms and legs moving. Trying to represent people with some iconic representation would just look kind of weird in that scene, and so that's why we mix in the laser points there.
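The display policy Ryan describes (one standard, size-scaled shape for vehicles; raw lidar points for people) can be summarized in a few lines of code. This is a hedged sketch under assumed data shapes; the class names and fields are hypothetical, not Waymo's:

```python
# Hypothetical sketch of the rendering policy described above: one
# standard, size-scaled shape for all vehicle types; raw lidar points
# for pedestrians. Names and data shapes are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Detection:
    kind: str        # e.g. "car", "truck", "bus", "pedestrian"
    length_m: float  # bounding-box size from the perception stack
    width_m: float
    lidar_points: list = field(default_factory=list)  # raw (x, y, z) returns

def to_render_primitive(d: Detection) -> dict:
    if d.kind == "pedestrian":
        # Keep the moving point cloud: arms and legs read as human.
        return {"type": "point_cloud", "points": d.lidar_points}
    # Every vehicle type shares one standard shape; only its size differs.
    return {"type": "rounded_box", "length": d.length_m, "width": d.width_m}

if __name__ == "__main__":
    scene = [Detection("bus", 12.0, 2.5),
             Detection("car", 4.5, 1.8),
             Detection("pedestrian", 0.5, 0.5, [(1.0, 2.0, 0.9)])]
    for det in scene:
        print(det.kind, "->", to_render_primitive(det)["type"])
```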
01:02:40Hi, I'm Joanne. I had a question for the panel. It seems like autonomous driving isn't an all-or-nothing situation: you have this ramp of cars on the road, slow at first and then more and more, and you have different design approaches to the form factor, with some cars looking just like any other car. Considering that mix of autonomous technologies on the road, do you think there's going to be some convergence toward one kind of design choice? Do you make it look overtly like a self-driving car, or should it just look like any other car on the road? Is there a need to standardize the external markings of what makes it autonomous, like a badge on the car, a lighting element, or something else, since maybe the sensors aren't going to be quite as obvious in the future?
01:03:34So, I think the verdict is still out on those, and those are exactly the questions being asked about what degree of convention to harmonize. I think we can at least say there needs to be a certain degree of harmonization around things like the movement patterns, and that if additional signals are added, they need to be interpretable across all the brands and the different makes and models.
01:04:01Speaking as somebody from a car company, we're assuming many makes and models, not just one format, and I don't know that you need just one format of vehicle in terms of actual cabin style and all that. But there's the question, again, of the kind of marking, and whether it will identify itself as a robotic car. If we're fully autonomous, that might be a little easier; until we get all the way to fully autonomous, is it a marking of an autonomous vehicle, or of an autonomous vehicle that's currently in autonomous mode? Because there might be times when it's not. So there are other kinds of questions still in exploration, unless you two know better.
01:04:54I was going to say that there's this professor at CCS in Detroit, Paul Pangaro, who says the ethical imperative for designers is always to increase choice. To some degree, in order for things to work, you want some standardization, some harmonization, but what I really fear as a designer is that we prematurely collapse onto a few designs before we've really explored the space and understood what's going to work well.
01:05:16When I do my research, one of the things I'm constantly trying to promote is the understanding that autonomy can come out in many different ways, and in some ways, because we use these cardboard-and-tape-and-string ways of prototyping the autonomy, it's easier for us to show all the different ways you can make things different.
01:05:35I really want people to be aware of that, because we're still at a moment in time when there are a lot of options for how we pursue autonomy. People are always trying to tell me why it has to be level four; I try to explain that all these things can be implemented in different ways, with different consequences. Let's actually explore it instead of making up our minds before we have any information.
01:05:58Yeah, I completely agree with that. At Waymo we definitely believe in that idea of choice, and we see a world where there's room for a lot of different types of vehicles on the road. We announced the trucking work we're doing a few weeks ago, and earlier this week we announced the partnership we're doing with Jaguar.
01:06:16So you could imagine: you might want to take the family to the mall or to the movies, and that might work well with the minivan we have; or maybe you're going on a date and want something a little nicer, so you go for the luxury option. At least in the near term, it's very much a world where, I think, the more choice the better.
01:06:35Hi, my name is Mia; I'm a product designer at Lyft. My question is: oftentimes in ride-sharing experiences the passenger will make decisions, or suggest them to the driver, for example, don't take the freeway during rush hour, take the side roads. How does that kind of nuanced decision-making factor into the experience between the passenger and the car?
01:07:03And my second question is: as you think about level 5 autonomy and what's needed within society, how do you think about where society is today, and how to get to that North Star?
01:07:18So, one of the things that's interesting from human-robot interaction is that very often we design things with what we call the H model: you design a human-human interaction, you observe that human-human interaction, and then you model the robot interaction on it. But one of the things that's been coming out more recently is that people don't grant machines the same social status as other people, and they get particularly crabby when machines dictate decisions.
01:07:40I talk to all sorts of people in the industry who aren't very aware of that. They say the car is a better driver than the person, so the car should always trump you, the car should just take the right route, the car should just do things. I don't think they understand that people will revolt when that happens, because it's a machine making a decision for you, and the feeling that you don't actually have agency over your own body is an incredibly difficult one for people to deal with.
01:08:12I would just add, again, that along with communication, control is something we think is really important to allow a passenger in a car that's driven by Waymo's technology. Whether it's simple things like being able to start the ride when you're ready as a passenger, being able to pull over, or being able to get help, to get an actual person on the call with you to ask a question, those are all degrees of control, so I think it's something that's very important.
01:08:49A little bit on that last question as well. It's a very hard one, but I'm questioning a little bit that North Star, Holy Grail kind of idea you have there. I think it's important, even as a researcher, even as a car-company person, to keep asking: what are we really getting? Why do we think we have to get to that? What is the ultimate benefit, and what is the advantage? Because autonomy is not a one-way street.
01:09:15If you look at what autonomy means, it's a fairly interpretive notion, and it's always relational: it's freedom from something, or a liberty to do something; it's some agent vis-a-vis something else that's controlling or stopping or oppressing or whatever it's doing. So it's relational. What is it we're getting freedom from?
01:09:44And a lot of people are quite critical of the levels model, because in practice, with a lot of autonomy, you go back and forth. We have anti-lock brakes, right? There's autonomy in that, but it isn't full autonomy; sometimes you engage it, sometimes you don't. So that's just a little comment that it's actually not a straightforward notion to even understand what level five really is. Is it all the time, forever, always? What are the ins and outs of it?
01:10:17I guess it's more of a comment than a question, sorry, but what about using computer vision or biometrics to sense when the passenger is highly stressed, or when their heartbeat elevates, or when they think something is going wrong with the car, and reacting to them reacting? Is that something you're exploring?
01:10:42I'll say, just real quickly, that there's a lot of driver-monitoring kind of capability out there, and I don't know how much of it Waymo is using, but to get a strong sense of what the driver (or the person formerly known as the driver) and the passengers might be doing, you could use haptics in the seats or in the floor to sense whether people are making braking motions, whether the body is tensing. There's a lot of work looking at those kinds of things.
01:11:07And Nissan has done some research: what it showed at CES this year is a brain-to-vehicle technology, a cap that... I didn't do this research, and I shouldn't really speak about it, because I can tell you only about this much of it, but go look it up, it's very interesting. It senses the very initial, prior brain-activity waves, just to adjust for preferences and comfort; it's not meant to ever control or second-guess. So, in other words, there's a lot of work on those kinds of things.
01:11:52I have a student who's working on pilot biometric sensing for his PhD, and I think that's what it's appropriate for right now: cutting-edge research that barely works. I want people to understand that. If you want to know how people feel, the gold standard is to ask people how they feel.
01:12:10And this is knowing that people lie, that people obscure the truth, that people do all sorts of things to signal how they feel to one another. The things people actively project as signals are, I think, a lot more reliable indicators than things like your heart rate.
01:12:27For example, there are many reasons why your heart rate could go up. You could be stressed, but what valence that arousal has, and exactly what caused it, is something you can't know just by hooking somebody up to some sort of sensor. You really have to talk to people.
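As a toy illustration of the ambiguity Wendy describes, with entirely made-up numbers: a detector that only sees heart rate assigns the same "high arousal" label to very different rides, so the valence still has to come from asking the rider.

```python
# Toy illustration (made-up numbers): heart rate indicates arousal,
# not valence, so very different rides look identical to the sensor.

AROUSAL_THRESHOLD_BPM = 95  # illustrative cutoff, not from any study

rides = [
    {"bpm": 120, "self_report": "terrified: that merge felt way too close"},
    {"bpm": 120, "self_report": "thrilled: my first ride in a robot car"},
    {"bpm": 120, "self_report": "fine: I just jogged to the pickup spot"},
]

for ride in rides:
    label = "high arousal" if ride["bpm"] > AROUSAL_THRESHOLD_BPM else "calm"
    # The sensor output is identical in all three cases; only the
    # rider's own account distinguishes fear, delight, and exercise.
    print(f"sensor: {label:12s} | rider: {ride['self_report']}")
```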
01:12:43I'm curious about level four, where the car is highly autonomous but not fully autonomous, and people still have to intervene from time to time for edge cases. What are the realistic design considerations and expectations for riders when, in a sudden moment, they may have to make some smart decisions they weren't expecting to make?
01:13:08I don't know anybody who is hoping that a person will jump in with no time to spare, because the person can't react as fast as the car can to any situation. But scene complexity is incredibly hard for machines and super easy for people, so what you want is for the car to do a good job of visualizing what the situation is, and for the person to realize when the car can't see the situation well, for example. That's more the sort of situation level four is: people are hoping the rider will choose to intervene, not a "here you go, you drive" hand-over of the steering wheel.
01:13:51But just to step back: there's no one way that we've all agreed level four is going to happen. All sorts of people are working on all sorts of things, and one of the things we're sorting out right now is what the options are and what the right way to do it is, not just "should we do this."
01:14:05Melissa, I loved your comment at the end that everyone in the world is a passive user of an autonomous vehicle once it's present. I imagine that today, as autonomous vehicles hit the road, they need to signal and behave like a predictable human driver just to integrate. But if we fast-forward to autonomous cars being on the road only with other autonomous cars, they would theoretically have their own language, and that's the place we ideally get to, where we're all just passengers all the time.
01:14:43It seems like as soon as a human is present, the car needs to behave differently and signal differently, so there's a conflict for me in the car making a radical shift from speaking only to robots to speaking to humans as soon as they're present. Does that pose any sort of challenge, or is that a consideration?
01:15:03The first thing, again: if the premise is that there will ever be a world with no humans around, then I wonder what the heck we've done. What are the cars doing on the road without us? Aren't they for us? Isn't the point that we get to ride in them, or put things into and out of them? And do bicycles and motorcycles and all those other things go away?
01:15:27Of course, there are very clever ways that infrastructure can be used to separate traffic, so if we get to where the vehicles can communicate just among themselves, and the only time humans interact with them is in onboarding and off-boarding, then we have that kind of scenario, and maybe the bikes are next door. So I think it'll be an evolution; I'm not sure you'd have a sudden jump where they then interact differently together.
01:15:54Right now, as Wendy was making a great point of saying, many different things are going on: people are already looking at vehicle-to-vehicle communication and what that offers, and at vehicle-to-infrastructure communication, so I think that evolution is happening together with this. Unless I've misunderstood your question, I'm not sure it necessarily becomes a radical disjuncture.
01:16:19But the little things that would happen, yes: they can drive much closer together, they can go at a consistent speed, maybe a higher speed, and there's the opportunity for even cities and streets to change, because you don't need the same amount of space. All those sorts of things are maybe where it begins to look and feel and act very differently. So those are some thoughts; you could add to them.
01:16:46Sure. One thing I think is exciting for designers is that we deal with path dependencies: your design is always coming into a context where other things came before. There are situations where you don't have a path dependency, but here you're using existing roads with existing users, and you'd have to push them off if you want the future you're talking about.
01:17:08There's already a lot of automation in construction and in agriculture, and those are situations where there's more ability to control every single thing in the environment. So the thing that's interesting about on-road automation is the ability to engage with all the other things that are going on, and what I really don't want is a future where we solve the problems by making it easier for the automation, by pushing everyone else off the road. But that's just my more political, personal take on things.
01:17:37Thank you so much for the discussion. I was curious whether in your research you've seen any rebellion against, or abuse of, these systems, the self-driving cars. What does that look like, and what can we do about it?
01:17:52In my short two years, I can't say that I've seen anything personally. I've heard stories, and I don't know whether it was actually Google's cars, but I've heard of people having different reactions, pedestrian to car, where there might be some sort of an interaction. But I can't speak to a specific moment.
01:18:24We ran a wild-card experiment where the car was meant to look like an autonomous vehicle with a safety driver, and we certainly had one person very actively clowning around and trying to mess with things.
01:18:41And there have been reports of that kind of thing happening here with the testing going on in San Francisco. The question I have about some of that is how much of it is gaming, the potential for which is of course very real: not malicious gaming, but "I can take advantage of the car because it'll stop for me, so I can just keep going and not worry." That's a very real consideration. But for the more active rebelling, doing things to disrupt or interrupt it, the question I would have is how much of that is a novelty effect and how much will be a persistent kind of thing.
01:19:17I'm sure there are probably more cases than get reported; we've only seen the one that I can think of.
01:19:26All right, well, thank you all for your fantastic questions, and thank you for joining us. That was really fantastic, and we hope you'll join us next month.