00:06 In just a few minutes, Elon Musk will be joining us here live on stage for a conversation. Rumor has it there are a few things to talk about.
00:24 I just want to show you something. I want you to come with me to Tesla's huge Gigafactory in Austin, Texas. So the day before it opened last week, the evening before, I was allowed to walk around, and what I saw there was honestly pretty...
00:46 This is Elon Musk's famous "machine that builds the machine," and in his view the secret to a sustainable future is not just making an electric car; it's making a system that churns out huge numbers of electric cars with a margin, so that they can fund further...
01:04 When I was there, none of us knew whether Elon would actually be able to make it here today, so I took the chance to sit down with him and record an epic interview, and I just want to show you an eight-minute excerpt of it. So here, from Austin, Texas: Elon.
01:24 I want us to switch now to think a bit about artificial intelligence. I'm curious about your timelines and how you predict, and how come some things are so amazingly on the money and some aren't.
01:34 So when it comes to predicting sales of Tesla vehicles, for example, I mean, you've kind of been amazing. When Tesla had sold, that year, 60,000 cars, you said, "2020, I think we will do half a million a year."
01:48 Yeah, we did almost exactly half a million.
01:52 Five years ago, last time you came, I asked you about full self-driving, and you said, "Yep, this very year, I am confident that we will have a car going from LA to New York without any intervention."
02:06 Yeah. I don't want to blow your mind, but I'm not always right.
02:12 So talk about that. What's the difference? Why has full self-driving in particular been so hard to predict?
02:17 I mean, the thing that really got me, and I think it's going to get a lot of other people, is that there are just so many false dawns with self-driving, where you think you have a handle on the problem, and then, nope, it turns out you just hit a ceiling.
02:40 Because if you were to plot the progress, the progress looks like a log curve. So it's a series of log curves. Most people don't know what a log curve is, I suppose, but it goes up, you know, in sort of a fairly straight way, and then it starts tailing off, and you start getting diminishing returns.
03:04 In retrospect it seems obvious, but in order to solve full self-driving properly, you actually have to solve real-world AI.
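The "series of log curves" picture Musk describes can be illustrated numerically; a minimal Python sketch, with invented breakthrough times and units, showing how each curve climbs steeply and then tails off into diminishing returns:

```python
import math

# Hypothetical illustration of progress as a series of log curves:
# each breakthrough starts a new curve that climbs fast, then tails off.
# The breakthrough times and the log1p shape are invented for this sketch.

def progress(t, breakthroughs=(0, 4, 8)):
    """Total progress at time t: one log curve per breakthrough so far."""
    return sum(math.log1p(t - start) for start in breakthroughs if t >= start)

# Within a single curve, each step's marginal gain shrinks:
gains = [progress(t + 1) - progress(t) for t in range(4)]
assert all(gains[i] > gains[i + 1] for i in range(len(gains) - 1))
```

The diminishing-returns ceiling shows up as the shrinking `gains` sequence; a new breakthrough (the next tuple entry) is what restarts steep growth.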
03:10 Because, you know, you say: what are the road networks designed to work with? They're designed to work with a biological neural net, our brains, and with vision, our eyes.
03:24 And so in order to make it work with computers, you basically need to solve real-world AI, because we need cameras and silicon neural nets in order to have self-driving work for a system that was designed for eyes and biological neural nets.
03:50 You know, I guess when you put it that way, it's quite obvious that the only way to solve full self-driving is to solve real-world AI and sophisticated vision.
03:58 What do you feel about the current architecture? Do you think you have an architecture now where there is a chance for the logarithmic curve not to tail off anytime soon?
04:13 Admittedly, these may be famous last words, but I actually am confident that we will solve it this year. Like, what is the probability of an accident? At what point do you exceed that of the average person? I think we will exceed that this year.
04:32 And if we're talking again in a year, it's like, "Well, yeah, another year went by and it didn't happen." But I think this is the year.
04:37 Is there an element that you actually deliberately make aggressive prediction timelines to drive people to be ambitious, and that without that, nothing gets done?
04:48 So it feels like at some point, seeing the progress on the Tesla AI understanding the world around it led to a kind of aha moment at Tesla, because you really surprised people recently when you said probably the most important product development going on at Tesla this year is this robot, Optimus.
05:10 Yes.
05:13 Is it something that happened in the development of full self-driving that gave you the confidence to say, "You know what, we could do something special here"?
05:21 Yeah, exactly. So, you know, it took me a while to sort of realize that in order to solve self-driving, you really needed to solve real-world AI.
05:31 And at the point at which you solve real-world AI for a car, which is really a robot on four wheels, you can then generalize that to a robot on legs as well.
05:41 The things that are currently missing are enough intelligence for the robot to navigate the real world and do useful things without being explicitly instructed.
05:53 So the missing things are basically real-world intelligence and scaling up manufacturing. Those are two things that Tesla is very good at, so then we basically just need to design the specialized actuators and sensors that are needed for a humanoid robot.
06:11people have no idea this is going to be
06:16 But so, talk about... I mean, I think the first applications you've mentioned are probably going to be manufacturing, but eventually the vision is to have these available for people at home, correct?
06:25 If you had a robot that really understood the 3D architecture of your house, and knew where every object in that house was or was supposed to be, and could recognize all those objects, I mean, that's kind of amazing, isn't it? Like, the kind of thing that you could ask a robot to do would be what? Like, "tidy up"?
06:47 Yeah.
06:51 Or make dinner, I guess? Mow the lawn?
06:54 Take a cup of tea to grandma and show her family pictures.
06:59 Exactly: take care of my grandmother, and make sure...
07:02 Yeah, exactly. And it could obviously recognize everyone in the home.
07:06 Yeah. It could play catch with your kids.
07:08 Yes. I mean, obviously, we need to be careful this doesn't become a...
07:14 Like, I think one of the things that's going to be important is to have a localized ROM chip on the robot that cannot be updated over the air, where if you, for example, were to say, "Stop, stop, stop," if anyone said that, then the robot would stop. And that's not updatable remotely. I think it's going to be important to have safety features like that.
07:37 Yeah, that sounds wise. And I do think there should be a regulatory agency for AI. I've said this for many years. I don't love being regulated, but, you know, I think this is an important thing for public safety.
07:47 Do you think there will be, basically, in, say, 2050 or whatever, a robot in most homes?
07:57 You'll have your own butler, basically.
07:59 Yeah, you'll have your sort of buddy, probably, yeah.
08:02 I mean, how much of a buddy? How many applications have you thought of? You know, can you have a romantic...
08:13 I mean, I did promise the internet that I would make catgirls.
08:25 Yeah, I guess it'll be whatever people want, really, you know.
08:27 So what sort of timeline should we be thinking about for the first models that are actually made and sold?
08:39 You know, the first units will be for jobs that are dangerous, boring, repetitive, and things that people don't want to do. I think we'll have an interesting prototype sometime this year. We might have something useful next year, but I think quite likely within at least two years. And then we'll see rapid growth year over year of the usefulness of the humanoid robots, and a decrease in cost, and scaling up production.
09:07 Help me on the economics of this. So what do you picture the cost of one of these being?
09:10 Well, I think the cost is actually not going to be crazy high. Like, less than a car.
09:16 Yeah. But think about the economics of this: if you can replace a $30,000, $40,000-a-year worker, which you have to pay every year, with a one-time payment of $25,000 for a robot that can... doesn't go on vacation... I mean, that could be a pretty rapid replacement of certain types of jobs.
09:43 How worried should the world be about that?
09:44 I wouldn't worry about the sort of "putting people out of a job" thing. We're actually going to have, and already do have, a massive shortage of labor. So I think we will have not people out of work, but actually still a shortage of labor, even in the future.
10:06 This really will be a world of abundance. Any goods and services will be available to anyone who wants them. It'll be so cheap to have goods and services, it'll be ridiculous.
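The worker-versus-robot arithmetic from this exchange can be worked through directly; the $30,000-$40,000 wage and $25,000 robot price come from the conversation, while the midpoint wage and the five-year service life are assumptions for this sketch:

```python
# Figures from the conversation: a human worker costs $30k-$40k every year,
# while the robot is a one-time $25,000 payment. The midpoint wage, 5-year
# service life, and zero maintenance cost are simplifying assumptions.
ROBOT_PRICE = 25_000
WORKER_COST_PER_YEAR = 35_000  # midpoint of the $30k-$40k range
SERVICE_LIFE_YEARS = 5         # assumed

payback_years = ROBOT_PRICE / WORKER_COST_PER_YEAR
print(f"payback in {payback_years:.2f} years")  # under one year

savings = WORKER_COST_PER_YEAR * SERVICE_LIFE_YEARS - ROBOT_PRICE
print(f"savings over {SERVICE_LIFE_YEARS} years: ${savings:,}")
```

Under these assumptions the robot pays for itself in well under a year, which is why the passage calls the replacement of certain jobs "pretty rapid."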
10:27 ...an epic 80-minute interview, which we are releasing to members of TED 2022 right after this conference. You should be able to look at it on TED Live.
10:42 There's public interest in it, so we're putting that out to the world on Sunday afternoon, I think Sunday evening. But if you're into this kind of stuff, it's definitely a good thing to do over the weekend.
10:55 Hearing from Elon live... there's huge public interest in that, so we have opened up this segment to livestream, and we're joined right now by, I think, quite a few people around the world. Welcome to Vancouver, welcome to TED 2022. You're joining us on the last day of our conference here, in a packed...
11:17 We've been hearing all week from people with dreams about what the next... of humanity is going to be.
11:28 ...the biggest visionary of them all: Elon Musk.
11:46 So, Elon, a few hours ago you made an offer to buy Twitter.
12:06 ...tweeted in my ear or something, I don't...
12:09 By the way, have you seen the movie Ted?
12:11 I have, I have.
12:23 Was there a question?
12:25 Why make that offer?
12:28 Oh. Well, I think it's very important for there to be an inclusive arena for free speech.
12:43 Twitter has become kind of the de facto town square, so it's just really important that people have both the reality and the perception that they're able to speak freely within the bounds of the law.
13:03 You know, so one of the things that I believe Twitter should do is open source the algorithm, and make any changes to people's tweets, you know, if they're emphasized or de-emphasized, apparent. That action should be made apparent so anyone can see that action has been taken, so there's no sort of behind-the-scenes manipulation, either algorithmically or manually.
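The transparency scheme described here, every promotion or demotion leaving a publicly visible record attached to the tweet, could be sketched as a simple append-only audit log. This is a minimal illustration, not Twitter's actual design; all class and field names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Tweet:
    text: str
    # Every ranking action is appended here and exposed publicly,
    # so there is no behind-the-scenes manipulation.
    audit_log: list = field(default_factory=list)

    def adjust(self, action: str, actor: str, reason: str) -> None:
        """Record an emphasize/de-emphasize action instead of hiding it."""
        self.audit_log.append(
            {"action": action, "actor": actor, "reason": reason}
        )

t = Tweet("hello world")
t.adjust("de-emphasize", "spam-filter", "matched bot-network heuristic")
# Anyone inspecting the tweet can see exactly what was done and why:
assert t.audit_log[0]["action"] == "de-emphasize"
```

The key design point is that the log is append-only and travels with the tweet, so a reader can verify whether any action was taken without trusting the platform's word.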
13:31 Last week when we spoke, Elon, I asked you whether you were thinking of taking over. You said, "No way. I do not want to own Twitter. It is a recipe for misery. Everyone will blame me for everything." What on earth changed?
13:43 No, I think everyone will still blame me for everything. If I acquire Twitter and something goes wrong, it's my fault. I think there will be quite a few arrows.
13:56 Yes. It will be miserable.
13:58 But you still want to do it. Why?
13:59 I mean, I hope it's not too miserable. I just think it's important to the function of democracy, it's important to the function of the United States as a free country and many other countries, and to freedom in the world, more broadly than the US.
14:32 You know, I think the risk... is decreased the more we can increase the trust of Twitter as a public platform.
14:45 And so I do think this will be somewhat painful, and I'm not sure that I will actually be able to acquire it.
14:50 And I should also say, the intent is to retain as many shareholders as is allowed by the law in a private company, which I think is around 2,000 or so. It's definitely not from the standpoint of letting me figure out how to monopolize or maximize my ownership; we'll try to bring along as many shareholders as we're allowed to.
15:13 You don't necessarily want to pay out 40 or whatever it is billion dollars in cash; you'd like them to come along.
15:21 I mean, I could technically afford it.
15:28 What I'm saying is, this is not a way to sort of make money. It's just that my strong intuitive sense is that having a public platform that is maximally trusted and broadly inclusive is extremely important to the future of civilization. But, you know, I don't care about the economics at all.
16:00 OK, so that's core to hear: this is not about the economics, it's for the moral good that you think it will achieve. You've described yourself, Elon, as a free speech absolutist. Does that mean that there's literally nothing that people can't say, and it's OK?
16:20 Well, obviously, Twitter or any forum is bound by the laws of the country that it operates in. So obviously there are some limitations on free speech in the US, and of course Twitter would have to abide by those rules.
16:38 Right, so you can't incite people to violence, like a direct incitement to violence; you can't do the equivalent of crying "fire" in a movie theater, for example.
16:50 No, that would be a crime. It should be a crime.
16:53 But here's the challenge: it's such a nuanced difference between different things. There's incitement to violence: that's a no, it's illegal. There's hate speech, of which some forms are fine. You know, "I hate spinach."
17:13 I mean, if it's sautéed in a, you know, cream sauce, that would be quite...
17:20 But the problem is, so let's say someone says, OK, here's one tweet: "I hate politician X." The next tweet is, "I wish politician X wasn't alive," as some of us have said about Putin right now, for example. Another tweet is, "I wish politician X wasn't alive," with a picture of their head with a gunsight over it. Or that, plus their address.
17:48 I mean, at some point, someone has to make a decision as to which of those is not OK. Can an algorithm do that, or surely you need human judgment at some point?
18:02 Twitter should match the laws of the country, and really, you know, there's an obligation to do that.
18:14 But going beyond that, having it be unclear who's making what changes to what, having tweets sort of mysteriously be promoted and demoted with no insight into what's going on, having a black-box algorithm promote some things and not other things, I think this can be quite dangerous.
18:36 So the idea of opening up the algorithm is a huge deal, and I think many people would welcome that: understanding exactly how it's making decisions, and being able to critique it.
18:46 Critique it, like, "I want to improve it." What I'm wondering is... I think the code should be on GitHub, you know, so people can look through it and say, "I see a problem here; I don't agree with this." They can highlight issues and suggest changes, in the same way that you sort of update Linux or Signal or something like that.
19:06 You know, but as I understand it, at some point what the algorithm would do is look at, for example, how many people have flagged a tweet as obnoxious. At some point a human has to look at it and make a decision as to whether this crosses the line or not. The algorithm itself can't yet, I don't think, tell the difference between legal-and-okay and definitely obnoxious. And so the question is which humans make that call.
19:39 I mean, do you have a picture of that? Right now, Twitter and Facebook and others have hired thousands of people to try to help make wise decisions, and the trouble is that no one can agree on what is wise. How do you solve that?
19:55 Well, I think we would want to err on the side of letting the speech exist. If it's a gray area, I would say let the tweet exist.
20:14 But obviously, in a case where there's perhaps a lot of controversy, you would not want to necessarily promote that tweet.
20:25 I'm not saying that I have all the answers here, but I do think that we want to be very reluctant to delete things, and just be very cautious with permanent bans. Timeouts, I think, are better than permanent bans.
20:49 But just in general, like I said, it won't be perfect, but I think we want to really have the perception and the reality that speech is as free as reasonably possible.
21:04 And a good sign as to whether there's free speech is: is someone you don't like allowed to say something you don't like? If that is the case, then we have free speech. And it's damn annoying when someone you don't like says something you don't like. That is a sign of a healthy, functioning free speech situation.
21:31 I think many people would agree with that. And look at the reaction online: many people are excited by you coming in and the changes you're proposing, and some others are absolutely horrified.
21:41 Here's how they would see it. They would say: "Wait a sec. We agree that Twitter is an incredibly important town square; it is where the world exchanges opinion about life. How on earth could it be owned by the world's richest person? That can't be right."
21:55 So how do you... I mean, what's the response there? Is there any way that you can distance yourself from the actual decision-making that matters on content, in some very clear way that is convincing to people?
22:10 Well, like I said, I think it's very important that the algorithm be open sourced and that any manual adjustments be identified. Like, if somebody did something to a tweet, there's information attached to it that that action was taken. And I won't personally be, you know, in there.
22:36 But you'll know if something was done to promote, demote, or otherwise affect a tweet.
22:46 ...media sort of ownership. I mean, you've got, you know, Mark Zuckerberg owning Instagram and WhatsApp, and with a share ownership structure that will have Mark Zuckerberg the 14th still controlling those entities.
23:11 We won't have that at Twitter.
23:13 If you commit to opening up the algorithm, that definitely gives some level of confidence. Talk about some of the other changes that you've proposed. So you... the edit button: that's definitely coming if you...?
23:29 And how do you... I mean, I think the top priority I would have is eliminating the spam and scam bots and the bot armies that are on Twitter. You know, they make the product much worse. If I had a Dogecoin for every crypto scam I saw... you know, 100 billion dollars.
24:13 Do you regret sparking this sort of storm of excitement over Doge?
24:20 I mean, I think Doge is fun, and, you know, I've always said, don't bet the farm. But I like dogs and I like memes, and it's got both.
24:37 But just on the edit button: how do you get around the problem of, say, someone tweets "Elon rocks," and it's retweeted by two million people, and then after that they edit it to "Elon sucks," and then all those retweeters are embarrassed. How do you avoid that type of changing of meaning, so that retweeters...?
25:02 Well, I think, you know, you'd only have the edit capability for a short period of time, and probably the thing to do upon the edit would be to zero out all retweets and favorites.
25:16 I'm open to ideas, though, you know.
25:19 So in one way, the algorithm works kind of well for you right now. I just wanted to show you this. This is a typical tweet of mine, kind of lame and wordy and whatever, and look at the amazing response it gets.
25:36 And then I tried another one...
25:45 29,000 likes. So the algorithm, at least at the moment, seems to be, you know: Elon Musk expands the reach immediately.
25:58 Yeah, I guess so. I mean, that was...
26:01 But you've... so help us understand how it is you've built this incredible following on Twitter yourself, when, I mean, some of the people who love you the most look at some of what you tweet and think it's somewhere between embarrassing and crazy. Some of it's...
26:24 Is that actually why it's worked? Or why has it worked?
26:29 I mean, I don't know. I'm, you know, tweeting more or less stream of consciousness. It's not like, let me think about some grand plan about my Twitter or whatever. You know, I'm literally on the toilet or something, and I'm like, "Oh, this is funny," and then tweet that out. That's like most of them.
26:48 You know, oversharing.
26:51 But you are obsessed with getting the most out of every minute of your day, and so why not, you know?
27:00 I don't know. I just, like, try to tweet out things that are interesting, you know, and then people seem to like it.
27:08 So if you are unsuccessful... actually, before I ask that, let me ask this. If I... how can I say... is funding secured?
27:23 I have sufficient assets to do it. It's not a forward-looking statement, but I mean, I can do it if possible.
27:41 I mean, I should say, actually, even in the... with Tesla back in the day, funding was actually secured. I want to be clear about that. In fact, this may be a good opportunity to clarify that: funding was indeed secured.
28:00 I should say, why do I not have respect for the SEC in that situation? And I don't mean to blame everyone at the SEC, but certainly the San Francisco office. It's because the SEC knew that funding was secured, but they pursued an active public investigation nonetheless.
28:18 At the time, Tesla was in a precarious financial situation, and I was told by the banks that if I did not agree to settle with the SEC, the banks would cease providing working capital and Tesla would go bankrupt immediately.
28:32 So that's like having a gun to your child's head. So I was forced to concede to the SEC.
28:44 And now they say... it makes it look like I lied, when I did not in fact lie. I was forced to admit that I lied to save Tesla's life, and that's the only reason.
28:55 Given what's actually happened to Tesla since then, though, aren't you glad that you didn't take it private?
29:10 It's difficult to put yourself in the position at the time. Tesla was under the most relentless short-seller attack in the history of the stock market. There's something called "short and distort," where the barrage of negativity that Tesla was experiencing from short sellers on Wall Street was beyond all belief. Tesla was the most shorted stock in the history of stock markets, and that is saying something.
29:36 You know, this was affecting our ability to hire people; it was affecting our ability to sell cars.
29:44 Yeah, it was terrible.
29:47 They wanted Tesla to die so bad.
29:51 Well, most of them have paid the price.
30:02 So that was a really strong statement. I mean, obviously a lot of people who support you, I thought, would say: focus on what you offer the world on the upside, on the vision side. Don't waste your time getting distracted by these battles that bring out negativity and make people feel that you're being defensive. People don't like fights, especially with powerful government authorities; they'd rather buy into your dream. Aren't you encouraged by people to just edit, you know, that temptation out and go with the bigger story?
30:41 Well, I mean, I would say, you know, I'm sort of a mixed bag, you know.
30:48 I mean, well, you're a fighter, and you don't like to lose, and you are determined that you don't, basically.
30:58 I mean, sure, I don't like to lose. I'm not sure many people do. But the truth matters to me a lot; really, sort of pathologically, it matters to me.
31:09 OK, so you don't like to lose. If, in this case, you are not successful... if the board does not accept your offer, you've said you won't go higher. Is there a plan B?
31:29 I think we would like to hear a little bit about plan B.
31:37 For another time, I think.
31:39 Another time. Yeah, all right.
31:44 That's a nice tease. All right. So, to try to understand this brain of yours more, Elon, with your permission I'd like to just play this... Oh, actually, before we do that, here was one of the thousands of questions that people asked; I thought this was actually quite a good one. If you could go back in time and change one decision you made along the way, do your own edit button, which one would it be and why?
32:16 Do you mean like a career decision, or just any decision over the last few years? Like your decision to invest in Twitter in the first place, or...
32:26 Anything.
32:30 I mean, the worst business decision I ever made was not starting Tesla with just JB Straubel. By far the worst decision I've ever made is not just starting Tesla with JB. That's the number one, by far.
32:46 All right. So JB Straubel was the visionary co-founder who was obsessed with and knew so much about batteries, and your decision to go with Tesla, the company as it was, meant that you got locked into what you concluded was a weird architecture.
33:01 There's a lot of confusion on this. Tesla did not exist in any real sense; Tesla was a shell company with no employees and no intellectual property when I invested.
33:13 A false narrative has been created by one of the other co-founders, Martin Eberhard, and I don't want to get into the nastiness here, but I didn't invest in an existing company; we created a company.
33:27 And ultimately the creation of that company... unfortunately, there's someone else, another co-founder, who has made it his mission to make it sound like he created the company, which is false.
33:42 Wasn't there, right at the heart of the development of the Tesla Model 3, a period where Tesla almost went bankrupt? And I think you have said that part of the reason for that was that you overestimated the extent to which it was possible at that time to automate a factory: a huge amount was kind of over-automated, and it didn't work, and it nearly took the company down. Is that right?
34:13 First of all, it's important to understand what Tesla has actually accomplished that is most noteworthy. It is not the creation of an electric vehicle, or creating an electric vehicle prototype, or low-volume production of a car. There have been hundreds of car startups over the years, hundreds, and in fact at one point Bloomberg counted up the number of electric vehicle startups, and I think they got to almost 500.
34:47 So the hard part is not creating a prototype or going into limited production. The absolutely difficult thing, which has not been accomplished by an American car company in 100 years, is reaching volume production without going bankrupt. That is the actual hard thing.
35:04 The last American company to reach volume production without going bankrupt was Chrysler in the '20s.
35:12 Right. And it nearly happened to Tesla.
35:16 Yes. But it's not like, "Oh, gee, I guess if we'd just done more manual stuff, things would have been fine." Of course not. That is definitely not the case.
35:27 We basically messed up almost every aspect of the Model 3 production line: from cells, to packs, to the body line, the paint shop... everything. Everything was messed up.
35:51 And I lived in the Fremont and Nevada factories, fixing that production line, running around like a maniac through every part of that factory, living with the team.
36:10 I slept on the floor so that the team, who was going through it, could see me on the floor, so they knew that I was not in some ivory tower. Whatever pain they experienced, I had it more.
36:27 And some people who knew you actually thought you were making a terrible mistake: that you were driving yourself to the edge of sanity, almost, and that you were in danger of making bad choices. And in fact, I heard you say last week, Elon, that because of Tesla's huge value now and, you know, the significance of every minute that you spend, you are in danger of obsessing over spending all this time, to the point of... to the edge of...
37:03 That doesn't sound super wise. Isn't it the case that your completely sane, centered, rested time and decision-making is more powerful and compelling than that sort of "I can barely hold my eyes open" mode? So surely it should be an absolute strategic priority to look after yourself.
37:28 I mean, there wasn't any other way to do it. There were three years of hell, the longest period of excruciating pain. There wasn't any other way, and we barely made it. We were on the ragged edge of bankruptcy the entire time.
37:52so when you felt like i want
37:58those were three or three so so much
38:01but it had to be done or tesla would be
38:03dead when you looked around the
38:05gigafactory that we saw images of
38:08um last week and just see where the
38:11company's come i mean do you feel
38:13that this challenge of
38:16figuring out the new way of
38:18manufacturing um means that
38:21you actually have an edge now that it's
38:23different that what went wrong
38:29won't be repeated that you've actually
38:31figured out a new way of manufacturing
38:36at this point i think i know
38:39more about manufacturing than anyone
38:41currently alive on earth
38:50i can tell you how every
38:51damn part in that car is made
38:53which you basically learn if you just
38:55live in the factory for three years
38:58that was nice that was a poignant note
39:03as if someone wants to compose a symphony to
39:05that expression of confidence or
39:07something like that i have no idea what
39:12every aspect of a car six weeks to
39:16let's talk about scale right now you're in the
39:18middle of writing your new master plan
39:21and you've said that scale is at the heart of it
39:25why does scale matter why are you
39:26obsessed with that what are you thinking
39:30in order to accelerate the
39:32advent of sustainable energy
39:34uh there must be scale
39:36because we've got to transition a vast
39:39economy that is currently uh overly
39:41dependent on fossil fuels to a
39:43sustainable energy economy one where the
39:48yeah i mean we got to do it
39:54so so the energy's got to be sustainably
39:56generated with wind solar uh hydro
40:00geothermal i'm a believer in nuclear
40:02as well i think
40:07since solar and wind are intermittent you
40:08have to have stationary storage
40:10batteries and then we're going to
40:12transition all transport um to electric
40:17if we do those things we have a
40:18sustainable energy future the faster we
40:20do those things the less risk we
40:25pose to the environment
40:27uh so sooner is better uh and and so
40:31scale is very important um
40:33you know it's not about
40:35press releases it's about tonnage what
40:39matters is the tonnage of batteries produced
40:42and obviously done in a sustainable way
40:44and our estimate is that
40:47approximately 300 terawatt hours of
40:50battery storage is needed to transition
40:55electricity and heating and cooling
40:58uh to a fully electric situation
41:01there may be some
41:03different estimates out there but uh
41:06our estimate is 300 terawatt hours yeah
41:09so we dug into this a lot in the
41:10interview that we recorded last week and
41:12so people can go in and hear more of that
41:13but i mean the context is that that is i
41:15think about a thousand times the current
41:17installed battery capacity i mean the
41:21scale is breathtaking basically yeah and
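[editor's note: a quick back-of-the-envelope check of the two figures quoted above, illustrative arithmetic only, not from the interview: if 300 terawatt hours is about a thousand times today's installed capacity, that implies roughly 0.3 TWh (~300 GWh) of stationary batteries installed today]

```python
# Back-of-the-envelope check of the figures quoted in the interview
# (illustrative only; the 1000x ratio is the interviewer's rough framing).
TARGET_TWH = 300   # Musk's estimate for a full sustainable-energy transition
RATIO = 1000       # "about a thousand times" current installed capacity

# Implied current installed stationary battery capacity:
current_installed_twh = TARGET_TWH / RATIO
current_installed_gwh = current_installed_twh * 1000

print(f"implied current capacity: {current_installed_twh} TWh "
      f"(~{current_installed_gwh:.0f} GWh)")
# prints: implied current capacity: 0.3 TWh (~300 GWh)
```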
41:25yeah so your vision is to commit
41:27tesla to try to deliver on a meaningful
41:30percentage of what is needed yeah and
41:32to call on others to do the rest
41:34that this is a task for
41:36humanity to massively scale up our
41:38response and change the energy system
41:43it's like basically how fast can
41:45we scale um and encourage others to scale
41:50to get to that 300 terawatt hour
41:53installed uh base of batteries right
41:57and then of course uh there'll be a
41:59tremendous need to recycle those
42:00batteries and it makes sense
42:02to recycle them because the raw
42:03materials are like high grade ore um so
42:07people shouldn't think well there'll be
42:08this big pile of batteries now they're
42:09going to get recycled because
42:11even a dead battery pack is worth about
42:12a thousand dollars so
42:15um but this is what's needed for a
42:17sustainable energy future so we're going
42:19to try to take the set of actions that
42:21accelerate and bring the day
42:23of a sustainable energy future sooner
42:30there's going to be a huge interest in
42:32your master plan when you
42:34publish that um meanwhile i would love to know
42:39what goes on in this brain of yours
42:41because it is a pretty unique one
42:42i want to play with your permission this
42:44very funny opening from snl saturday
42:47night live can we have the volume there
42:48actually please sorry
42:50it's an honor to be hosting saturday
42:52night live i mean that
42:54sometimes after i say something i have
42:59to say i mean that so people really know that i mean it
43:02that's because i don't always have a lot of
43:05intonational variation in how i speak
43:09which i'm told makes for great comedy
43:13i'm actually making history tonight as
43:16the first person with asperger's to host snl
43:21and i think you followed that up with
43:23at least the first person to admit it
43:32so this was a great thing to say
43:34but i would love to know
43:38how you think of
43:40asperger's like whether you can give us
43:42any sense of even you as a boy what you
43:49understand with the benefit of hindsight
43:51can you talk about that a bit
43:53well i think everyone's
43:55experience is going to be somewhat
43:58but i guess for me the
44:00social cues were not uh intuitive so
44:05i was just very bookish and i didn't
44:13sort of intuitively understand uh
44:17what others meant by something
44:20i would just tend to take things very
44:21literally as if the words
44:24as spoken were exactly what they meant
44:28but that turned out to be wrong
44:31but people are not simply
44:32saying exactly what they mean
44:34there's all sorts of other things that
44:35are meant it took me a while to figure
44:40i was you know bullied quite a lot
44:46i did not have a sort of happy
44:49childhood to be frank it was quite rough
44:54but i read a lot of books i read lots of books
44:57and so that you know
45:01gradually i sort of understood more from
45:03the books that i was reading and watching
45:12but it took me a
45:14while to understand things that most
45:18people intuitively understand
45:20so i've wondered whether it's possible
45:23that that was in a strange way an
45:25incredible gift to you and
45:28indirectly to many other people
45:33brains you know are plastic and they
45:36go where the action is and if
45:39for some reason the external world
45:41and social cues which so many people
45:43spend so much time and mental
45:45energy obsessing over if that is partly
45:48closed off isn't it possible that that allowed you to see
45:56the world at a much deeper level than most
46:01i suppose that's certainly possible um
46:05i think there may be some value also from
46:08a technology standpoint because
46:11i found it uh rewarding to spend all
46:13night programming computers
46:15um just by myself and
46:20most people don't enjoy typing strange
46:22symbols into a computer by themselves
46:26it's not fun but i really
46:28liked it um so i just programmed all night
46:33um i found that to be quite enjoyable
46:37um but i think that is not uh normal
46:43so i mean you know i've thought
46:48it's a riddle to a lot of people of
46:50how you've done this how you've
46:51repeatedly innovated in these different
46:53industries and you know every
46:55entrepreneur sees possibility in the
46:58future and then acts to make that real
47:01it feels to me like you see
47:03possibility just more broadly than
47:06almost anyone and can connect with so
47:08you see scientific possibility based on
47:11a deep understanding of physics and
47:14knowing what the fundamental equations allow
47:16what the technologies are that are based
47:18on that science and where they could go
47:19you see technological possibility and
47:21then really unusually you combine that with
47:25economic possibility of like what it
47:27actually would cost is there a system
47:29you can imagine where you could
47:31affordably make that thing and
47:35sometimes you then get conviction that
47:37there is an opportunity here put those
47:39pieces together and you could do
47:44yeah i think one aspect of whatever
47:46condition i had um was i was just
47:49absolutely obsessed with truth
47:51just obsessed with truth
47:55and so the obsession with truth is
47:58why i studied physics
48:01because physics attempts to understand
48:04the truth the truth of the universe
48:06physics is just what are the
48:09provable truths of the universe
48:12and truths that have
48:16so for me physics was sort of a very
48:18natural thing to study
48:21nobody made me study it it was
48:23intrinsically interesting
48:24to understand the nature of the universe
48:27and then computer science
48:29or information theory
48:32also to just understand uh logic and
48:37you know there's also an
48:40argument that information
48:42theory is actually operating
48:44at a more fundamental
48:45level than even physics
48:53physics and information theory uh
48:55were really interesting to me so when
48:58you say truth i mean it's not
49:02so what you're talking about is the
49:04truth of the universe like the
49:05fundamental truths that drive the
49:07universe it's like a deep curiosity
49:09about what this universe is why we're
49:11here simulation why not you know we
49:14don't have time to go into that but i
49:15mean it's you're just deeply curious
49:19what this is for what this whole
49:21thing is yes i mean i think the why
49:24of things is very important
49:28uh when i was a teenager i don't know
49:33uh i got quite depressed about the
49:36um and i was trying to sort of
49:38understand the meaning of life looking
49:40at reading religious texts and and
49:44reading books on philosophy
49:46and i got into the german philosophers
49:47which is definitely not wise if you're a
49:49young teenager i have to say
49:52can be ripped out but dark
49:57much better as an adult um and
50:00then actually i ended up reading um the
50:02hitchhiker's guide to the galaxy
50:05which is actually a book on philosophy
50:08just sort of disguised as a silly humor
50:11book but actually
50:17uh adams uh makes the point that
50:21the question is harder than the answer
50:24um you know he sort of makes a joke
50:26that the answer was 42. um that number
50:28does pop up a lot um
50:31and 420 is just 10 times
50:34more significant than 42. okay
50:38you know there's um you can make a
50:40triangle with a 42 degree angle and two 69 degree angles
50:46so there's no such thing as a perfect
50:48triangle or is there
50:52but even more important than the answer
50:56is the question that was the whole
50:58theme of that book i mean is that yeah
50:59basically how you see meaning then it's
51:01the pursuit of questions yeah so i have
51:05you know a proposal for a world view or
51:07a motivating philosophy which is to understand
51:12what questions to ask about the answer
51:14that is the universe and to the degree
51:17that we expand the scope and scale of
51:21consciousness biological and digital
51:22uh we will be better able to uh ask
51:26these questions to frame these questions of
51:30why we're here how we got here and
51:32what the heck is going on
51:34and so that that is my driving
51:36philosophy is to expand the scope and
51:38scale of consciousness that we may
51:39better understand the nature of the universe
51:42elon one of the things that was most
51:47moving was seeing you hang out with your
51:49kids um here's a picture if i may
51:53it looks vaguely like a ventriloquist act
51:57i mean how do you know that's real
52:01so that's x and you know it
52:04was just a delight seeing you
52:06hang out with him and
52:09what's his future
52:12going to be i mean i don't mean him
52:14personally but the world he's going to live in
52:17what future do you believe he will grow up in
52:22well i mean a very digital future
52:32a very different world than i grew up in
52:37but i think we want to obviously do our
52:38absolute best to ensure that the future
52:40is good uh for everyone's children
52:46you know that the future is something
52:48that you can look forward to and
52:52you know you want to get up in the
52:53morning and be excited about the
52:54future and we should fight for the
52:56things that make us excited about the
52:58future you know the future
53:01cannot just be one
53:03miserable thing after another solving
53:05one sad problem after another there got
53:07to be things that get you excited
53:09things that make you want to live
53:12these things are very important
53:15you should have more of it
53:17and it's not as if it's a done deal like
53:19it's all to play for like the
53:23future could still be horrible there are scenarios where
53:24it is horrible but you see a pathway
53:28to an exciting future both on earth and
53:34in our minds through artificial
53:35intelligence and so forth i mean in your
53:37in your heart of hearts do you really
53:39believe that you are helping deliver
53:48i'm trying my hardest to do so
53:56i love humanity and i think
53:59we should fight for a good future for
54:01humanity and i think we should be
54:02optimistic about the future and fight to
54:04make that optimistic future happen
54:13i think that's the perfect place
54:15to close this thank you so much for
54:16spending time coming here and for the
54:19work that you're doing and good luck
54:21with finding a wise course through
54:23twitter and everything else all right