00:00look there's the other scenario and I
00:01would just call that one the Cheetos and
00:02meth scenario and PlayStation and
00:05Playstation right and like and I like
00:08Netflix I'm a fan of Netflix but like
00:09maybe not 12 hours a day that's the
00:12cow cows are great um but like I don't
00:21cows hello and welcome to the mark and
00:24Ben podcast today we're going to talk
00:26about a post that Mark recently wrote uh
00:30called the Techno Optimist
00:33Manifesto and like all good manifestos
00:36many people loved it many people hated
00:39it uh and so there that's given us a lot
00:41to talk about um I just want to actually
00:44point out my favorite uh of the uh
00:48people that hated it was an article that
00:51was published in TechCrunch called uh
00:53when's the last time Marc Andreessen has
00:55spoken to poor people or a poor person
00:58or something like that uh and the the
01:00thing that's so funny about it is that
01:03Mark of all the people I know I probably
01:06don't know anybody who's more self-made
01:08than mark because he grew up in a Tiny
01:11Town in Wisconsin he went to Public
01:13Schools like not good Public Schools
01:15like probably some of the uh worst
01:18public schools in the country and um and
01:22like never got any money you know from
01:24home not because his parents didn't love
01:25him they didn't have any money to give
01:26him uh so and then the people who wrote it
01:31all went to like the fanciest schools
01:33I've ever heard of um and you know Ivy
01:35Leagues and wonderful private high
01:38schools and these kinds of things so now
01:40we have people who grew up rich telling
01:43somebody who grew up poor and massively
01:45succeeded what's good for poor people
01:49who want to succeed so that's I just
01:52thought that was so funny
01:54anyway so this one's gotten a lot of
01:57kick to it so we're going to get right
01:59into it um the first question can I hey
02:01Ben can I go can I weigh in on that yes
02:03sir can I weigh in on that you you can't
02:05start that way and then not let me talk
02:06not let me say something
02:08sorry um so it's too tempting um look
02:12it's it's a uh the you know that sort of
02:15that response is it's a classic example
02:17of what Rob what Rob Henderson the
02:18author Rob Henderson calls luxury
02:20beliefs right and so right and so luxury
02:24belief right the definition of luxury
02:25belief is it's a belief that could be
02:26held by somebody who's in sort of a you
02:28know sort of an elite position uh you
02:30know position of power position of of of
02:32wealth and comfort um you know about how
02:34Society should be ordered that is
02:36incorrect um and the consequences of
02:38which would be disastrous right for the
02:39people uh that would be subjected to the
02:42consequences of that belief um but uh
02:44you know the people who hold the belief
02:45are insulated from the consequences
02:47right they live in you know kind of
02:48fancy places and and and have very good
02:50Lifestyles and and and aren't going to
02:52aren't going to suffer directly as a
02:53result so so anyway it's a classic it's
02:55a great example of luxury belief um you
02:57know the the sort of uh you know the
02:59sort of facile response to it of course
03:01is that capitalism and free markets um
03:03are are the machine that has lifted
03:05people out of poverty for you know 500
03:07years we've run the experiment the other
03:09systems don't seem to work as well many
03:12many many times you know and we this is
03:13one of those things and you know some
03:14amazing things we ran exactly to your
03:16point we ran this experiment you know
03:19hundreds of times in the in the in the
03:2020th century and and the results are
03:21very clear uh and you know look most
03:23recently in China you know there's a
03:24direct correlation between the degree to
03:26which the Chinese Communist Party kind
03:27of takes its boot off the throat of the
03:29people in terms of their ability to
03:30engage in markets and engage in trade
03:32and and the extent to which the their
03:34quality of life Rises and so you you
03:35know you kind of still see that you know
03:37playing out today and how quickly it
03:38reverses when they when they move to
03:40Central planning of 100% And so and you
03:43just see this over and over again and
03:45you know then and then you know you see
03:46it happen in other parts of the world
03:47also and so you know it's this thing
03:49where it's just like more more times
03:52running the same experiment are not
03:53going to generate different results um
03:55and and then the result of this of
03:56course is that this is kind of so
03:58obvious and well established at this
03:59point that you have to go to you know
04:01one of our finest Elite you know private
04:02high schools and private you know
04:03universities to really get
04:06um into the luxury belief system that's
04:10gonna exact trying to help you exactly
04:14so all right now that's a excellent
04:17excellent start um so this first
04:19question is from St Louis of techne what
04:22does effective pessimism look like how can people who want to mitigate risks make sure not to waste their time and actually contribute to moral progress you know look this is really
04:36tough right and you know let's start by
04:39you know conceding the you know kind of
04:40giving the devil his due kind of
04:41conceding the you know the the strength
04:42of the other side's argument on this um
04:44you know which is like look as I say in
04:46the essay like you know I'm not a
04:47utopian um you know technology is not
04:49purely a Force for good um you know
04:52technologies are tools and they
04:53can be used for both good and bad and
04:55you know virtually every technology that
04:56man has ever invented um has been used
04:58for both good and bad and so it's it's
05:00not that there aren't downsides it's not
05:05the yeah starting with fire I mean look
05:07like you know this is I reference in in
05:09in the piece uh this you know the myth
05:11of Prometheus which is kind of the
05:12origin myth in in in Western Society of
05:14of sort of the implications of
05:16technology and you know in the
05:18Prometheus myth you know Prometheus is
05:19the is the god that brought um brought
05:21fire to man and you know for that he was
05:22punished by being you know by by by Zeus
05:25by being chained to a rock and having
05:26his liver pecked out every night U by a
05:28bird and then it would regenerate in the
05:30morning and then it would happen again
05:31the next day so you know very Exquisite
05:33form of torture that Zeus came up with
05:35and and and you know the reason that
05:36myth is so powerful is because look fire
05:38fire was the enabler of you know heat uh
05:42and light and cooking food right and
05:45defense and shelter you know for for
05:47early man but it was also a weapon you
05:48know from the very beginning it was a
05:49weapon of War right and if if for
05:51example you're engaged in you know you
05:52know Siege combat and you know you're
05:55you're going up against a fortified you
05:56know Castle or city the way that you win
05:58is you burn them out right right um and
06:00by the way the way they retaliate is
06:01they pour boil you know boiling they
06:03heat up oil to boiling temperatures and
06:05they pour it on your head right so so
06:08yeah so so so so both sides of these are
06:11are true and this you know this this
06:12continues to be the case you know to
06:13this day um and and then basically these
06:15are kind of the two you know actually
06:17the effective pessimism is a clever
06:18framing um you know these are kind of
06:20the two kind of I don't know valences
06:22that you could apply to to this question
06:24which is you could apply you know one
06:25valence is basically fundamentally over
06:27time net you know everything thing
06:29technology has been primarily a Force
06:31for good primarily a force for Progress
06:33um and basically you embrace it and and
06:35and and and support it and accelerate it
06:37as much as you can and then and then you
06:38deal with the issues as as they arise
06:40you know which which is the story of the
06:41development of modern civilization um
06:44you know there there is another
06:45valence and and and you know people uh
06:47you know some people inclin more more
06:48naturally to to to the pessimistic
06:50position which is basically and this is
06:52true by the way both Technologies and
06:53markets people also apply the same kind
06:55of negative valence to markets which is
06:57well primarily you know technology or
06:59markets are a generator of bad things
07:01right um you know uh technology is a
07:03weapon of War you know technology is
07:05something that has negative you know
07:06unanticipated negative consequences
07:08markets as as uh you know you know
07:10markets markets have winners and losers
07:11right and so you know do you focus you
07:13know do you start out focusing more on
07:14the winners or on the losers um and so
07:18you just kind of have to decide you know
07:19where where you go in and what what I
07:21find you know the accusation of course
07:22from the pessimist is the optimists are
07:24too optimistic you know the the counter
07:26accusation of course is if you start out
07:29with a pessimistic frame it's very hard
07:31to hold that in a moderate position is
07:33what I is what I observe um the the
07:35pessimist sort of slide into greater and
07:37greater levels of pessimism quite
07:39quickly um and you know they end up very
07:42angry and bitter and hostile um and they
07:44end up advocating for extremely you know
07:46I would say Draconian and kind of
07:47Senseless policies um and so I I think
07:50it's hard to Be an Effective pessimist
07:53um you know yeah I think it's it seems
07:55to be much easier to become a a I would
07:57say you know either ineffective
07:58pessimist or just a flat out dangerous
07:59pessimist yeah you know Andy Grove had a
08:01great line on this somebody asked him uh
08:04was the microprocessor good or bad and
08:07he said well that's a crazy question
08:09it's like asking is steel good or bad it
08:12is like you're not going to hold back
08:14progress and so the what you have to ask
08:16yourself is you know how do you how do
08:19you make it good but but don't try and
08:22do that by banning it because well if you do
08:25that you're just gonna get
08:27frustrated yeah I mean the only thing
08:29I'd argue with Andy on that is you know they
08:30they do get banned right um and so um
08:33you know steel didn't get banned but uh
08:34you know nuclear civilian nuclear power
08:36did yeah right um and so you know the
08:39pessimists I mean look like I I I
08:40mentioned in the piece you know I think
08:42the single biggest policy mistake of my
08:44lifetime was the decision of the 70s
08:46effectively in the 70s into the 80s to
08:48ban essentially ban civilian nuclear
08:49power and and you know for sure
08:51throughout most of the us and then you
08:52know throughout certainly most of Europe
08:54as well with you know maybe France being
08:56the big exception um and then and then
08:57by extension throughout you know a lot
08:58of the rest of the world because it
08:59would have been Western you know
09:00countries and and and companies that
09:01would have brought it to to the rest of
09:03the world in that in that time period um
09:05and you know look I think the
09:06consequences for that like I think I
09:08think if you're environmentalist and you
09:09you kind of are looking at things
09:10dispassionately as Stewart Brand and
09:12others you know have been doing now you
09:13know for a while is you kind of say you
09:15know look we had the Silver Bullet for
09:16sort of unlimited zero emission energy
09:19um and we had it and we chose not to use
09:21it um and you know with everything we
09:23know today it's overwhelmingly both you
09:25know the the safe effective and you know
09:27kind of zero you know kind of risk you
09:29know zero risk of of you know of mass
09:31death zero risk of contributing to you
09:33know Carbon emissions and so forth um
09:35but you know look we made it we made we
09:36collectively made a political decision
09:37to ban it and we're you know paying the
09:39price for that today and you know quite
09:41frankly it's one of the reasons why
09:42Russia is able to do what it's doing in
09:43Ukraine is it has this flow of money
09:44from from oil um you know which in a
09:47counterfactual universe where the World
09:48by now had cut over to civilian nuclear
09:50power like they they wouldn't have that
09:51they wouldn't be able to do what they're
09:52doing so the the consequences of that
09:54decision play out decades later um and
09:56and I I think that's a great
09:57illustration of the risk of let's say
09:59not the risk of effective pessimism
10:00let's say the risk of dangerous pessimism yeah and it's a dramatic example of narrative defeating
10:08data because it's very very obvious from
10:10the data that nuclear is far safer than
10:13say oil or coal all right second
10:16question and this is from your friend
10:18and mine Shaka Senghor um how can we
10:21distribute transformative philosophies
10:24like yours to marginalized communities
10:27where a culture of Despair and
10:28victimization predominates
10:31predominates yeah so this is a really
10:33interesting question for me because I
10:35think that it it does get to the heart
10:37of like one of the reasons why uh
10:40victimization predominates is because um
10:44it's the kind of it's a techno
10:48pessimistic World um you know in those
10:51communities in the marginalized
10:52communities and I I I would say I go
10:55back to um you know there have been uh
10:59some very kind of interesting and
11:01successful leaders uh Marcus Garvey
11:04comes to mind who and I think it starts
11:08with you know something that he really
11:11really was um heavily behind which is
11:15self-determination that in an individual
11:19can change their change their own lives
11:22change their own circumstances and then
11:24you know he had this idea uh at the turn
11:27of the 1800s and
11:301900s um and you know which was a kind
11:34of much more difficult time to do that
11:37particularly for kind of black people in
11:39America um but you know he himself
11:42succeeded at it greatly and I think it
11:45really starts with that mindset where
11:49if you don't believe um that you can be successful in
11:52life or with a new technology or with
11:55kind of moving the world forward then
11:57you can't and Henry Ford's famous line
11:59you know there are two men uh one
12:01believes he can do it the other believes
12:03he can't can't do it and they're both
12:05right I think is the key to that whole
12:08thing is it just starts with the belief
12:10that you can succeed um and look the
12:13odds are harder for some than others but
12:15um you know you always end in despair if
12:18you believe you can't do
12:19it yeah yeah and I just add you know
12:23building a little bit on the on the uh
12:25on the the sort of uh
12:27uh argument um you know look on the
12:31consumption side so one this is sort of
12:33one of the amazing things about free
12:34markets like free markets are the most
12:36beneficial to the lowest income and the
12:38least advantaged and that is you know a
12:40you know that's a you know
12:41counterintuitive mind Bender for a lot
12:42of people who have been trained at elite universities in the opposite
12:48Direction but it's the the truth is it's
12:50it's the best for the poorest it's the
12:52best for the people who have the least
12:53um and by the least I mean you know the
12:55least existing wealth but also the least
12:56access you know the least social status
12:59um right um who are kind of on the
13:01receiving end of of not a lot of not a
13:02lot of advantage in their lives and and
13:04and you can kind of look at that on on
13:05two sides of their lives you can look at
13:06that for them as producers and as
13:07consumers um so on the production side
13:10you know markets open up opportunity
13:12right uh for uh you know for people to
13:14be able to make their way in the world
13:15and for you know to be able to have jobs
13:17be able to make money and then
13:17ultimately be able to support a family
13:20um and so you know again the
13:21counterfactual here you know the
13:22counterfactual is not you know a poor
13:24person right trying to navigate their
13:26way through you know the hell of a
13:27capitalist economy versus somehow in a in
13:29a socialist or communist system they'd
13:31be handed you know kind of handed
13:32everything they need right the reality
13:33is you know it's the other way around
13:35it's it's it's sort of the the the more
13:37capitalism the more opportunities there
13:38are the more available jobs the more
13:41lines right the more lines of work the
13:43more the the more openness and freedom
13:46and choice uh to be able to figure out
13:48how to uh you know how to succeed um and
13:50how to make money and what work to do um
13:53you know you end up on the wrong side of
13:55a you know authoritarian you know
13:57Communist Regime or a socialist regime
13:58or authoritarian regime right um uh you
14:02are you know you are screwed like there
14:05are no jobs for you right nobody's
14:06hiring there are no private employers um
14:09you know if the state you know for
14:11whatever reason doesn't want to hire you
14:12like you're out of luck if you're in a
14:14you know disadvantaged class category if
14:17you're in a you know disadvantaged you
14:19know race ethnicity gender sexual
14:22orientation any of these things or just
14:24you know political views or whatever or
14:26just that you're poor um and people look
14:28down on you like you that's it like
14:30you're done like at that point you're a
14:32ward of the state and you know whatever
14:33amount of you know small amount of grain
14:34they want to feed you to keep you alive
14:36fair enough but like you're not you're
14:37not going anywhere right and that and
14:39that was you know that was the story of
14:41of low-income people in most societies
14:42you know over over basically the
14:44entirety of human existence up to the
14:45point of the invention of markets and so
14:48so that's on the one side and then on
14:49the consumption side um I talk about
14:52this also in the essay is one of the big
14:54things that technology does uh in the
14:55free market context is it drives prices
14:57down um and and and this is a big thing
15:00on on inequality and especially income
15:01inequality that people I think Miss
15:03which is like one form of like
15:05determining the level of like income
15:07inequality or the the value of one's
15:08income or whatever right is to look at
15:10it in terms of like you know what
15:11literally the you know as demarcated
15:13measured by units of currency right and
15:15so I either I'm either making more money
15:16or less money look the other side of it
15:18right of that is what is money used for
15:20it's used to buy goods and services and
15:21so if the price of goods and services is
15:24falling that's the same effective thing
15:26to you from a standard of living
15:27standpoint as if you're getting an actual
15:29raise yeah right um and so so what
15:31markets do and Technology does is they
15:33drive prices down and the more they're
15:35allowed to operate the the more they
15:37drive prices down the the traditional
15:39way that economists talk about this is
15:40they make the observation which is as
15:42follows which is the Queen of England
15:43always wore Silk Stockings right like
15:46the super rich throughout all of society
15:48you know throughout all of time have
15:50always had access to the best of goods
15:51and services you know great you know the
15:53best of available food and the best of
15:54available healthcare right and so forth
15:56and so on you know of their time and
15:58place you know it was traditionally like
16:00that that stuff was all just you know
16:02silk stockings and everything
16:03else right were just completely out of
16:05reach of most people and and it is
16:06precisely the engine of free markets and
16:08technology that bring down prices uh so
16:10that regular people can afford these
16:11things uh uh as well yeah and I actually
16:14the you know maybe the most profound
16:16example of that is kind of the internet
16:18plus the smartphone because you you know
16:20when you and I grew up um information uh
16:24was kind of an elite thing to get to and
16:27you know a big reason to go to
16:28university was the knowledge was all
16:31there it was in books and libraries and
16:33all these kinds of things and you
16:34couldn't get you know you didn't
16:36actually have access to that otherwise
16:38um and and computers by the way also we
16:42we didn't have computers you know for
16:43for individuals and um and now you know
16:47the at least you know the the the kind
16:49of poorest people in America the
16:51homeless in San Francisco have better
16:53access to information and knowledge than
16:56the president of the United States did
16:58in you know 1980 uh which is
17:01unbelievable um so to your point okay
17:05let me give you give you one other one
17:07on that and this this I find this
17:08totally mind-blowing so you know when we
17:10you Ben you remember when we first
17:11started working on the internet in the
17:1290s you know there was just this sort of
17:14endless kind of hand-wringing at the time
17:16about this concept of the digital divide
17:18yes right um and right this this concept
17:20of basically you know digital technology
17:22the internet computers uh PCS and the
17:24smartphones were
17:25going to basically widen inequality
17:26because they it was basically well off
17:28people that were going to have them and
17:29then poor people wouldn't right um and
17:30then there was you know this is you know
17:32this is maybe the the effective
17:33pessimism of its time people were very
17:35very worried about that you know look
17:36sitting here today in 2023 the following
17:38is true uh more people in the world have
17:41computers in the form of smartphones
17:42specifically and internet access um than
17:45have electricity or running water in
17:47their homes yeah amazing right um right
17:51so the the the digital Technologies
17:52people worried about are actually the
17:53most egalitarian right of all
17:55technologies that have ever ever been
17:56produced even more than running water
17:58and electricity um and and so you know
18:01we so I mean we we should still be
18:02worried about like literally the Gap in
18:04access to fresh water like more than we
18:05should be access worried about the the
18:07Gap and access to like the you know the
18:08internet and again there one the water
18:12divide right the electricity divide is
18:14like still a thing but the the the
18:16digital technology divide actually turns
18:17out to not not be a thing and of course
18:19again the reason for that the reason for
18:20that is very straightforward the reason
18:21for that is is is is falling prices the
18:23reason for that is you know as the
18:25global smartphone market went to five
18:26billion people you know the the price of
18:28a smartphone collapsed to I don't even
18:30know today and you know in sort of the
18:31developing world you know it's I don't
18:33know 10 bucks or something um and then
18:35and then same thing you know internet
18:36access has plummeted uh in price over
18:38time uh because of Moore's Law and
18:40competition and and and Innovation um
18:42and so so again like if so if you if you
18:45if you the Paradox the the the flip side
18:47of this is if if you wanted a plan to be
18:50able to drive you know a something any
18:52form of good or service that is
18:53important to lots of people um to have
18:55it be available to everybody you know
18:57the thing to do is to lean hard hard
18:58into into markets and into technology
19:00right not not further away yeah great
19:02great Point okay next question from Max
19:10um somebody's gonna get me with one of
19:12these that's going to be like one of
19:14these kind of Simpsons jokes but uh
19:17technology makes life easier and
19:19is necessary for a better future however
19:22how would you address humans getting
19:24overly dependent on Tech to a point
19:26where we can't function without it
19:29yeah yeah so this is I would say there's
19:32kind of the full dystopian version of
19:33this um you know which is sort of the
19:35Wall-E scenario right um you know the
19:37movie in the movie Wall-E uh for those
19:39of you who haven't seen it you know
19:40Mankind in the future basically you know
19:42is all just like you know obese and you
19:44know literally sitting in these big like
19:45suspension zero gravity suspension
19:46chairs and you know basically you know
19:48binging binging Netflix and slurping you
19:50know some of that going
19:53now yeah this might sound a little a
19:55little familiar but in in in our times
19:58but but yeah look there there there is
19:59this sense of like okay we we kind of
20:01you know at some point like you know we
20:02live in sort of this automated you know
20:04I don't know Farm environment or
20:05something and we're kind of farm animals
20:06and we're we're being kept fat and happy
20:07but um you know we've kind of lost
20:09agency we've lost Free Will we've lost
20:11Choice we've lost you know any sort of
20:12you know sense of self-reliance
20:14self-sufficiency any sense of adventure
20:17um and you know look I I you know like I
20:18think there you know there's there's
20:19there's certainly some you know some
20:21some argument uh you know kind of in
20:22that direction you do see examples of
20:23that um you know what I what I try to do
20:26what I try to do in the essay and what
20:27what I believe on this is a little it's
20:28a little bit a little bit subtle um
20:30which is look you know there are really
20:33big questions uh about the meaning of
20:35life um that you know people have today
20:38and have had for a very long time right
20:40and a lot of the history of human
20:42civilization has been you know debates
20:44around you know religion and you know
20:47which gods to worship and moral
20:49principles and how to order a society
20:51and what you know the role of the
20:52collective versus the role of the
20:53individual is and you know all these
20:55policy questions that flow out from this
20:57and like you know the story of human
20:58civilization is in some way the story of
21:00trying to try to figure out all those
21:01questions and of course these questions
21:02are still you know at least for a lot of
21:04people still unanswered or are still
21:05open questions or being open you know
21:07open open to new all the time um and and
21:09and and I think it's putting frankly too
21:11much of a burden like I'm a you know I'm
21:12a I'm a I'm an enthusiastic a proponent
21:14of technology and markets but it's it's
21:16putting a little bit too much of a
21:17burden on technology and markets to
21:18expect technology and markets to answer
21:20all those questions yeah for all people
21:21and so I I I think if you're looking to
21:23technology and markets to answer those
21:24questions I think you're probably
21:25looking in the wrong direction you know
21:27I think you probably need to inside you
21:28know quite honestly inside the human
21:30soul um you know where all where all the
21:32hard questions lie um and then the the
21:34the observation I make is um Rising
21:37technological capabilities and Rising
21:39standard of living right through markets
21:41um open up the room right individually
21:44and collectively in our society to be
21:46able to ask those questions right so and
21:48the most obvious example of that is look
21:49when people are hungry they don't ask
21:51any of these questions the only question
21:52is like you know where's the meal
21:53coming from right and you can kind of
21:55elaborate that all the way up and you
21:57can kind of say look like if the
21:58ultimate human problem is that we're all
22:00you know we're all full you know we have
22:02full bellies our children are taking you
22:04know our children are are are going to
22:05live great lives you know we're able to
22:07support our family we're able to you
22:09know do all the things that technology
22:11is able to give us um and we still have
22:14these big unanswered questions about the
22:15meaning of life then basically what
22:16technology will have done is to open up
22:18the aperture to be able to actually
22:19spend more time trying to figure out
22:20those big questions yeah yeah yeah
22:23that's that that seems like a a very
22:26champagne problem for sure
22:28it's funny it reminds me of you know
22:30when I was in elementary school my
22:32brother was in junior high school I went
22:34to the school play and the play
22:36calculators had just come out and the
22:38whole play was about you know this uh
22:41Society where nobody knew how to do math
22:44and then the calculators all
22:46broke and so you know those kinds of
22:49fears I think go with technology but
22:51they tend not to play out in very real
22:54ways at least not until all the calculators break
22:58yeah well look here's the other kind of
23:01serious observation to make is
23:02technology increases resilience to
23:04natural disaster uh so so technology
23:06increase like so for example you know
23:08this this comes up actually a lot in the
23:09climate debate um which is basically you
23:11know and I'm not questioning climate change I'm making a
23:15different point but you know one of one
23:16of the questions over time is basically
23:17is nature getting more more or less
23:19dangerous right over time right as as as
23:22both as the world changes as Humanity
23:23changes and so forth and and basically
23:25um deaths from natural disaster have
23:28been in systemic decline for you know a century plus at this point right like
23:32it you know it used to be that if you
23:34were confronted by you know you
23:36basically any kind of cold with no heat you would freeze to death you know
23:46if it was hot you know with no air
23:47conditioning you might you know you
23:48might die from heat stroke um you know
23:50any kind of tornado flood mudslide I
23:52mean look you know there was there was
23:53there was literally what was it there
23:55was what was in Boston like 150 years
23:56ago or something there was like a molasses um uh basically mass tragedy um right
24:02where people like literally drown in in
24:03a Molasses Flood like like nature I mean
24:06of all you know nature is vicious right
24:09like nature really has it out for you um
24:12and if you're unprotected in a state of
24:13nature like it you know the old the old
24:15thing is you know life in a state of nature is what is it nasty brutish and short poor solitary nasty brutish and short Hobbes yeah exactly and
24:24so so look technology you know it's sort of the flip side of the question is
24:27technology is now buffering us against
24:29you know sources of mass death um that
24:32used to be you know far far far more
24:34common um and so this is not to you know
24:37this is not to obviously to sort of try
24:39to get away from the kind of the Doom
24:40State question of of like what happens
24:41if it literally all stops working but
24:42like you know how how do we build
24:44defenses against the really bad
24:46scenarios in which that would happen it
24:47actually turns out technology is our
24:48friend on that yeah yeah yeah and our
24:50our friend Elon is um you know
24:52protecting us against everything by uh
24:55making us an interplanetary species
24:57which also deals with the asteroid
24:59problem yeah yeah look I mean yeah I
25:01mean you know look the the dinosaurs had
25:03no plan B right like they got wiped out so John asks how can economic
25:11systems evolve to prevent human
25:13corruption from infiltrating Advanced
25:15Technologies that surpass our capacity
25:18for understanding yeah Ben what do you think or maybe you might maybe you might
25:24narrow on the question a little bit or
25:25make it you know I'm not sure if I
25:27totally understand what he's getting at
25:29here but I think that um I I think human
25:35uh you you know there there is this
25:38agency problem and I think that you you
25:40kind of alluded to it on the kind of
25:43nuclear fission issue um where you know
25:47as uh as these systems evolve how do you
25:51keep the human interest from fouling
25:54them uh and this kind of gets to you
25:57know the heart of a lot of the things that you spoke about in the manifesto for example um you know we had this this
26:05banking crisis and everybody's
26:08intention uh was to basically you know
26:13lessen our Reliance on giant banks that
26:15became as powerful as you know many
26:18governments in the world uh and of
26:20course um because of this uh Baptist and
26:24Bootlegger issue where the banks were
26:26the bootleggers and the
26:28government was the Baptist um you know
26:30they they they we basically got the
26:32opposite we got much bigger Banks much
26:34more powerful and that trend is kind of
26:36continuing forward uh so I think that
26:40you know when we look at how systems get corrupted and and you know and create real crises
26:50you know we we really have to Beware of
26:53the agency problem and we're going
26:54through um a few of those now I think
26:58both on AI and also on crypto where
27:02actually a lot of the answer to um you
27:06know some of these like huge powerful
27:09monopolies you know like what do you do
27:11about tech companies getting so big and
27:13becoming monopolies what do you do about
27:15Banks getting so big and coming becoming
27:17monopolies we actually have a magic
27:24power actually creates a real form of
27:27stakeholder capitalism where all of the
27:30participants in the economy get rewarded
27:32for building the economy um and the
27:36biggest uh adversaries of that whole
27:39movement end up being the kind of
27:42Baptists in the government you know conned by the uh bootleggers and the big Banks you know kind of over-highlighting small you
27:54know real but small dangers of the
27:56technology and try trying to stop the
27:58technology in its tracks and kind of
28:00lead to this horrible agency problem
28:03where you have these very very corrupt outcomes
28:07so I I guess my big thing would be you
28:11know like be really careful when
28:13somebody goes oh this is a problem with
28:15a new technology so therefore we have to
28:18stop the new technology like that I
28:21think is is the pattern that's repeated
28:23over and over again and hurt us so badly
28:26on energy and threat threatens to hurt
28:28us on intelligence and threatens to hurt us on decentralization yeah that's right yeah
28:34one of the ways to think about this for
28:36people who haven't run into this in
28:37their lives is there there's a
28:38fundamental difference between uh Pro
28:40business and pro market yeah um and they
28:43sound like they're the same thing and
28:44they're not um and so because Pro
28:46business kind of begs the question of
28:48which businesses and then sort of okay
28:49what's the structure of the market and
28:51are we talking about like yeah right
28:53exactly are we talking about lots of
28:54companies having to compete and earn
28:56their way in the world or are we talking
28:57about ultimately crony capitalism
28:58basically and basically this is the
28:59pattern of basically what happens this
29:01is kind of the point of the Baptists and Bootleggers idea of what happens which is basically a new technology something
29:05changes uh in the world you have big
29:07incumbent companies that very much are
29:09opposed to that change or want to
29:11control it um and and and want want to
29:13control and want to lock it down um and
29:15so what they do is they go to the
29:16government and they basically say oh um
29:18you need you need to regulate this and
29:20they they don't go in and say you need
29:21to regulate this for our benefit because
29:23they would get laughed at so what they
29:25say so what they say is you need to
29:29yeah that would be a dead giveaway so
29:30instead they say you know you need to
29:31regulate this to protect basically
29:34the you know the little people um right
29:36um and and but what they're shooting for
29:38100% of the time what they're shooting
29:40for is basically government uh sanctioned
29:42barriers to competition for themselves
29:44yeah um and and I would even argue like
29:47we don't even today in America like we
29:48don't here I'll do a little effective
29:50pessimism um you know we don't in
29:52America today in most Industries
29:54actually live in what you call a free
29:55market system we we live in more of a
29:57captured kind of big business cartel uh
29:59ecosystem and and you look just across
30:01sort of sector after sector after sector
30:03what you see are sort of two or three or
30:04four companies that have you know
30:05overwhelming market share uh in each
30:07sector and generate you know an
30:08overwhelming percentage of the profits
30:10and have this extremely incestuous
30:12relationship with the government by the
30:13way often to the point where they're
30:14actually writing their own regulations
30:16right like it's their lobbyists actually
30:18writing the regulations um you know it's
30:20the industry groups that they run well
30:22that's the classic example that's how the majority of regulations work exactly yeah
30:27yeah that's right and of course and then
30:28there's this this other form of
30:29corruption right which is the revolving
30:30door which is like okay if you're if
30:31you're a regulator it is you know
30:33extremely tempting just out a pure
30:34self-interest you know to kind of do
30:36what these big companies want because
30:37they'll hire you right um after after
30:39you're after you're done doing that and
30:40so this is sort of this corruption after
30:41the fact that happens with the revolving
30:43doors and so anyway so so so this
30:45basically that that that you could be
30:47Pro you could be Pro business and be
30:49completely in favor of all of that right
30:51because it is still businesses you know
30:52doing everything at the end of the day
30:54pro-market right says no none of what I just described is acceptable that's not how things should
30:59work at all the last thing any
31:00government should be doing is giving any
31:02particular company some special right or
31:03privilege uh some ability to block out
31:05new competition and in fact what you
31:06want is more competition you want more
31:09competition more markets more capitalism
31:11as actually the answer to that precisely
31:13to keep the big companies from basically
31:15just taking over and not having to
31:16compete anymore yeah yeah no no 100% um
31:19and actually that leads very well to the
31:21next question which is how exactly will
31:25markets prevent monopolies please
31:27elaborate on your point and I and and I
31:30think this is so key because look the
31:33nature of companies even
31:35monopolies is that the older and larger
31:39they get the less adaptive they become
31:42and we we look we've seen this with
31:45Google who invented you know most of the
31:47new AI technology and then was a little
31:48bit asleep at the wheel uh you know when uh OpenAI released GPT um they
31:57missed it just because like they're big
31:59they're complicated they're slow they're
32:00not as good anymore and so if the new AI
32:04companies are free to compete if open
32:07source AI is free to compete then all of
32:10the sudden that's the best kind of way
32:13to break that Monopoly um you know
32:16similarly you know there's a lot of you
32:18know chatter on like social networking
32:21monopolies and banking monopolies and
32:23these kinds of things and you know again
32:26we already have a technology that's a
32:29great Insurgent technology to defeat
32:31those monopolies and the thing that
32:34prevents that is as you say uh not a pro
32:38market but a pro business a pro very
32:40specific business businesses that have
32:42enough money to bribe corrupt Lobby the
32:45government into creating regulations
32:47that prevent the new company from
32:50competing right that's right and we're
32:52we're seeing a lot of that right now I
32:54find the heart to be terribly corrupt
32:58with that in mind can you elaborate
33:00on your statement love doesn't
33:07scale I I don't know if you meant
33:10because the heart is corrupt but perhaps
33:12you do well so I I didn't say the heart
33:15is corrupt so that that was that that's
33:17fortunately that's the questioner um I
33:19mean look I think what the questioner is
33:20alluding to there I would assume is sort
33:21of this you know perennial debate about
33:22human nature which is this man you know
33:24primarily good or bad um which which we
33:27could talk about but I I I think on the
33:28on the thing about love not scaling let
33:30me let me hit that one directly and then
33:31we can maybe go to the go to the bigger
33:32topic so so so basically you know so
33:35this so so the the formulation here is
33:37from a guy named David Friedman who's an economist Milton Friedman's son um and
33:42the thing that he said that really stuck
33:43with me is look there's only three ways
33:46to get people to do things for other
33:47people right fundamentally um you know
33:49one is love and you know you see that in
33:51you know you you you you you see that in
33:53you know people's families you know
33:54their in their friend you know friends
33:55friend networks like you know Ben like
33:57I'll do things for Ben without him
33:58having to pay me right um um and so you
34:00know that that that is an important
34:01force especially without me threatening to assault you you're coming to the other one yes exactly
34:07so without without Force so so there's
34:09love um uh you know with without love
34:12there's basically two other choices and
34:13they're basically money and force money
34:15right money is the carrot force is the
34:17stick you know look money is
34:18capitalism's answer you know force was
34:20communism's answer yeah um you know and
34:23this is the and again going back to you
34:25know just beat up beat up on the
34:26Communists a little bit more like this
34:27this was this was the big realization of
34:28all the Communist Societies in the in
34:30the 20th century and today which is like
34:32basically what what what what communism
34:34and its derivative socialism and so
34:35forth expect they expect love to scale
34:37yeah um and so they expect that you
34:39should work in whatever you know the
34:41Siberian salt mines or whatever it is um
34:44you know in order the far you know the
34:45fields or whatever um and you should do
34:47that out of love for your fellow man and
34:48you should do that for you know love of
34:49the society and so forth and and by the
34:51way look like you know the na the Nazis
34:53had their own you know the Nazis are the
34:54National Socialists like they had their own spin on this you're supposed to do
34:56things on behalf of you know the German
34:58people is same thing like you're
34:59supposed to love this macro unit um uh
35:03the the problem because the love doesn't
35:04scale the problem is people just don't
35:06naturally love people they don't know
35:08right and and and by the way that's not
35:09like because I don't think my view
35:11that's not because there's something
35:12morally wrong with them it's because
35:14they don't know the other people right
35:15and and you know it's like okay are they
35:17are they being loved back right and and
35:19and like are the other people going to
35:21be pulling their weight right is is you
35:23know is there going to be a free rider
35:24problem and of course you know the
35:26answer is at any level of scale of
35:28course there are free rider problems
35:29if if if if people aren't required to
35:31work um and so this is the irony at the heart of the whole thing a society
35:35built on the idea that love scales
35:36becomes an incredibly dark dystopian
35:38hostile you know and ultimately
35:40murderous murderous place because
35:41because love doesn't scale um you know
35:44force force is you know force is one way
35:46around that which is okay the way you
35:47get people to work even though they
35:48don't want to because they don't love
35:49people in some remote area that they'd
35:51be working on behalf of is put a gun to
35:52their head yep um and then and then you
35:55know the the the third option that falls
35:56straight out of that is okay how about
35:57how about money and then and then sort
35:58of money of course is a proxy money is a
36:00tool for you know sort of you know what
36:02they call sort of rational self-interest
36:03right or enlightened self-interest which
36:04was like okay um I'm gonna get paid I'm
36:06gonna get paid money to do this it's
36:07going to benefit these other people I'm
36:09not primarily doing it because it's
36:11going to benefit people I haven't met um
36:13and I don't you know look maybe I love
36:14everybody and maybe I would love to meet
36:16all my customers um and by the way look
36:18when you walk into a restaurant and
36:20you've never you know met the you know
36:21the the owner or the host before like
36:23they're thrilled to see you right and so
36:24you know do they literally love you or
36:26you're their new best friend no are they
36:28excited to see you yeah well they have a
36:29very positive sentiment towards you
36:30because they know you're they know
36:31you're going to pay right um anyway so
36:34that's the that that that is the
36:36solution you know that's the best
36:37solution that we've come up with yeah
36:39you know barring some profound change in
36:40human nature um uh in which people all
36:42of a sudden become far more generous
36:44than they've been historically that that
36:46seems like it's likely to be a stable
36:47State yeah you know it's funny that
36:48reminds me of a interesting conversation
36:51I had years ago with a friend of mine
36:52who grew up in the Soviet Union and uh
36:55you know I I I made some off-hand
36:57comment that you know like whatever
36:59Stalin was a crazy psycho or that kind
37:02of thing and he goes what are you
37:03talking about Stalin was very rational
37:05very smart go back read what he wrote
37:07look at his speeches he was super
37:09systematic thinker very intelligent I
37:12was like well like what went wrong why'd
37:14he kill 20 million of his own people and
37:16he just said when you take away the
37:18carrot all you have is stick and and
37:21that is so true and I think a lot you
37:25know that's a lot of the point love doesn't work at scale but in your family yes you can run a communist family you can even run a communist kibbutz that you know at that
37:34scale it can work for sure yeah that's
37:36right and Hayek made this point and Deirdre McCloskey has made this point also
37:39which is like look we we do live in a
37:41superposition of the two systems right
37:43because like you don't want to be you
37:44don't want to be the person who like
37:46you know runs your family like it's a
37:47capitalist Enterprise like you know you
37:48charge your kids for you know rents like
37:51you know for sleeping in their in their
37:52in their bunks when they're eight like so
37:55like you know we all we all grow up and
37:58and in our family and friend
37:59environments like we we're we're all
38:01communistic in in that context right but
38:03then there's this kind of you know
38:04schizophrenia to it which is like okay
38:05we got into the world and the world
38:06doesn't act like that and all of a
38:07sudden there's a different a different
38:09way you know way to way to exchange and
38:11a different way to relate um and so I I
38:13think you know ultimately you know the
38:14answer lies in sort of the superposition of the small and the
38:17large um and uh you know let's just say
38:20yeah the people who are too abstract on
38:22this I think get derailed yeah I I think
38:25that's the exact point that everyone
38:26gets confused on because who wouldn't want the country to run like their family that would be so much better but it's impossible next question um this is
38:38this is actually one that I think a lot
38:39of people have from Morton how do you
38:41see private Capital versus public
38:44research budgets when it comes to
38:46fundamental progress aside from AT&T's
38:49Monopoly that drove B Labs have yet to
38:51see systemic uh technological Pro
38:54progress from private investment
38:57I think I would disagree with the last
38:58part but I'll let you
38:59start yeah so I'd actually say there's
39:02actually three models I would say
39:03there's private there's public and then
39:04there's a third that I'll that I'll come
39:05to um so so look like a couple things I
39:09think are true so so so one is again
39:11I'll give the give the effective
39:12pessimists their due on this one um also
39:14which is look like yeah look
39:16there's straight capital-R Research you know that that is sort of the time-honored
39:21way of like discovering you know the
39:22principles of the universe and so forth
39:23like you know look you know the the the
39:26best of that has no idea if or when it
39:28will ever commercialize right because by
39:30definition it has no idea whether or not
39:32the experiments will even pan out right
39:33or the theories are even correct and so
39:35you know there is there is this kind of
39:37research that historically you know has
39:38been publicly funded um you know look in
39:41the US we have had these agencies like
39:42the National Science Foundation that
39:43have done a lot of that for a very long
39:45time and have you know very good you
39:46know you look at NSF you know whatever
39:48issues NSF has if you look at like the
39:52totality of spend in their existence and
39:54then the results you know you would say
39:55yeah that was that was absolutely an
39:56outstanding investment from a societal
39:58standpoint right right over a very long time and many investments yeah yeah that's right
40:02and and and yeah again to to to to to Be
40:05an Effective pessimist that was uh you
40:06know a lot of that research may not have
40:08been research that would get done by by
40:09private companies you know look on the
40:11other hand I think private companies
40:12carry their weight more than people
40:14think um and and look part of it is the
40:16the questioner alluded to Bell Labs but
40:18look the there there is this there is
40:19this thing that happens when these
40:21companies you know when the best of the
40:23big companies get big and powerful um
40:25you know do more of this they they they
40:27they open these Labs right um and so
40:29AT&T did it um you know you know IBM and
40:33Hewlett-Packard and Google and Microsoft
40:36and many many many others across many
40:37Industries have done this and I think
40:39you know quite honestly part of that is
40:40is is is PR yeah for them they like to
40:42show that off um I think part of that is
40:45um uh you know look they they start to
40:47think in terms of longer time Horizons
40:48and they do want ultimately you know new
40:50products at some point uh part of that
40:52is it's a recruiting and retention
40:53exercise to get the best and brightest
40:54technical Minds to stay there is to tell
40:56them that they can do research and
40:57publish their research and so you know
40:59there is that um and look you know Ben
41:01you mentioned like look the the the
41:02Breakthrough on AI just came out of
41:03Google right it came out of a private
41:04company yep um you know which is a which
41:07is a a great example of that so so yeah
41:10look there's some tension there they had to pull quite a few researchers out of Academia to pull
41:14that off they did and in fact that's
41:16been a long running source of tension
41:18actually between the tech companies and
41:19the universities for the last 15 years
41:20or so which is as these new tech
41:21companies have built these big research
41:23labs they have pulled a lot of the good
41:24faculty out of the universities um and
41:26by the way that process I think is
41:27accelerating in AI specifically because
41:30because the the capital needs to to
41:31actually do modern AI are so big that
41:33actually universities can't afford to do
41:34it right now um and so actually there
41:36yeah so you get this kind of issue of
41:37like are you are you sort of are you
41:38draining the universities of their
41:40remaining competent professors um and
41:43and and leaving only only the other kind
41:46um so so yeah so look you and you can
41:48debate we could have a long debate about
41:50the pros and cons of public versus
41:51private um there is a third model and
41:54this is the one that people often talk
41:55about um you know a sort of an
41:57aspiration which is this idea of these
41:59very large societal projects and in in
42:01in the US you know we talk in particular
42:02about the Apollo project and the
42:04Manhattan Project right and we say like
42:06you know this is kind of a frequent
42:07lament from kind of declinists right which is you know why can't we have more Apollo projects and Manhattan projects um
42:12and but but I think that's actually a
42:14third model and the third model is is
42:16military um and and and and militaristic
42:19right um and both Apollo and Manhattan
42:21basically were you know like Manhattan
42:24was just a straight up military project
42:25right to build a bomb run by the
42:26military um Apollo was you know had was
42:29a sort of a civilian thing NASA but it
42:31was in the context of a of a military
42:33arms race with the Soviet Union um and a
42:35technological arms race with the Soviet
42:36Union in a in a in a much more
42:38militarized Society than we have today
42:40yeah um and so and of course when the
42:42military gets involved two things happen
42:45one is they can do things at speed when
42:46they really want to because they just
42:47tell people what to do um and then also
42:50you know they they get access to a lot
42:51of money you know when when when things
42:52get when things get rough yeah um and
42:55and so I personally go back and forth a
42:57little bit on this is like do we want
42:58more Manhattans and Apollos on the one
42:59hand it's hard to say no on the other
43:00hand do I want a society that is that
43:02much more militarized you know if that's
43:05required I don't know yeah that could
43:08what would that mean for what is happening in the world if we get back to the point where we were as militarized as
43:12we were in 1941 right or or or even 1960
43:16yeah no it's actually one of the great
43:18things about the modern world is we have
43:20so many fewer Wars and they're at a smaller
43:23scale um and every war is
43:26horrible right okay yeah look and
43:29hopefully hopefully that continues right
43:31yes okay here's here's a uh a good
43:34simple question from tux how do you
43:37bridge the gap between being
43:39anti-statist and supporting America
43:42First policies or maybe American
43:45dynamism as we do yeah so I look I think
43:50um all these things are you know end up
43:54being a question of of power and how it
43:58works so in an extreme the extreme form of statism is
44:04communism where 100% of the power is in
44:08the public sector um and no power is in
44:10the private sector and you know that has
44:13the issues that we spoke of I think that
44:16um you know no public sector uh can
44:20quickly go to Anarchy and you know
44:23that's not at least not something that I'd want
44:26so I think look as a society you know we
44:30kind of have this Collective you know
44:32there's a certain amount of collectivism
44:34in shared values shared morals um you
44:38know what's okay what's not okay in
44:40America we have things like you know
44:42freedom of speech freedom to be who you
44:44are um and
44:48we're very anchored on
44:51these kinds of free society
44:53ideals and you need you need a state to
44:57do that um I you know at least I believe
45:00I think that there are probably some
45:01people that don't believe that but you
45:02need a state for that and preserving
45:05those values um and that
45:08way of life is extremely important and
45:12that is uh you know
45:16primarily the role
45:18of the government with all of us
45:19citizens participating and you know we
45:22try to participate in that through you
45:24know our our American dynamism efforts
45:26and that's important so um
45:30you know when I say I'm anti-statist I'm
45:34really saying I'm anti-communist and
45:36anti too much weight on the public
45:38sector at the expense of free markets
45:41and basically Freedom uh for the
45:44citizens yeah Ben I think you
45:47probably
45:49mean um you know
45:50anti-communist also um or by extension
45:53anti-authoritarian yeah for sure I
45:55think those go together because it's
45:57hard to be an authoritarian unless you
45:59have an amazing amount of power
46:01right like even if you wanted to be
46:05authoritarian in the US
46:07and we we certainly look at
46:08authoritarian Tendencies from time to
46:11time from various politicians and
46:13leaders um but it's really hard to do
46:16because you don't have enough power to
46:18pull it off uh and that's the greatest
46:20thing about our system I think is you
46:23can't gather enough power uh to
46:25become completely authoritarian in the
46:27US and and that's you know maybe the
46:30most fundamental thing we want to
46:31preserve Zack T can you further Define
46:35the law of accelerating returns and how
46:37it may play out in the
46:40future yeah so I think it's um so I
46:43think it's uh I never know whether to
46:45use this but it's such a great metaphor
46:47I'll go ahead and use it so Paul
46:49Romer uses this metaphor he says ideas have
46:51sex um let's say ideas reproduce they
46:57go through the same reproductive
46:59evolutionary process as living organisms
47:01that's different than having sex by the
47:02way yeah right now that we have birth
47:05control and all these
47:07things longer
47:09form is probably better um so um yeah
47:13so basically what happens
47:15the argument is basically
47:16ideas beget ideas um in quite the
47:20same way that people beget people um
47:23so basically it's like the more ideas
47:24that you have the more
47:26combinations of ideas you can have and
47:29those combinations of ideas are
47:30themselves ideas um right and then based
47:33on that you know then they they can
47:34further you know kind of replicate you
47:35know they can kind of crossbreed uh and
47:37have offspring and so
47:40you see this anytime of course
47:41you're building any kind of
47:42technological product is you know you're
47:43pulling in ideas from like all over the
47:45place you know you're getting inspired
47:47by you know all the different
47:48technologies that are below you in the
47:49stack you're getting inspired by all the
47:50other applications anybody has ever
47:52tried to build um you know you're
47:54getting inspired by all kinds of things
47:55you know look AI is inspired by
47:57you know the neural structure of the
47:58brain right like you know quite
48:00literally right deriving
48:01from biology sort of crossbred
48:03over into computer science and mathematics
48:05um and so basically you
48:07know at its base if this process is
48:09working the way that it should you
48:10should see sort of
48:12this accelerating explosion of variety
48:14and sort of speciation and reproduction
48:16and scaling of the number of
48:18ideas in the world um you know that sort
48:20of catalytically feeds on itself
48:21like a chain reaction um
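A minimal sketch of the combinatorial point being made here, in Python; the model and numbers are illustrative assumptions of mine, not anything the speakers lay out: if any two existing ideas can be combined into a candidate new idea, the pool of possible combinations grows roughly with the square of the number of ideas, which is the chain-reaction shape being described.

```python
# Toy illustration of the "ideas beget ideas" point above.
# The model is an assumption for illustration, not the speakers' data:
# if any two existing ideas can be combined into a candidate new idea,
# the number of possible combinations grows roughly quadratically.
from math import comb

for n_ideas in (10, 100, 1_000, 10_000):
    pairings = comb(n_ideas, 2)  # distinct two-idea combinations
    print(f"{n_ideas:>6} ideas -> {pairings:>12,} possible pairings")
```

Doubling the stock of ideas roughly quadruples the number of possible pairings, which is the accelerating, chain-reaction dynamic described above.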
48:24the way this was also the argument to
48:25connect this back to the idea of human
48:26population so this this was the argument
48:29for this guy Julian Simon who I admire a
48:30lot yeah um who who wrote this book
48:33called The Ultimate Resource and you
48:34know he did a
48:36lot of his work in the
48:3860s and 70s when there were all these
48:39pitched battles of course around the natural
48:41environment um and he was kind of
48:43the avowed enemy of this guy Paul Ehrlich
48:45who was the guy who
48:46predicted you know mass famine and death
48:48um you know from kind of increases in
48:50population um and so Julian Simon said
48:52no actually what you want is you
48:54actually want a lot more people in the
48:55world right humans people are
48:57the ultimate resource right not any kind
48:59of raw material but literally people and
49:01he said why do you want more people in
49:02the world because if you have more
49:03people you'll get more ideas yeah right
49:05um and so more people means you'll have
49:07more ideas more ideas in combination
49:09with ideas leads to more ideas those
49:10ideas lead to ways to make things better
49:12in the world uh among the things that
49:15those ideas make possible are ways to
49:16support more people on the planet right
49:19um and so he said like that
49:20quite literally the answer to natural
49:22resource consumption right for example
49:24or natural resource
49:25limitations or environmental
49:28considerations or whatever the answer is
49:29not the Paul Ehrlich approach of
49:30depopulate right reduce the human
49:32population therefore reduce the number
49:33of ideas uh you know both the number of
49:35people and the number of ideas the answer is
49:37to actually put the pedal forward more
49:38people more ideas more solutions um and
49:41you know yes clearly I agree
49:44with him on that argument yeah no
49:47amazing but by the way Julian Simon is
49:51probably uh one of the most underrated
49:54economists and philosophers of
49:57the last 100 years so if you're not
49:58familiar with him that's a great
50:00one to read um I'll
50:03just add on that real quick a great point
50:04that he made that really blows people's
50:05minds when they read it for the first
50:06time is he made the argument that you
50:08never run out of any natural
50:09resource yeah um and he bet on that
50:13right like uh he did so he had a
50:15famous he had a 10-year bet with Paul
50:17Ehrlich the Population Bomb guy um and it
50:19was a bet on the price of
50:21a basket of natural resource
50:22commodities 10 years in the future and
50:24Ehrlich was of course 100% sure that's
50:26when the first peak oil thing was you know yeah the
50:30whole peak oil thing exactly it's that
50:31same thing and so he let Ehrlich
50:33actually define I believe the basket
50:34of the commodities so he kind of
50:35loaded it in his direction and
50:37the bet was will the price of this
50:38basket be greater or less than it is
50:39today in 10 years and you know everybody
50:42who's kind of in the sort of
50:43conventional thinking on this you know
50:45was like well obviously the prices of
50:46all this stuff are going to go up
50:47because
50:48there's more people there's more
50:49consumption and that's what everybody wrote
50:51in the press for many many years
50:53like nobody was saying what Julian
50:55Simon said right and then the kicker
50:57is Julian Simon won the bet uh the
50:59price of that basket was lower in 10
51:01years and his
51:03point was we never run out of natural
51:04resources his point was with markets um
51:08the way that we know that a natural
51:09resource is becoming scarce
51:11is its price starts to rise as its price
51:13starts to rise self-interest says we
51:15should figure out ways to not need as
51:16much of that natural resource
51:18right and so as the price
51:21of oil Rises then all of a sudden we
51:23have an economic incentive to develop
51:24alternative energy ranging
51:26by the way from solar to wind through to
51:27things like nuclear and then you know
51:29maybe in the future even nuclear
51:30fusion um and so it's actually
51:33markets working at their best
51:34as the price of something rises
51:36your
51:38self-interested incentive to come up
51:39with the alternative rises um and
51:42alternatives that were not
51:43previously price effective actually
51:45become price effective right um so this
51:47was you know this was the rise of
51:48fracking right fracking worked
51:51because oil and gas started to get
51:52expensive enough where all of a sudden
51:54the additional cost of fracking was
51:55actually worthwhile right um and so
51:58then fracking brought the price right back
51:59down um and fracking was a
52:01classic example of an idea it was a
52:02technological innovation born of human
52:05creativity um and so basically what
52:07Julian Simon says is that that is how
52:09the system works that's the homeostasis
52:10in the system um and this is not a
52:13dystopian scenario in which we are
52:14doomed to run out of everything and
52:15everybody's going to freeze and die
52:17you know it's not necessarily
52:19utopian like natural
52:20resources still cost money um but it's a
52:23fundamentally positive view which is it
52:24is human ingenuity that is going to cause
52:26us to not have the problems that the doomers
52:28say we're going to have and
52:29again you know 300 years of this and you
52:31know so far so good yeah
52:33it's an abundance view of the world as
52:34opposed to a scarcity view of the world
52:36turns out the abundance view is right
52:38which is good news for all of us yep um
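A rough sketch of the price-signal mechanism described above, with made-up costs; this is a toy model of the substitution argument, not Julian Simon's actual analysis: once scarcity pushes a resource's price above the cost of an alternative, the market switches to the alternative, which is what keeps us from ever truly running out.

```python
# Toy sketch of the price-signal / substitution argument above.
# All per-unit costs here are hypothetical illustrative numbers.
def cheapest_source(oil_price: float, alternatives: dict[str, float]) -> str:
    """Return the lowest-cost way to meet demand at a given oil price."""
    options = {"oil": oil_price, **alternatives}
    return min(options, key=options.get)

alternatives = {"fracking": 70.0, "solar": 90.0}  # hypothetical costs

for oil_price in (40.0, 80.0, 120.0):
    print(f"oil at ${oil_price:.0f} -> use {cheapest_source(oil_price, alternatives)}")
# At $40, conventional oil is cheapest; at $80 and $120 the $70 "fracking"
# option wins, i.e. rising prices make the alternative worthwhile, which in
# turn pulls the resource price back down.
```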
52:41so Matthew has a question which uh I
52:44think you have an unusual answer to
52:46which is is the dream of fusion stopping
52:48us from enjoying the insane gains we can
52:52get from fission yeah yes uh um so well
52:56so primarily what's preventing
52:59us from enjoying the insane gains from
53:00fission by the way they are insane like
53:01the level of energy that we could be
53:03producing from modern nuclear fission
53:05reactors and the
53:07safety of it is like we could just be doing it right
53:11now yeah yeah yeah France is doing
53:14it by the way I mean
53:16the European politics are now so
53:17entertaining on this because France is
53:18so pro-nuclear and the rest of Europe is
53:20so anti-nuclear France just had to get a
53:22waiver from Germany um to continue to
53:24run their reactors um which
53:27they finally just got the best thing going in
53:29France is their fission
53:31reactors yeah and the German greens
53:34you know have been determined for 50
53:35years to turn those reactors off so um
53:38so um yeah so look like the
53:41big reason why we don't have widespread
53:42nuclear fission power today is because of
53:44the precautionary principle because of
53:45basically the fear of disaster
53:48um and you know which basically makes
53:49people emotional and then it turns
53:51out the emotional decision is a
53:52very very damaging bad
53:54decision because the alternatives turn
53:56out to be things like gas and
53:58coal um and so um I mean that's
54:01overwhelmingly it um now having said
54:03that back to the question yeah
54:04so the new thing that you can say if
54:06you're trying to fight nuclear fission
54:07is oh we don't need it because we have
54:08Fusion right around the corner um and by
54:10the way look I'll start by saying I
54:12hope that's right I hope fusion really
54:13is right around the corner um you know
54:15it's been right around the corner for a
54:16while um I hope it's right around the
54:18corner that would be great it turns out to be
54:20harder than fission by quite a bit
54:22it's quite difficult and then look I
54:25think what's going to happen is look I
54:26think we'll get Fusion to work at some
54:28point there's very smart people working
54:29on it um uh I think they'll get it there
54:32um uh I think the same forces and ideas
54:35and people that have prevented the
54:37deployment of nuclear fission will
54:38immediately prevent the deployment of
54:39fusion um and so
54:42all of a sudden it'll be the same kind of bad
54:45narrative the corrupt
54:49pessimism it'll be the same arguments it's
54:51this incredibly dangerous thing and who
54:53knows if it goes wrong and like what if
54:54this and that and the other and like you
54:56know we can't take these risks
54:57and can they prove it's going to be you
54:58know safe forever infinitely it's
55:01going to be
55:02the same arguments um and they're going
55:04to you know these nuclear fusion
55:05companies are going to start out being
55:06very optimistic and they're going to hit
55:07this wall of sort of regulatory and
55:09emotional and political and
55:11ideological resistance and you know I
55:14hope they punch through it but you know
55:16look I would say this like
55:18Richard Nixon did two things in the
55:20early 70s uh around
55:22this one is he declared something called
55:23Project Independence where he said
55:24the US would build a thousand
55:26new nuclear fission power plants by 1980
55:28become completely energy self-sufficient be
55:30able to withdraw completely from the
55:31Middle East be able to you know be zero
55:33emission um and then he created the new
55:35Nuclear Regulatory Commission which then
55:40yeah how many new
55:43nuclear reactors have they approved zero
55:46new plants in 40 years wow um
55:52so not good well and again
55:55here you go back to you know sort of
55:57incentives um okay now imagine
56:00you know to give the devil his due like
56:02imagine that you're the newly
56:04appointed you know regulator you're the
56:05newly appointed chairman you Ben Horowitz
56:07you're the newly appointed chairman of
56:08the Nuclear Regulatory Commission in
56:101973 right like what are your incentives
56:14right like how much glory are you gonna
56:16get right if we build
56:18some new reactors versus how horrible is
56:21your life going to be if there's another
56:22nuclear accident yeah right right right
56:24another kind of incentive agency problem
56:26yeah right and then by the way you've
56:28got the existing energy companies in
56:30there doing their thing right saying oh
56:32no you know don't worry about it oil and
56:33gas and by the way we're gonna do you
56:35know clean
56:38oil we're gonna you know
56:40like Volkswagen we're
56:42gonna have clean diesel right like you
56:43know just let us you know
56:44continue working on the clean coal and
56:46the clean diesel and then you
56:47know again by the way fusion's right
56:48around the corner and that'll be better
56:50and then you know you're like wow like
56:51you know these companies like they're so
56:52big and powerful and successful and by
56:53the way they seem like they're kind
56:54of hinting they're going to give me a job when
56:56I'm done here yeah and you know
56:59you're back to square one
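The regulator's incentive asymmetry can be put in rough expected-value terms; the probabilities and payoffs below are made-up illustrative numbers, not anything from the conversation: a tiny personal upside from approvals and a huge personal downside from any accident push the rational chairman toward approving nothing.

```python
# Toy expected-value sketch of the regulator's *personal* incentives.
# All probabilities and payoffs are assumed, for illustration only.
def regulator_payoff(approve: bool,
                     p_accident: float = 0.01,
                     glory_if_fine: float = 1.0,
                     blame_if_accident: float = -1000.0) -> float:
    """Expected career payoff to the regulator, not the payoff to society."""
    if not approve:
        return 0.0  # nothing gets built, nobody blames you
    return (1 - p_accident) * glory_if_fine + p_accident * blame_if_accident

print(regulator_payoff(approve=True))   # ~ -9.0: personally negative
print(regulator_payoff(approve=False))  #    0.0: the "safe" career move
```

Under these assumed numbers the personally optimal move is always to approve nothing, even when approval would be a large net positive for society, which is the agency problem being described.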
57:00this is a fun one Luke Croft you say
57:02that humans want to be productive yet so
57:04many hate their jobs and work only
57:06because they have to when it comes to
57:09abundance due to Tech Innovation should
57:11we allow people the option of not
57:15working so there's these two views and
57:18look like again no utopianism here right
57:21like you know look not all jobs are fun
57:22like and look a lot of people work jobs
57:25that you know they really don't like and
57:26they're doing it because they have to
57:27and you know they're doing it because
57:28they're trying to support a family or
57:29support themselves and so you know I
57:31mean certainly that's you know
57:32nobody promised everybody a job
57:34everybody loves um so so look you know
57:37again you know there's sort of a point to
57:39that um I guess what I would say
57:41in my analysis is if you
57:43close your eyes and imagine somebody who
57:46doesn't have to work right um and you
57:49know look and I'm not talking about like
57:52you know stay-at-home mothers you know
57:52I'm talking about like somebody who like
57:54you know in our system today would be
57:56working to be able to make money support
57:58themselves and then there's a future
57:59ordering of society in which they can
58:00elect not to work and they will have the
58:02you know same or similar level of
58:03material comfort that they have
58:05today and they can just kick it um you
58:08kind of close your eyes on that and
58:10imagine that person and there's kind of
58:11two you know possible versions of that
58:13person you can imagine one is the person
58:14who is now you know by the way as Marx
58:17said you know liberated um to you know
58:19what was Marx's whole thing we're
58:21all going to be fishermen in the morning
58:22you know literary critics in the
58:23afternoon poets you know over dinner and
58:25you know musicians at night right like
58:28we're going to be able to like you
58:29know self-actualize into you know all of
58:31these things that don't involve money and
58:33we're going to be you know we're gonna
58:34all be creating this and that and we're
58:35going to be you know doing all these
58:36amazing things um because we have all
58:39this time free um and look there are
58:41some people for whom that's the case
58:43right like you know there are
58:44certain people for whom I'm sure
58:45that would be the case but look there's
58:47the other scenario and I would just call
58:48that one the Cheetos and meth scenario
58:50right yeah which is like yeah and
58:53PlayStation and PlayStation right and
58:56like and I like Netflix I'm a fan of
58:58Netflix but like maybe not 12 hours a
58:59day yeah um right and like sit on the
59:02couch and get baked um and you know
59:04just like there goes all your
59:06motivation right straight out the window
59:08um and this is where you know
59:10I used in
59:12the essay the fairly provocative
59:14term farm animal like that's a farm
59:16animal existence yeah right that's a
59:18that's the existence of a cow yeah right
59:21and like cows are great cows are great
59:24um but like I don't think we should be
59:26cows like I don't think we should be
59:27farm animals I don't think that is an
59:29advance uh you know you're back to
59:31the WALL-E scenario we were talking about
59:32at that point and I just think we
59:34need to be realistic about that uh
59:36and again this is
59:38very much one of these luxury
59:39belief things yeah um where kind of the
59:41class of people who imagine themselves
59:44as being the you know poets in the
59:45morning fishermen in the afternoon
59:47musicians at night you know think that
59:49that's a general thing and maybe that's true
59:50for them by the way maybe if you know
59:51you went to one of these you know ivy
59:52league universities and you hate your job
59:55you know you'd be much happier and
59:56you'd be writing poetry if you didn't
59:58have to do your day job um you know
01:00:00maybe that's true for them right but
01:00:02like the consequences of that
01:00:04misjudgment if that's wrong on you know
01:00:06the other eight billion people I think
01:00:08are very profound and possibly extremely
01:00:11negative yeah and actually we you know
01:00:13we've run this experiment in the US uh
01:00:15to some extent and I you know I've been
01:00:17studying this more lately but you know
01:00:20perhaps you know there were a lot of uh
01:00:22kind of bad things that happened with
01:00:25uh kind of pilgrims and the Native
01:00:27Americans but I think the worst thing
01:00:30may have been the reservation system at
01:00:31least the longest lasting and the
01:00:33reservation system is essentially UBI
01:00:36it's $65,000 a year I believe per
01:00:39resident and you know anybody who spent
01:00:41time on a reservation knows like that
01:00:44that just hasn't worked out well like
01:00:45removing purpose from people's lives uh
01:00:50you know generally you know there are
01:00:51people who can deal with it but
01:00:53they're certainly the
01:00:55minority um and you know I mean
01:00:58at least in my view that's not been
01:01:00great for the Native Americans and uh
01:01:03to kind of scale that to society
01:01:05as many are proposing these days um it's
01:01:08probably not the best thing the other
01:01:09thing I'll add is jobs have actually
01:01:11gotten much better over time so not all
01:01:14jobs are great but like you
01:01:16know for many many uh hundreds and
01:01:20you know thousands of years every job
01:01:22was farming and like not everyone is suited
01:01:25to be a farmer uh I know you weren't
01:01:28Mark because there's a lot of farming in
01:01:30your hometown um and then you know the
01:01:34the thing that came after that first was
01:01:36these assembly line jobs which you know
01:01:38it's always kind of amusing to see
01:01:40politicians say oh we need more good
01:01:42jobs like these manufacturing jobs like
01:01:45doing this all day like a robot is not
01:01:49like the greatest job in the world you
01:01:50know it may pay well but it's not the
01:01:52most fulfilling and you know so many of
01:01:54the the people on those lines are on
01:01:56drugs and uh in fact Henry Ford you know
01:01:59famously doubled the minimum wage but
01:02:02the reason he doubled the minimum wage
01:02:03was he had so much attrition because
01:02:06people hated those manufacturing jobs so
01:02:08much and so I think that you know the
01:02:11jobs that we're producing now have been
01:02:13increasingly interesting and you know
01:02:15they're not all great um they're not all
01:02:17great for everybody but one thing is for
01:02:19sure is we have a much broader variety
01:02:21of things that people can do so you have
01:02:23many many choices uh and hopefully
01:02:26hopefully people can find the job that's
01:02:29right for them oh this is interesting coin
01:02:32fluence X so that's a pretty neat name
01:02:37how do you approach the contrast between
01:02:39rigorous technological and mathematical
01:02:41education seen in countries like China
01:02:44which emphasizes relentless advancement
01:02:47with evolving educational paradigms in
01:02:49the West which prioritize
01:02:51intersectional perspectives and
01:02:55values yeah Ben why don't you uh start
01:02:58that I agree that's an interesting
01:03:00one yeah I mean I think that
01:03:07um It's Tricky you know like and then
01:03:09you know it's been a long time since
01:03:10I've been in school and my
01:03:13children are all adults uh so I'm a
01:03:15little far away from you know the actual
01:03:19thing that's happening in schools now um
01:03:25boy I think that um there's kind of
01:03:30education that you can
01:03:31use uh that's you know where you can go
01:03:35make something or build something or
01:03:37figure out how large systems work or um
01:03:41you know perform a function that's in demand in
01:03:46society and then there
01:03:49are you know a whole class of other
01:03:51things like you can teach people about
01:03:53anything uh you know now they even have
01:03:56you know lots of like rap education in
01:03:58college and this and that and the other
01:04:02and you know it's interesting because
01:04:05the great musicians I
01:04:07know and I know many of them never took
01:04:10any kind of class like that most of them
01:04:13didn't take any class in music by the
01:04:16way you know it's funny like uh my
01:04:18friend Kanye uh taught himself music you
01:04:21know he was an art student he taught
01:04:22himself music uh and I think that you
01:04:26know the great
01:04:29creatives you know I don't know if
01:04:32studying you know to become a virtuoso
01:04:35in those things is is necessarily even
01:04:38the best path and then you have these
01:04:40other things which are these social
01:04:42theories um unproven social theories
01:04:45that you know a lot of people are
01:04:46teaching these days which I I think
01:04:49those are fine Hobbies um and you know
01:04:52you and I pursue them as Hobbies
01:04:55uh studying these social theories and
01:04:57and you know how to you know maybe you
01:05:00know new ideas on how to organize
01:05:02Society or whatever or how that all
01:05:04works I I don't think you know it makes
01:05:07sense to charge people $300,000 or
01:05:12whatever College costs these days to
01:05:14teach them a hobby like that seems
01:05:16really absurd to me um but uh you know
01:05:21like I I know people have different
01:05:23views on this but I think that uh you
01:05:26know the the purpose of you know the
01:05:29education that we funnel young people
01:05:31into ought to be uh to enable them to
01:05:33make significant contributions back to
01:05:36society and that probably I don't know
01:05:39if it's as narrow as the Chinese education
01:05:42but um it's probably not as broad as
01:05:45some of the things that we're
01:05:48doing yeah and then I might just pop the
01:05:50question up one level which is you
01:05:51know look education itself is an
01:05:53industry it's a you know it's a field
01:05:55and an industry a business um and um you
01:05:59know look both the nature of the
01:06:00American education system and for that
01:06:02matter a system like in China's um you
01:06:04know they're they're primarily
01:06:05centralized and government
01:06:06control um right and so you know the
01:06:09American system is is you know K through
01:06:1012 is primarily a Government monopoly
01:06:12with you know occasional offshoots um
01:06:15and the offshoots are very
01:06:17restricted yeah they're very restricted and
01:06:19very small in number um and then you
01:06:22know look the college system in the US
01:06:24um and you know it's quite literally a
01:06:26self-regulated cartel um by virtue of
01:06:28the fact that it's on the
01:06:30sort of federal student loan drug um to
01:06:32work financially and and um and then the
01:06:34accreditation agencies that determine
01:06:36which colleges get federal student
01:06:37lending are run by the colleges
01:06:38themselves yeah so it's a it's a
01:06:41self-reinforcing cartel which is why you
01:06:42don't see a lot of new universities ever um
01:06:46you know it's why the major universities
01:06:47in the US are older than the country
01:06:48right um and so um and so you know like
01:06:54same thing you know obviously in China
01:06:55the system is you know
01:06:56government controlled inherently you
01:06:57know completely government controlled
01:06:58and so and that's you know this is
01:06:59commonly true around the world and so
01:07:01you know we've sort of backed into you
01:07:03know we have this idea of Education as
01:07:04like an abstract kind of kind of good
01:07:05and then we have the specific
01:07:08implementation of these you
01:07:09know highly centralized authoritarian
01:07:10non market-based approaches to it and
01:07:12then you know you can argue the pros and
01:07:14cons and the results and I thought
01:07:16the question set up this you know very
01:07:17interesting aspect of it which is like
01:07:19okay you know when does it become kind
01:07:20of too much rote memorization versus
01:07:22when does it become kind of too much you
01:07:24know kind of let's say
01:07:26wild theorizing
01:07:29um you know with maybe a Sweet Spot
01:07:31somewhere in the middle um but you know
01:07:33look education as a system can't
01:07:35really respond and adapt to changing
01:07:38needs of people because it's primarily
01:07:40not market based um and so we kind of
01:07:43just get the system we get we send our
01:07:44kids to it we don't think we really have
01:07:46a choice for the most part um and um and
01:07:49you know I would just say like the big
01:07:50thing is it's just in all these
01:07:52societies it's just stuck
01:07:54um and and look and then there are all
01:07:56these signs that it's becoming
01:07:57dysfunctional um you know
01:07:59across many societies we could spend
01:08:01hours on that um and so I I would just
01:08:03at least you know want to want to kind
01:08:05of put a placeholder and say I think we
01:08:06should also squint and kind of Wonder
01:08:08like is there a different kind of system
01:08:09that could get built um that would be
01:08:11much more Market driven uh and much more
01:08:13technologically sophisticated um that
01:08:14would be much better suited to
01:08:16modern needs and would be much more
01:08:18subject to you know actual competitive
01:08:19pressure and the need to be
01:08:22good um and you know we you know look
01:08:25there are some very good education
01:08:26startups working on this and there are
01:08:27some very good charter schools and there
01:08:28are some very good new startup
01:08:29universities and so there you know there
01:08:30are some people taking swings at this
01:08:32but um you know at a macro level like
01:08:34we're not there yet and I think maybe we
01:08:35should think over time about trying to
01:08:36get there yeah and
01:08:39that's interesting because we
01:08:41are in the middle of this very
01:08:42interesting crisis um the student loan
01:08:45crisis where uh you know President
01:08:49Biden has kind of proposed and I don't
01:08:52know how much of it has actually gone
01:08:54through all the legal objections and so
01:08:56forth to kind of forgive some portion of
01:08:59the student loans but the interesting
01:09:02thing is that that's kind of the ultimate
01:09:06case of treating the symptom and not the cause
01:09:08because why can't any of these students
01:09:11pay back their loans and the obvious
01:09:13answer is college isn't worth the money
01:09:16you pay for it and then it doesn't
01:09:18translate into a job that enables you to
01:09:20pay back what you borrowed to do it um
01:09:25you know if you think about you know
01:09:27that's a much bigger problem for young
01:09:29people and an ongoing problem that you
01:09:31know you either kind of
01:09:33subsidize college in its entirety um or
01:09:38you know you have to fix that problem
01:09:39and I think that uh you know given what
01:09:43colleges are willing to do with their
01:09:45tuition like grow it at double the rate of
01:09:48inflation um and hire more
01:09:50administrators than they have students
01:09:52and all these kinds of things if they
01:09:53have free money uh you know
01:09:54another incentive problem
01:09:57it's a pretty dangerous idea
01:09:59to make college free in that way um so
01:10:02we do need a better mechanism
01:10:05sure maybe now's the time um
01:10:11okay Sam Arnold to what extent do you
01:10:14believe we really control what
01:10:16technology does after its release can we
01:10:19exert any meaningful control after
01:10:25yeah I mean a couple of things so one is
01:10:26obviously we do we do control I mean
01:10:27there are there are pretty strict
01:10:29controls in a lot of areas right um as
01:10:31we just discussed in the nuclear
01:10:32case so you know one is it's not quite
01:10:34that there there aren't any um you know
01:10:36I think the maybe the underlying
01:10:38question here would be can we predict
01:10:40what the consequences are going to be um
01:10:43and you know and by the way this is a
01:10:45very hot topic right now because you
01:10:46have this thing where there are you know
01:10:47a bunch of AI you know kind of
01:10:49practitioners who are making these very
01:10:51in my view very extreme statements about
01:10:52what AI is going to do that it's going to be
01:10:53horrible um and there's this you know
01:10:55very strong temptation on the part of
01:10:57people to make what seems like a very
01:10:58logical you know kind of assumption
01:11:00which is that the people who invented
01:11:01the technology are in the best
01:11:03position to be able to predict what
01:11:04happens with it afterwards and then
01:11:06are you know presumably also going
01:11:08to be in the best position to propose
01:11:09whatever regulations or controls are
01:11:11required to prevent those things
01:11:12from happening yeah um I have not in my
01:11:16life and I have not in my reading of
01:11:18history seen a lot of examples where the
01:11:20inventors of technology are very good at
01:11:24that you know the example I like to cite
01:11:26is uh Thomas Edison uh invented the
01:11:28phonograph and he was completely
01:11:30convinced that the use case for the
01:11:31photograph uh was going to be to listen
01:11:33to religious sermons he was a very Pious
01:11:35man um uh and he took religion very
01:11:38seriously um and he just assumed that if
01:11:40you owned a photograph the point of it
01:11:42would be you'd get home at the end of a
01:11:43long day at the office or in the factory
01:11:45you'd kick off your you know you kick
01:11:46off your shoes but in your slippers and
01:11:47you'd have your dog by your side and
01:11:48your kids gathered around and you would
01:11:50put on religious sermons um go church
01:11:53and bother with all those people which
01:11:55for Thomas Edison was a big
01:11:57thing exactly and by the way and by the
01:11:59way right and then and then you can do
01:12:01this every night right you didn't have
01:12:02to wait till Sunday right this is great
01:12:03you can do this all week right and and
01:12:04you know I don't know look look a few
01:12:05people do that um most people don't most
01:12:07people listen to music um and and you
01:12:10know Ben you'll recall you know the you
01:12:12know the what was the first form of
01:12:13music that went you know kind of was
01:12:14kind of the big hit hit hit genre music
01:12:18was Jazz oh Jazz yeah yeah yeah sure
01:12:21sure sure and you'll and you'll recall
01:12:23what people said about jazz at the time
01:12:25well they said many things but it it it
01:12:27wasn't real music it was the it was the
01:12:30devil's music it was very bad it was you
01:12:32know channeling of all kinds of Unholy
01:12:33impulses it was going to cause young
01:12:34people to fornicate like they just had
01:12:36like all all basically all the same
01:12:38things that people say about rap music I
01:12:39was thinking of something that uh it
01:12:41wasn't Quincy Jones it was um I I can't
01:12:44say it because but but there was some
01:12:46racism involved too I'll just say oh of
01:12:49course of course it was absolutely it
01:12:51was it was going to be yeah all of a
01:12:52sudden you it's a channel for these
01:12:53black musicians to show up in in the
01:12:54homes of white people like no no
01:12:55question right and so so so anyway like
01:12:57it it turned out that people had like
01:12:59all kinds of issues with the technology
01:13:00it just turns out they weren't remotely
01:13:01the issues that Thomas Edison predicted
01:13:03that they would have right and and and
01:13:05if you look at this historically you're
01:13:06kind of like well of course he didn't
01:13:07know because like he's just like yeah he
01:13:09he's a he's a he's a techie he just
01:13:11invented the technology like he's not
01:13:14you know he doesn't have like a crystal
01:13:15ball he doesn't have some special
01:13:16foresight and in fact he's a particular
01:13:18kind of person he's the kind of person
01:13:19who spends all his time in a lab yeah
01:13:21right right he's unusual
01:13:24he's unusual right and he let's maybe
01:13:27speculate you know he might be
01:13:28psychologically a little bit different
01:13:29than most people right and he and he and
01:13:31he you know has a gets draw satisfaction
01:13:33from different things and he doesn't
01:13:34spend a lot of time right with like
01:13:36regular people yeah um and you know he's
01:13:38certainly not an expert on like politics
01:13:40and society and you know psychology and
01:13:42sociology and all these all these all
01:13:43these other all these other things and
01:13:45so I I so so anyway I think where I
01:13:47would start back to the original
01:13:48question is just like wow like I
01:13:51actually don't think it's that easy to
01:13:52forecast these things and then
01:13:53specifically I don't think it's it's I
01:13:55don't think specifically it's any easier
01:13:57for the people who invent the technology
01:13:58to forecast these things and I think
01:13:59they carry uh a lot of unjustified
01:14:01credibility this also you know came up
01:14:02in the appenheimer movie you know the
01:14:04same thing it's like these physicists
01:14:05you know all of a sudden right are
01:14:07convinced that they're the ones who are
01:14:07going to be working on like the game
01:14:09theory and the philosophy and the
01:14:10morality of the deployment of nuclear B
01:14:13were well they tried they tried but you
01:14:15know that was the the was a lot of that
01:14:17yeah it was but I mean it was this was
01:14:20the scene with you know Truman which is
01:14:21something that actually happened right
01:14:22which is like Oppenheimer shows up to
01:14:24you know kind of ring his hands about
01:14:25the morality of the atomic bomb and
01:14:26Truman is just like get get this guy out
01:14:28of here like that's you know I'm the
01:14:30elected president of the United States
01:14:32like you know let's listen to me yeah um
01:14:35right and by the way he you know Truman
01:14:36and again I'm not that Truman made all
01:14:37the right decisions or whatever but
01:14:38Truman was the duly elected president of
01:14:39the United States like he is the guy who
01:14:41should have been making that decision
01:14:42not the guy who invented the thing yeah
01:14:44um and so I I just think like I I guess
01:14:46my big appeal here is just like humility
01:14:49like I think we as technologists need to
01:14:50be very careful um we're not but we
01:14:52should be very careful about kind of
01:14:54crossing the line and deciding that
01:14:56we're going to do societal engineering
01:14:57kind of in our spare time um and then
01:14:59you know people who hear technologists
01:15:01kind of doing these things should be
01:15:02very skeptical that the tech that the
01:15:04technologists deserve any kind of
01:15:05unwarranted credibility on this right um
01:15:08and then look there are going to be
01:15:10these big questions uh what history
01:15:12basically says is we figure them out as
01:15:13we go yeah um and I I I think the
01:15:16alternative to that is to just not do
01:15:17new things um and so I you know I know
01:15:19where I come out on that yeah yeah I'll
01:15:21give a plug for uh great book that you
01:15:23recommended when reason goes on holiday
01:15:26which is sort of the story of how the
01:15:29great physicists and uh inventors um did
01:15:33on politics and policy and Game Theory
01:15:36you take super high IQ people and you
01:15:39put them in a research or university
01:15:41setting um they go crazy like on like
01:15:45they very frequently go very nutty on
01:15:48anything involving politics and Society
01:15:49yeah um and you know this was the
01:15:52American physics community in the 1920s
01:15:54and 1930s that basically like most of
01:15:56them went like hardcore communist like
01:15:57Einstein Einstein was a stalinist like
01:16:00we don't you know we don't talk about
01:16:01these things anymore but he was yeah
01:16:03right um you know these people end up
01:16:05like very radicalized and and you know
01:16:08this happens with other groups of people
01:16:09right you know maybe maybe this you know
01:16:11at certain points this happened in in
01:16:12other areas of of of our life but um in
01:16:14our public life but you know it
01:16:16definitely the intellectuals go straight
01:16:17for it um Thomas Sol has talked about
01:16:20this a lot he's like look he's you know
01:16:21basically the problem with people who
01:16:22are hyperverbal and like ideas is they
01:16:24can get kind of arbitrarily unhinged
01:16:25yeah um and they can you know kind of
01:16:27talk themselves into crazier and crazier
01:16:29things um and they you know and their
01:16:30level of disconnection from the from
01:16:32from The Real World means that they you
01:16:33know they no longer have any Governors
01:16:34and how crazy their beliefs can get and
01:16:36so I I think we need to be like I think
01:16:37that's what's happening in AI right now
01:16:39and I think we need to be very cautious
01:16:40about who we listen to yeah yeah trust
01:16:42in their own judgment is is
01:16:44profound um okay last question this is
01:16:48actually one that uh I would have asked
01:16:50it's from bird and it is before
01:16:53publishing what did you consider the
01:16:55most controversial point now that it is