00:00 testimony that's truthful and honest and
00:02 complete let me ask you this Joe Biden
00:04 last year said that Xi Jinping was a
00:05 dictator do you agree with Joe Biden is
00:07 Xi Jinping a dictator Senator I'm not going
00:09 to comment on any world leaders what why
00:12 won't you answer these very simple
00:13 questions Senator it's not appropriate
00:15 for me as a businessman to comment on
00:16 world leaders are you scared that
00:17 you'll lose your job if you say anything
00:18 negative about the Chinese
00:20 Communist party I disagree with that you
00:21 will find content that is critical of
00:23 China the next time you go on are you
00:25 scared that you'll be arrested and
00:26 disappear the next time you go to
00:27 Mainland China Senator I you will find
00:29 content that is critical of China and any
00:31 other country freely on TikTok okay
00:33 okay let's let's turn to what TikTok a
00:36 tool of the Chinese Communist party is
00:38 doing to America's youth does the uh
00:41 does the name Mason Edens ring a
00:43 bell uh Senator you may have to give me
00:46 more specifics if you don't mind yeah he
00:47 was a 16-year-old Arkansan after a
00:49 breakup in 2022 he went on your platform
00:52 and searched for things like
00:53 inspirational quotes and positive
00:55 affirmations instead he was served up
00:58 numerous videos glamorizing suicide
01:01 until he killed himself with a
01:04 gun what about the name Chase
01:06 Nasca does that ring a bell would you mind
01:09 giving me more details please he was a
01:11 16-year-old who saw more than a thousand
01:13 videos on your platform about violence
01:15 and suicide until he took his own life
01:17 by stepping in front of a train
01:19 are you aware that his parents Dean and
01:21 Michelle are suing TikTok and
01:23 ByteDance for pushing their son to take his
01:25 own life uh yes I'm aware of that okay
01:31 finally Mr Chew um has the Federal Trade
01:34 Commission sued TikTok during the Biden
01:37 Administration Senator I cannot talk
01:39 about whether there's any are you being
01:41 are you currently being sued by the
01:42 Federal Trade Commission Senator I
01:44 cannot talk about any potential lawsuit
01:47 say potential actual are you being sued
01:48 by the Federal Trade Commission Senator
01:50 I think I've given you my answer I can
01:52 talk about no Ms Yaccarino's company is
01:55 being sued I believe Mr Zuckerberg's
01:57 company is being sued I believe yet
02:00 TikTok the agent of the Chinese Communist
02:02 party is not being sued by the Biden
02:04 Administration are you familiar with a
02:10 Kafara you may have to give me more
02:12 details Christina Kafara was a paid
02:14 adviser to ByteDance your Communist-
02:17 influenced parent company she was then
02:20 hired by the Biden FTC to advise on how
02:23 to sue Mr Zuckerberg's
02:26 company Senator ByteDance is a global company
02:31 report public reports indicate that your
02:33 lobbyists visited the White House more
02:35 than 40 times in 2022 how many times did
02:38 your company visit did your company's
02:39 lobbyist visit the White House last year
02:42 I I don't know that Senator are you are
02:43 you aware that the Biden campaign and
02:45 the Democratic National Committee is on
02:48 your platform they have TikTok accounts
02:50 Senator we encourage people to come on
02:52 to the platform they won't let their
02:54 staffers use their personal phones they
02:56 give them separate phones that they only
02:57 use TikTok on we encourage everyone to
02:59 join including so all these companies
03:02 are being sued by the FTC you're not the
03:04 FTC has a former paid advisor your
03:06 parent talking about how they can sue Mr
03:08 Zuckerberg's company Joe Biden's
03:10 re-election campaign and the Democratic
03:12 National Committee is on your platform
03:14 let me ask you have you or anyone else
03:17 at TikTok communicated with or
03:20 coordinated with the Biden
03:21 Administration the Biden campaign or the
03:23 Democratic National Committee to
03:25 influence the flow of information on
03:27 your platform we work with uh anyone any
03:30 creators who want to use our campaign
03:32 it's it's all the same um process that
03:34 we have okay so what we have here we
03:35 have a company that's a tool of the
03:37 Chinese Communist party that is
03:39 poisoning the minds of America's
03:40 children in some cases driving them to
03:42 suicide and that at best the Biden
03:44 Administration is taking a pass on at
03:46 worst may be in collaboration with thank
03:50 you Mr Chew thank you Senator Cotton so we're
03:53 going to take a break now we're on the
03:54 second roll call Members can take
03:56 advantage of that if they wish the break
03:58 will last about 10 minutes
04:00 please do your best to
04:12 to I think these guys
04:16 are excuse us sorry
06:21 and if we break through none of this
10:55 uh we've been going all through Los
13:13 oh you can I can give you my
13:46 yes please watch it you'll find it on
13:50 website I will thank you
23:06 Senate Judiciary Committee will resume
23:09 we have nine Senators who have not uh
23:11 asked questions yet in seven minute
23:13 rounds and we'll uh turn first to
23:18 Padilla thank you Mr chair um colleagues as
23:22 we reconvene uh I'm proud once again to
23:25 uh uh share that I am one of the few
23:30 senators with younger children uh and I
23:34 lead with that because as we are having
23:37 this conversation today uh it's not lost
23:39 on me that uh between my children who
23:41 are all now in the teen and pre-teen
23:43 category uh and their
23:46 friends uh I see this issue very up
23:51 close and personal um and in that spirit I want to
23:53 take a second to just acknowledge and
23:55 thank all the parents who who are in the
23:59 audience today many of whom have shared
24:00 their stories with our offices and I
24:04 credit them uh for um finding strength
24:08 through their suffering through their
24:10 struggle and channeling that into the
24:14 advocacy that is making a difference I
24:17 thank all of you um now I appreciate
24:21 again personally the challenges that
24:25 caretakers uh School personnel and
24:28 others face in helping our young people
24:30 navigate this uh world of social media
24:35 and technology in general now the
24:37 services our children are growing up
24:39 with uh provide them unrivaled access to
24:43 information I mean this is beyond what
24:44 previous generations uh have experienced
24:48 and that includes learning opportunities
24:50 socialization and much much more but we
24:53 also clearly have a lot of work to do to
24:57 better protect our children from
25:00 the Predators and predatory behavior
25:03 that these technologies have
25:06 enabled and yes Mr Zuckerberg that
25:08 includes exacerbating the mental health
25:14 crisis in America nearly all teens we know have
25:17 access to uh smartphones and the
25:20 internet uh and use the internet daily
25:24 and while Guardians do have primary
25:27 responsibility for caring for our
25:29 children the old adage says uh it takes
25:32 a village uh and so society as a whole
25:37 including leaders in the tech
25:40 industry must prioritize the health and
25:44 safety of our children now let me dive into my questions
25:47 and be specific platform by platform
25:49 Witness by witness on the topic of some
25:53 of the parental tools you have each made
25:55 reference to Mr Citron how many minors
25:58 are on Discord and how many of them have
26:00 caretakers that have adopted your family
26:02 center tool and if you don't have the
26:04 numbers just say that quickly and uh
26:06 provide that to our office uh we can
26:08 follow up with you on
26:10 that how have you ensured that young
26:13 people and their Guardians are aware of the
26:15 offer um we make it very clear
26:19 to teens on our platform what tools are
26:21 available and our te— what
26:23 specifically do you do what
26:26 may be clear to you is not clear to the
26:27 general public so what do you do in your
26:29 opinion to make it very clear uh so our
26:31 teen safety assist which is a feature
26:33 that um helps uh teens keep themselves
26:36 safe in addition to blocking and
26:38 blurring images that may be sent to them
26:39 that is on by default for teen accounts
26:41 and it cannot be turned off we also
26:44 market to our uh teen users
26:47 directly in our platform we launched our
26:49 family center we created a promotional
26:51 video we put it directly on our product
26:52 so when every Teen um opened the app in
26:55 fact every user opened the app they got
26:57 an alert like hey hey Discord has this
26:58 um they they want you to use it thank
27:00 you I look forward to the data that
27:01 we're requesting Mr Zuckerberg
27:03 across all of Meta's services from
27:05 Instagram Facebook messenger and Horizon
27:08 uh how many minors use your applications
27:10 and of those minors how many have a
27:12 caretaker that has adopted the parental
27:15 supervision tools that you
27:18 offer sorry I can follow up with the
27:20 specific stats on that okay it would be
27:21 very helpful not just for us to know but
27:23 for you to know as a leader of your
27:26 company uh and how same question how are
27:28 you ensuring that uh young people and
27:30 their Guardians are aware of the tools
27:32 that you offer uh we run pretty
27:35 extensive ad campaigns both on our
27:37 platforms and outside we work with
27:38 creators and organizations like Girl
27:41 Scouts to make sure
27:43 that there's broad awareness of the
27:45 tools okay Mr Spiegel how many minors use
27:48 Snapchat and of those minors how many
27:51 have caretakers that are registered with
27:52 your family center Senator I believe
27:55 approximately in the United States there
27:57 are approximately 20 million uh Teenage
27:59 users of Snapchat I believe
28:00 approximately 200,000 parents use Family
28:03 Center and about 400,000 teens have
28:05 linked their account to their parents
28:07 using Family Center so 200,000 400,000
28:09 sounds like a big number but small in
28:11 percentage of the minors using SnapChat
28:14 uh what are you doing to ensure that
28:15 young people and their Guardians are aware
28:16 of the tools you offer Senator we uh
28:19 create a banner for Family Center on the
28:21 users profile so that accounts we
28:23 believe may be of the age that they could
28:24 be parents can see uh the the entry
28:27 Point into Family Center easily okay uh
28:30 Mr Chew how many minors are on TikTok
28:32 and how many of them have a caregiver
28:33 that uses your family tools Senator I
28:36 need to get back to you on the specific
28:38 numbers um but we were one of the first
28:40 platforms to give what we call Family
28:42 pairing to parents you go to settings
28:44 you turn on a QR code your teenager's QR
28:46 code and yours you scan it and what it
28:48 allows you to do is you can set screen
28:50 time limits you can um filter out some
28:52 keywords you can turn on the more
28:54 restricted mode and we're always talking
28:56 to parents um I'm I met a you know a
28:58 group of parents and teenagers and the
28:59 high school teachers last week to talk
29:01 about what more we can provide in the
29:03 family pairing mode Ms Yaccarino how many
29:06 minors use X and are you planning to
29:08 implement safety measures or guidance
29:10 for caretakers like uh your peer
29:13 companies have thank you Senator less
29:16 than 1% of all us users are between the
29:19 ages of 13 and 17 less than 1% of how
29:23 many of 90 million us users okay so
29:26 still hundreds of thousands continue yes
29:29 yes and every single one is very
29:31 important uh being a 14-month-old
29:33 company we have reprioritized child
29:36 protection and safety measures and we
29:38 have just begun to talk about and
29:41 discuss how we can enhance those with
29:44 controls let me uh continue with a
29:46 followup question for um Mr Citron in
29:51 addition to keeping parents informed about
29:52 the nature of various internet services
29:55 there's a lot more we obviously need to
29:57 do for today's purposes while many
29:59 companies offer a broad range of quote
30:01 unquote user empowerment tools it's
30:04 helpful to understand whether young
30:05 people even find these tools helpful so
30:08 appreciate you sharing your teen safety
30:10 assist tool and how you're
30:12 advertising it but have you conducted
30:14 any assessments uh of how these features
30:17 are impacting minors' use of your
30:21 platform Our intention is to give
30:24 teens tools and capabilities that they can
30:26 use to keep themselves safe and also so our
30:28 teams can help keep teens safe um we
30:31 recently launched teen safety assist
30:32 last year and we I I do not have um a
30:34 study off the top of my head but we'd be
30:35 happy to follow up with you on that okay
30:37 uh my time is up I'll have followup
30:38 questions for uh each of you either in
30:40 the second round or through statements
30:42 for the record on a similar assessment
30:45 of the tools that you've proposed thank
30:47 you Mr chair thank you Senator Padilla
30:52 Senator Kennedy thank you all for being
31:00 here Mr Spiegel I see you hiding down
31:04 there what does yada yada yada
31:09 mean I'm not familiar with the term
31:16 uncool can we agree that what you do not
31:21 say what you do is what you believe and
31:24 everything else is just cottage cheese
31:30 yes Senator you agree with that speak up
31:36 don't be shy I I I've listened to you today
31:41 I've heard a lot of yada yada
31:44 yada-ing and I've heard you talk about the
31:46 reforms you've made and I appreciate
31:50 them and I've heard you talk about the
31:52 reforms you're going to
31:58 I don't think you're going to solve the
32:00 problem I think Congress is going to
32:03 have to help you I think the reforms you're talking
32:06 about to some extent are going to be
32:09 putting putting paint on rotten
32:12 wood and I'm not sure you're going to
32:14 support this legislation I'm
32:17 not um the the fact is that you and some
32:21 of your internet colleagues who are not
32:24 here are no longer companies you're
32:29 countries you're you're very very
32:33 powerful and you and some of your
32:36 colleagues who are not
32:37 here have blocked everything we have
32:42 tried to do in terms of reasonable
32:46 regulation everything from privacy to
32:52 exploitation and um in fact we we have a
32:56 new definition of
32:59 recession um a recession is when we know
33:02 we're in a recession when Google has to
33:04 lay off 25 members of
33:07 Congress that's what we're down
33:10 to we're also down to this fact that
33:13 your platforms are hurting
33:15 children I'm not saying they're not
33:18 doing some good things but they're
33:20 children and I know how to count votes
33:23 and if this bill comes to the floor of
33:25 the United States Senate it will pass
33:28 what we're going to have to do and I say
33:31 this with all the respect I can muster
33:33 is convince my good friend Senator
33:37 Schumer to go to Amazon buy a spine online
33:41 and bring this bill to the senate
33:44 floor and uh the house will then pass
33:49 it now that's that's one person's
33:53 opinion I may be wrong but I doubt about
33:58 it uh Mr Zuckerberg let me ask you a
34:01 couple of questions let's I might wax a
34:03 little philosophical here
34:07 um I have to hand it to
34:11 you uh you you have
34:14 um you have convinced over two billion
34:19 people to give up all of their personal
34:22 information every bit of it
34:28 so they can see what their high school friends had
34:31 for dinner Saturday
34:33 night that's pretty much your business
34:37 it it's not how I would characterize it
34:40 and we give people the ability to
34:41 connect with the people they care about
34:43 and um and to engage with the topics
34:46 that they care about and you and you take
34:51 this information this abundance of personal
34:54 information and then you develop algorithms
34:58 to punch people's hot
35:02 buttons and send and steer to
35:05 them information that punches their hot
35:08 buttons again and again and again to
35:11 keep them coming back and to keep them
35:14 longer and as a result your users see
35:19 only one side of an
35:21 issue and so to some extent your
35:25 platform has become a killing field for
35:27 the truth hasn't it I mean Senator I
35:30 disagree with that that characterization
35:33 um you know we build ranking and
35:35 recommendations because people have a
35:37 lot of friends and a lot of interests
35:39 and they want to make sure that they see
35:41 the content that's relevant to them um
35:43 we're trying to make a product that's
35:44 useful to people and and make our
35:46 services um as helpful as possible for
35:48 people to connect with the people they
35:49 care about and the interests they care
35:51 about but you don't show them both sides
35:53 you don't give them balanced information
35:55 you just keep punching their hot
35:57 buttons punching their hot buttons you
35:59 don't show them balanced information so
36:02 people can discern the truth for
36:04 themselves and and you rev them up so
36:07 much that that so often your platform
36:11 others becomes just cesspools of
36:15 snark where nobody learns anything don't
36:18 they well Senator I disagree with that I
36:21 think people can engage in the things
36:22 that they're interested in um and learn
36:25 quite a bit about those we have done a a
36:28 handful of different experiments and
36:30 things in the past around news and
36:32 trying to show content on you know
36:34 diverse set of of of perspectives I
36:37 think that there's more that needs to be
36:39 explored there but I don't think that we
36:42 ourselves do you think I'm sorry to cut
36:44 you off Mr Mr President but I'm going to
36:47 run out of time do do you think your
36:50 users really understand what they're
36:53 giving to you all their personal
36:55 information and how you how you process
36:58 it and how you monetize it do you think
37:00 people really understand uh Senator I
37:04 think people understand the basic terms
37:08 I mean I think that there's I actually
37:10 think that a lot of people— it's
37:12 been a couple years since we talked
37:14 about this does your user agreement
37:18 still suck I I'm not sure how to answer that Senator
37:21 can you still hide can you still hide a
37:23 dead body in all that legalese
37:27 where nobody can find it Senator I'm not
37:30 I'm not quite sure what you're referring
37:31 to but I think people get the basic deal
37:34 of using these Services it's a free
37:35 service you're using it to connect with
37:37 the people you care about if you share
37:39 something with people other people will
37:41 be able to see your information it's
37:42 it's inherently you know if you're
37:44 putting something out there to be shared
37:45 publicly um or with a private set of
37:47 people it's you know you're inherently
37:49 putting it out there so I think people
37:51 get that basic part of how Mr Zuckerberg
37:54 you're in the foothills of creepy you
37:56 you track you track you track
37:59 people who aren't even Facebook users
38:03 you track your own people your own users
38:06 who are your product even even when they're not on
38:12 Facebook I mean I'm I'm going to land
38:14 This Plane pretty quickly Mr chairman I
38:17 I I mean it's creepy and I understand
38:19 you make a lot of money doing
38:22 it but I just wonder if if our technology
38:28 is greater than our Humanity I mean let
38:31 me ask you this final question Instagram is
38:37 harmful to young people isn't
38:40 it Senator I disagree with that that's
38:42 not what the research shows on balance
38:45 that doesn't mean that individual people
38:46 don't have issues and that there aren't
38:48 things that that we need to do to to
38:50 help provide the right tools for people
38:52 but across all the research that we've
38:54 done internally I I mean this this the
38:57 uh you know survey that uh the senator
39:00 previously cited um you know there are
39:03 12 or 15 different categories of harm
39:06 that we asked um teens if they felt that
39:09 Instagram made it worse or better and
39:12 across all of them except for the one
39:13 that that that um that Senator Hawley
39:15 cited um more people said that using
39:18 Instagram made the issues they faced either
39:21 positive or let me we're just going have
39:23 to agree to disagree if if you believe
39:25 that Instagram I it's I'm not saying
39:27 it's intentional but if you agree that
39:30 Instagram if you think that Instagram is
39:32 not hurting millions of our young people
39:35 particularly young teens particularly
39:37 young women you shouldn't be
39:47 Senator Butler thank you Mr chair and um thank
39:50 you to um our panelists who've come to
39:54 uh have an important conversation
39:56 with us most importantly I want to
39:58 appreciate the families uh who have uh
40:01 shown up to continue to be remarkable um
40:04 champions of your children and your
40:06 loved ones for um being here and in
40:09 particular two California families um
40:12 that I was able to just talk to on on
40:14 the break the families of Sammy Chapman
40:17 from Los Angeles and Daniel Puerta uh
40:20 from Santa Clarita uh they are are here
40:23 today and are doing some incredible work
40:26 uh to not just protect the memory and
40:29 Legacy of their boys um but the work
40:32 that they're doing is going to protect
40:34 my nine-year-old uh and that is uh
40:37 indeed why we're here there are a couple
40:39 questions that I want to ask um some
40:42 individuals let me start with a question
40:44 for each of you uh Mr Citron have you
40:48 ever sat with a family and talked about
40:50 their experience and what they need from
40:52 your product yes or
40:54 no uh yes I have spoken with parents
40:57 about how we can build tools to help
40:59 them Mr Spiegel have you sat with
41:01 families and young people to talk about
41:03 your products and what they need from
41:04 your product yes Senator Mr Chew yes I
41:09 just did it two weeks ago for example I
41:12 don't want to know what you did for the
41:13 hearing prep Mr Chew I just wanted to
41:15 know if anything in terms of
41:20 designing the product that you are
41:25 creating Mr Zuckerberg um have you sat
41:29 with parents and young people to talk
41:30 about how you design product uh for uh
41:33 your cons for your uh consumers yes over
41:37 the years I've had a lot of
41:38 conversations with parents you know
41:40 that's interesting Mr Zuckerberg because
41:41 we talked about this last night and you
41:43 gave me a very different
41:45 answer I asked you this very
41:49 question well I I told you that I wasn't
41:53 that I didn't know what specific
41:55 processes our company has no Mr
41:57 Zuckerberg you said to me that you had
42:01 not I I must have misspoken I I want to
42:04 give you the room to misspeak Mr
42:07 Zuckerberg but I asked you this very
42:09 question I asked all of you this
42:11 question uh and you told me a very
42:14 different answer when we spoke but I
42:17 won't belabor it can I um a number of
42:20 you have talked about the I'm sorry X
42:23 uh Ms Yaccarino have you talked to
42:25 parents directly young people but about
42:29 product as a new leader of X the answer
42:32 is yes I've spoken to them about the
42:35 behavioral patterns because less than
42:39 1% of our users are in that age group
42:42 but yes I have spoken to them thank you
42:44 ma'am Mr Spiegel um there are a number
42:48 of parents whose uh children have been
42:51 able to access uh illegal drugs on your
42:54 platform what do you say to those
42:59 parents well Senator we are devastated
43:02 that we cannot to the parents what do
43:04 you say to those parents Mr Spiegel I'm
43:07 so sorry that we have not been able to
43:09 prevent these tragedies we work very
43:11 hard to block all Search terms related
43:14 to drugs from our platform we
43:15 proactively look for and detect drug-
43:17 related content we remove it from our
43:19 platform preserve it as evidence we and
43:21 then we refer it to law enforcement uh
43:23 for Action we've worked together with
43:26 nonprofits and with families on
43:27 education campaigns because the scale of
43:29 the fentanyl epidemic is extraordinary over
43:31 100,000 people lost their lives last
43:33 year and we believe people need to know
43:35 that one pill can kill that campaign
43:37 reached more than— was viewed more
43:38 than 260 million times on Snapchat we
43:42 also there are two fathers in this room
43:44 who lost their sons they're 16 years
43:48 old their children were able to get
43:56 I know that there are statistics and I
43:58 know that there are good efforts none of
44:01 those efforts are keeping our kids from
44:04 getting access to those drugs on your
44:06 platform uh as a California company all
44:09 of you I've talked with you about what
44:11 it means to be a good neighbor and what
44:12 California families and American
44:14 families should be expecting from you
44:17 you owe them more than just a set of
44:20 statistics uh and I look forward to you
44:22 showing up on all pieces of this
44:24 legislation all of you showing up on all
44:27 pieces of legislation to keep our
44:28 children safe Mr Zuckerberg I want to
44:30 come back to you I um talked with you
44:36 parent to a young child um who doesn't
44:40 have a phone doesn't you know is not on
44:43 social media at all um and one of the
44:47 things that I am deeply concerned with
44:49 uh as uh a parent to a young black girl
44:55 is the utilization of uh filters on your
45:01 platform that would suggest to young
45:05 girls utilizing your platform the
45:08 evidence that they are not good enough
45:15 I want to ask more specifically and refer to
45:19 some unredacted court documents that
45:22 reveal that your own
45:24 researchers uh concluded that these
45:26 face filters that mimic plastic
45:30 surgery negatively impact youth mental
45:33 health indeed uh and well-being why
45:38 should we believe why should we believe
45:40 that you're going to do more to
45:44 protect young women and young girls when
45:47 it is that you give them the tools to
45:50 affirm the self-hate that is spewed
45:53 across your platforms why is it that we
45:55 should believe that you are committed to
45:58 doing anything more to keep our children
46:01 safe sorry there's a lot to unpack there
46:03 we give people tools to express themselves
46:06 in different ways and people use face
46:09 filters and different tools to make
46:12 media and photos and videos that are fun
46:15 or interesting um across a lot of the
46:18 different products that we offer—
46:19 are plastic surgery pins good tools to express
46:23 creativity um Senator I'm not speaking
46:27 to that are skin lightening tools tools
46:30 to express creativity this is the direct
46:32 thing that I'm asking about not
46:34 defending any specific one of those I
46:36 think that the ability to kind of filter
46:41 and um and edit images is generally a
46:44 useful tool for expression for that
46:47 specifically I'm I'm not familiar with
46:48 the study that you're referring to but
46:50 we did make it so that we're not
46:52 recommending this type of content to
46:55 teens no no not a study I referred to
46:58 court documents that revealed your
47:01 knowledge of the impact of these types
47:04 of filters on young people generally
47:07 young girls in particular I disagree
47:09 with that characterization I I think
47:11 that there's— court documents— I'm I
47:14 haven't seen any document that says okay
47:16 Mr Zuckerberg my my time is up um I
47:20 hope that you hear what is being offered
47:22 to you and are prepared to step up and
47:25 do better I know this senate committee
47:28 uh is going to do our work to hold you
47:30 to greater account thank you Mr chair
47:35 Senator Tillis thank you Mr chair thank you all
47:38 for being here the um I I don't feel
47:41 like I'm going to have an opportunity to
47:43 ask a lot of questions so I'm going to
47:45 reserve the right to submit some for the
47:47 record but I I have heard we've had
47:51 hearings like this before I've been in
47:53 the senate for nine years I've heard
47:55 heard hearings like this before I've
47:57 heard horrible stories about uh people
48:00 who have died committed suicide uh been
48:04 embarrassed um every year we have an
48:07 annual flogging every year and what
48:11 material has occurred over the last nine
48:14 years um do any of you all just yes
48:19 or no question do any of y'all
48:20 participate in an industry Consortium
48:22 trying to make this fundamentally safe
48:28 Mr Zuckerberg there's a variety of
48:32 organizations which organizations I should say
48:34 does anyone here not participate in an
48:37 industry consortium I I actually think it would
48:39 be immoral for you all to consider it a
48:42 strategic advantage to keep safe or to
48:45 keep private something that would secure
48:48 all these platforms to avoid this sort
48:50 of problem do you all agree with that that
48:51 anybody that would be saying you want
48:53 ours because ours is the safest and
48:55 these haven't figured out the secret
48:56 sauce that you as an industry realize
48:58 this is an existential threat to you all
49:00 if we don't get it right right I mean
49:01 you've you've got to secure your
49:03 platforms you got to deal with this do
49:04 do you not have an inherent mandate to
49:08 do this because it would seem to me if
49:10 you don't you're going to cease to exist
49:13 I mean we could regulate you out of
49:15 business if we wanted to and the reason
49:17 I'm saying it may sound like a criticism
49:19 it's not a criticism I think we have to
49:21 understand that there should be an
49:23 inherent motivation to get this right or
49:26 Congress will make a decision that could
49:28 potentially put you out of business
49:30 here's the reason I have a concern with
49:31 that though I I just went on the
49:33 internet uh while I was listening
49:35 intently to all the other members
49:37 speaking and I found a dozen different
49:41 uh platforms outside the United States
49:44 10 of which are in China two of which
49:46 are in Russia uh their daily average
49:49 subscriber or active membership numbers in
49:52 the billions well people say you can't
49:54 get on China's version of TikTok it took
49:58 me one quick search on my favorite
50:01 search engine to find out exactly how I
50:03 could get an account on this platform
50:08 today um and so the other thing that we
50:11 have to keep in mind I come from
50:13 technology I could figure out ladies and
50:15 gentlemen I could figure out how to
50:17 influence your kid without them ever
50:19 being on a social media platform I can
50:22 randomly send text and get a bite and
50:24 then find out an email address and get
50:28 information um it is horrible
50:31 to hear some of these stories and I have
50:33 shared them and I've had these stories
50:35 occur in my hometown down in North
50:38 Carolina but if we only come here and
50:41 make a point today and don't start
50:43 focusing on making a difference which
50:45 requires people to stop shouting and
50:48 start listening and start passing
50:51 language here the Bad actors are just
50:53 going to be off our Shores I have
50:55 another question for you all how much do
50:57 how many people roughly if you don't
50:59 know the exact numb okay roughly how
51:00 many people do you have looking 24 hours
51:03 a day at these horrible images and just
51:06 go real quick with an answer down the
51:07 line and filtering it out um it's it's
51:10 most of the 40,000 people who work
51:13 on safety and again we have 2,300 people
51:16 all over the world okay we have 40,000
51:19 trust and safety professionals around
51:22 the world we have approximately 2,000 people
51:24 dedicated to trust and safety and
51:27 moderation um our platform is much
51:30 much smaller than these folks we have
51:31 hundreds of people and it's um looking
51:33 at content 50% of our work I've mentioned
51:35 these people have a horrible job
51:37 many of them experience um they they
51:41 have to get counseling for all the
51:42 things they see we have evil people out
51:45 there and we're not going to fix this by
51:47 shouting past or talking past each other
51:49 we're going to fix this by every one of
51:51 y'all being at the table and hopefully
51:52 coming closer to what I heard one person
51:55 say supporting a lot of the good
51:57 bills like one that I hope Senator
51:58 Blackburn mentions when she gets a
52:00 chance to talk but guys if you're not at
52:03 the table and securing these platforms
52:05 you're going to be on it and and and the
52:07 reason why I'm not okay with that is
52:10 that if we ultimately destroy your
52:13 ability to create value and drive you
52:15 out of business the evil people will
52:17 find another way to get to these
52:20 children and I do have to admit I don't
52:23 think my mom's watching this one but
52:25 there is good we we can't look past good
52:27 that is occurring my mom who lives in
52:30 Nashville Tennessee and I talked
52:31 yesterday and we talked about a Facebook
52:33 post that she made a couple of days ago
52:36 we don't let her talk to anybody else
52:38 that connects my 92-year-old
52:40 mother with uh with her grandchildren
52:43 and great-grandchildren that lets a kid
52:45 who may feel awkward in school to get
52:47 into a group of people and relate to
52:49 people let's not throw out the good
52:53 while we all together focus
52:56 on rooting out the bad now I guarantee
52:59 you I could go through some of your
53:00 governance documents and find a reason
53:02 to flog every single one of you because
53:04 you didn't place the emphasis on it that
53:07 I think you should but at the end of the
53:09 day I find it hard to believe that
53:11 any of you people started this business
53:13 some of you in your college dorm rooms
53:15 for the purposes of creating the evil
53:17 that is being perpetrated on your
53:19 platforms but I hope that every single
53:22 waking hour you're doing everything you
53:25 can to reduce it you're not going to be
53:28 able to eliminate it and I hope that
53:30 there are some enterprising young tech
53:32 people out there today that are going to
53:34 go to parents and say ladies and
53:36 gentlemen your children have a deadly
53:39 weapon they have a potentially deadly
53:42 weapon whether it's a phone or a tablet
53:47 you have to secure it you can't assume
53:50 that they're going to be honest and say
53:51 that they're 16 when they're 12 uh we
53:55 all have to recognize that we have a
53:57 responsibility to play and you guys are
54:00 at the tip of the spear so I hope that
54:03 we can get to a point to where we are
54:05 moving these bills if you got a problem
54:08 with them State your problem let's fix
54:10 it no is not an answer and know
54:14 that I want the United States to be the
54:17 beacon for Innovation to be the beacon
54:19 for safety and to prevent people from
54:22 using other options that have existed
54:24 since the internet has existed to
54:27 exploit people and Count Me In as
54:29 somebody that will try and help out
54:31 thank you Mr chair thank you Senator
54:33 Tillis next is Senator Ossoff thank you Mr
54:36 chairman and uh thank you to our
54:42 witnesses Mr Zuckerberg I want to begin by just
54:44 asking a simple question which is do you
54:46 want kids to use your platform more or
54:49 less well we don't want people under the
54:51 age of 13 using it do you want teenagers 13
54:54 and up to use your platform more or less
54:58 um well we would like to build a product
55:00 that is useful and that people want to
55:01 use more my time is is going to be
55:03 limited so it's just do you want them to
55:05 use it more or less teenagers 13 to 17
55:09 years old do you want them using meta
55:10 products more or less I'd like them to
55:13 be useful enough that they want to use
55:15 them more you want them to use it
55:19 more I think herein we
55:22 have one of the fundamental challenges
55:25 in in fact you have a fiduciary
55:27 obligation do you not to try to get kids
55:30 to use your platform
55:32 more it depends on how you define that
55:35 um we we obviously are a business um but
55:38 I'm sorry Mr Zuckerberg it's just that
55:40 our time is limited is it not
55:42 self-evident that you have a fiduciary
55:44 obligation to get your users including
55:47 users under 18 to use and engage with
55:49 your platform more rather than less
55:53 correct over the long term but in
55:55 the near term we often take a lot of
55:57 steps including we made a change to
56:00 show fewer videos on the
56:02 platform that reduced the amount of time by
56:04 more than 50 million hours but if your
56:05 shareholders ask you
56:08 Mark I wouldn't call you that Mr Zuckerberg here but
56:11 your shareholders might be on a
56:11 first-name basis with you Mark are you
56:14 trying to get kids to use meta products
56:16 more or less you'd say more right well I
56:19 would say that over the long term we're
56:20 trying to create the most let's look at
56:23 the 10-K you file with the SEC a few
56:24 things I want to note here are some
56:26 quotes and this is a a filing that you
56:28 sign correct yes yeah our financial
56:31 performance has been and will continue
56:33 to be significantly determined by our
56:35 success in adding retaining and engaging
56:37 active users here's another quote if our
56:40 users decrease their level of Engagement
56:42 with our products our Revenue Financial
56:44 results in business may be significantly
56:45 harmed here's another quote we believe
56:47 that some users particularly younger
56:49 users are aware of and actively engaging
56:52 with other products and services similar
56:53 to or as a substitute for ours it
56:55 continues in the event that users
56:57 increasingly engage with other products
56:58 and services we may experience a decline
57:00 in use and engagement in key
57:01 demographics or more broadly in which
57:03 case our business would likely be harmed
57:08 you have an obligation as the chief
57:10 executive to encourage your team to get
57:18 more Senator is that not
57:21 self-evident you have a fiduciary
57:23 obligation to your shareholders to get
57:25 kids to use your platform more I I think
57:26 that the thing that's not intuitive is
57:29 the the direction is to make the
57:32 products more useful so that way people
57:34 want to use them more we don't give our
57:36 the teams running the Instagram feed or
57:38 the Facebook feed a goal to increase the
57:41 amount of time that people spend yeah
57:42 but you don't dispute and your
57:43 10-K makes clear you want your users
57:46 engaging more and using more the
57:48 platform and I think this gets to the
57:50 root of the challenge
57:52 because it's the overwhelming view of
57:54 the public certainly in my home
57:56 state of Georgia uh and we've had some
57:59 discussions about the underlying science
58:01 that this platform is
58:03 harmful for children I mean you are
58:06 familiar with and not just your platform
58:09 by the way social media in general 2023
58:12 report from the Surgeon General about
58:13 the impact of social media on kids
58:15 mental health which cited evidence that
58:17 kids who spend more than three hours a
58:18 day on social media have double the risk
58:20 of poor mental health outcomes including
58:22 depression and anxiety you're familiar
58:23 with that Surgeon General report the
58:25 underlying study I I read the report yes
58:28 do you dispute it no but I think it's
58:31 important to characterize it correctly I
58:32 think what he was flagging in the report
58:34 is that there seems to be a correlation
58:37 and obviously the mental health issue is
58:39 very important so it's something that
58:40 needs to be addressed the thing is
58:43 everyone knows there's a correlation
58:44 everyone knows that kids who spend a lot
58:47 of time too much time on your platforms
58:50 are at risk and it's not just the mental
58:53 health issues I mean let me ask you a
58:54 question is your platform safe
58:56 for kids I believe it is but there's an
58:59 important difference between correlation
59:00 and causation you see because we're not
59:02 going to be able to get anywhere we want to
59:05 work in a productive open honest and
59:08 collaborative way with the private
59:10 sector to pass legislation that will
59:12 protect Americans that will protect
59:15 American children above all and that
59:17 will allow businesses to thrive in this
59:18 country if we don't start with an open
59:20 honest candid realistic assessment of
59:22 the issues we can't do that the first
59:24 point is you want kids to use the
59:26 platform more in fact you have an
59:28 obligation to but if you're not willing
59:30 to acknowledge it's a dangerous place
59:32 for children the internet is a dangerous
59:35 place for children not just your
59:36 platform isn't it isn't the internet a
59:37 dangerous place for children I think it
59:39 can be yeah there's both great things
59:41 that people can do and there are harms
59:43 that we need to work to yeah it's a
59:44 dangerous place for children there are
59:45 families here who have lost their
59:47 children there are families across the
59:48 country whose children have engaged in
59:50 self harm who have experienced low
59:52 self-esteem who have been sold deadly
59:54 pills on the internet the internet's a
59:56 dangerous place for children and your
59:58 platforms are dangerous places for
01:00:00 children do you agree I think that there
01:00:03 are harms that we need to work to
01:00:04 mitigate okay I I'm not gonna I think
01:00:07 overall why not why not just acknowledge
01:00:09 it why why do we have to do the very
01:00:12 careful I just I disagree with the
01:00:14 characterization that the internet's a
01:00:15 dangerous place for children um I I
01:00:18 think you're you're trying to
01:00:19 characterize our products as inherently
01:00:21 dangerous and I think that inherent or
01:00:23 not your products are places where
01:00:25 children can experience harm they can
01:00:27 experience harm to their mental health
01:00:29 they can be sold drugs they can be
01:00:30 preyed upon by predators that you know
01:00:33 they're dangerous places and and and yet
01:00:37 you have an obligation to promote the
01:00:40 use of these platforms by children and
01:00:43 look all I'm all I'm trying to suggest
01:00:45 to you Mr Zuckerberg and my my time is
01:00:50 short is that in order for you to
01:00:52 succeed you and your colleagues here we
01:00:54 have to acknowledge these basic truths
01:00:56 we have to be able to come before the
01:00:58 American people the American public the
01:00:59 people in my state of Georgia and
01:01:01 acknowledge the internet is
01:01:03 dangerous including your platforms there
01:01:06 are predators lurking there are drugs
01:01:08 being sold there are harms to mental
01:01:09 health that are taking a huge toll on
01:01:13 kids quality of life and yet you have
01:01:16 this incentive not just you Mr
01:01:18 Zuckerberg all of you have an incentive
01:01:19 to boost maximize use utilization and
01:01:22 engagement and that is where public
01:01:24 public policy has to step in to make
01:01:27 sure that these platforms are safe for
01:01:29 kids so kids are not dying so kids are
01:01:31 not overdosing so kids are not cutting
01:01:33 themselves or killing themselves because
01:01:35 they're spending all day scrolling
01:01:38 instead of playing outside and I
01:01:39 appreciate all of you for your testimony
01:01:41 we will continue to engage as we develop
01:01:44 this legislation thank
01:01:48 you senator from
01:01:51 Tennessee thank you Mr chairman thank
01:01:53 you to each of you for coming and I
01:01:58 know some of you had to be subpoenaed to
01:02:00 get here but we do appreciate that you
01:02:03 all are here Mr Chew I want to come to
01:02:04 you first uh we've heard that you're
01:02:07 looking at putting a headquarters in
01:02:10 Nashville and likewise in Silicon Valley
01:02:13 and Seattle and what you're going to
01:02:15 find probably is that the welcome mat is
01:02:17 not going to be rolled out for you in
01:02:20 Nashville like it would be in California
01:02:23 there are a lot of people in Tennessee that
01:02:24 are very concerned about the way
01:02:28 TikTok is basically building dossiers on
01:02:31 our kids the way they are building those
01:02:35 on their virtual you and also that that
01:02:39 information is held in China in Beijing
01:02:42 as you responded to Senator Blumenthal
01:02:45 and I last year in reference to that
01:02:49 question and we also know that a major
01:02:51 music label yesterday said they were
01:02:54 pulling all of their content off your
01:02:57 site because of your issues on payment
01:03:01 on artificial intelligence and because
01:03:04 of the negative impact on our kids
01:03:07 mental health so we will see how that
01:03:11 progresses uh Mr Zuckerberg I want to
01:03:14 come to you uh we have just had Senator
01:03:18 Blumenthal and I of course have had some
01:03:20 internal documents and emails that have
01:03:23 come our way one of the things that
01:03:25 really concerned me is that you referred
01:03:28 to your young users in terms of their
01:03:32 lifetime value of roughly $270 per
01:03:38 teenager and each of you should be
01:03:41 looking at these
01:03:44 kids the t-shirts they're wearing
01:03:47 today say I'm worth more than
01:03:53 $270 we've got some standing up
01:04:03 now and some of the children from our
01:04:06 state some of the children the parents
01:04:09 that we have worked with just to think
01:04:13 whether it is Becca Schmidt David mik
01:04:17 Sarah flat and Lee
01:04:21 sh would you say that Lee is only worth
01:04:28 $270 what could possibly lead you I mean
01:04:32 I listen to that I know you're a dad I'm
01:04:36 a grandmom and how could you possibly even
01:04:41 have that thought it is astounding to me
01:04:46 and I think this is one of the reasons
01:04:48 that 42 states are now suing
01:04:54 because of features that they consider
01:04:58 addictive that you are pushing forward
01:05:01 and in the emails that we've got from
01:05:04 2021 that go from August to
01:05:07 November there is the staff plan that is
01:05:10 being discussed and Antigone Davis Nick Clegg
01:05:13 Sheryl Sandberg Chris Cox Alex Schultz
01:05:16 Adam Mosseri are all on this chain of
01:05:19 emails on the well-being plan and then
01:05:22 we get to one Nick did email Mark
01:05:26 to emphasize his support for
01:05:29 the package but it sounds like it lost
01:05:32 out to various other pressures and
01:05:35 priorities see this is what bothers
01:05:39 us children are not your priority
01:05:42 children are your
01:05:44 product children you see as a way to
01:05:51 make money and when it came to protecting children
01:05:54 in this virtual space you made a
01:05:59 decision even though Nick
01:06:03 Clegg and others were going through the
01:06:06 process of saying this is what we should do
01:06:10 these documents are really illuminating
01:06:13 and it just shows me that growing this
01:06:19 business expanding your
01:06:23 Revenue what you were going to put on
01:06:25 those quarterly filings that was the
01:06:28 priority the children were not it's very
01:06:33 clear um I want to talk with you about
01:06:36 the pedophile ring because that came up
01:06:39 earlier and the Wall Street Journal
01:06:41 reported on that and one of the things
01:06:45 that we found out was after that became
01:06:47 evident then you didn't take that
01:06:51 content down and it was content that
01:06:53 showed that teens were for sale and were
01:06:56 offering themselves to older men and you
01:07:00 didn't take it down because it didn't
01:07:02 violate your community standards do you
01:07:05 know how often a child is bought or sold
01:07:09 for sex in this country every two
01:07:11 minutes every two minutes a child is
01:07:16 bought or sold for sex that's not my
01:07:20 stat that is a TBI stat now finally
01:07:28 content was taken down after a
01:07:30 congressional staffer went to meta's
01:07:33 Global head of safety so would you
01:07:36 please explain to me and to all these
01:07:38 parents why explicit predatory content
01:07:42 does not violate your platform's terms
01:07:45 of service or your community
01:07:48 standards sure Senator let me try to
01:07:50 address all the things that you just
01:07:51 said it does violate our standards we
01:07:54 work very hard to take it down didn't
01:07:56 take it down we've well we've reported I
01:07:59 think it's more than 26 million examples
01:08:01 of this kind of content didn't take it
01:08:03 down until a congressional staffer
01:08:05 brought it up it it may be that in this
01:08:07 case we made a mistake and missed
01:08:08 something you make a lot of mistakes
01:08:11 I want to talk with you about
01:08:14 your Instagram creators program and
01:08:17 about the push we found out through
01:08:19 these documents that you actually are
01:08:22 pushing forward because you want
01:08:25 to bring kids in early you see these
01:08:29 younger tweenagers as valuable but an
01:08:32 untapped audience quoting from the
01:08:35 emails and suggesting teens are actually
01:08:38 household influencers to bring their
01:08:42 siblings into your platform into
01:08:45 Instagram now how can you ensure that
01:08:49 Instagram creators your
01:08:53 program does not facilitate illegal
01:08:56 activities when you fail to remove
01:09:00 content pertaining to the sale of minors
01:09:05 and it is happening once every two
01:09:09 minutes in this country um Senator our tools for
01:09:13 identifying that kind of content are
01:09:15 industry leading that doesn't mean we're
01:09:16 perfect there are definitely issues that
01:09:18 we have but we continue Mr Zuckerberg yes there are
01:09:22 a lot that is slipping through it
01:09:24 appears that you're trying to be the premier
01:09:27 sex trafficking site Senator Senator that's
01:09:30 ridiculous it is not ridiculous you want
01:09:32 to turn around and say you don't want this content on
01:09:34 our platforms why don't you take it down
01:09:39 we're willing to work with you no you're
01:09:43 not you are not and the problem is we've
01:09:48 been working on this Senator Welch is
01:09:49 over there we've been working on this
01:09:51 stuff for a decade you have an army
01:09:54 of lawyers and lobbyists that have
01:09:57 fought us on this every step of the way
01:10:00 you work with NetChoice the Cato
01:10:02 Institute the Taxpayers Protection Alliance
01:10:04 and the Chamber of Progress to actually
01:10:08 fight our bipartisan legislation to keep
01:10:12 kids safe online so are you going to
01:10:16 stop funding these groups are you going
01:10:18 to stop lobbying against this and come
01:10:21 to the table and work with us yes or no
01:10:23 Senator we have a yes or no of course
01:10:27 we'll work with you on on the
01:10:28 legislation the door is open we've got
01:10:31 all these bills you need you need to
01:10:34 come to the table each and every one of
01:10:36 you need to come to the table and you
01:10:39 need to work with us kids are
01:10:46 dying Senator Welch uh I want to thank
01:10:50 my colleague Senator Blackburn for her
01:10:52 decade of work on this
01:10:56 I actually have some
01:10:59 optimism there is a consensus today that
01:11:03 didn't exist say 10 years ago that there
01:11:06 is a profound threat to children to
01:11:10 mental health to safety there's not a
01:11:13 dispute that was in debate before that's
01:11:21 secondly we're identifying
01:11:24 concrete things that can be done in four
01:11:26 different areas one is industry
01:11:30 standards two is
01:11:33 legislation three is the
01:11:36 courts and then four is a proposal that
01:11:39 Senator Bennett Senator
01:11:42 Graham myself and Senator Warren have to
01:11:45 establish an agency a governmental
01:11:47 agency whose responsibility would be to
01:11:52 engage in this on a systematic regular
01:11:55 basis with proper resources and I just
01:11:57 want to go through those I appreciate
01:12:00 the industry standard decisions and
01:12:03 steps that you've taken as
01:12:06 companies but it's not enough uh and
01:12:09 that's what I think you're hearing from
01:12:10 my colleagues like for instance where
01:12:13 there are layoffs in the
01:12:15 trust and safety programs
01:12:20 uh that's alarming because it looks like
01:12:22 there is a reduction in emphasis on
01:12:26 protecting things Ms Yaccarino you just added
01:12:29 100 employees in Texas in
01:12:32 this category uh and how many did you
01:12:36 have before the company is just coming
01:12:39 through a significant
01:12:41 restructuring so we've increased the
01:12:44 number of trust and safety employees and
01:12:46 agents all over the world by at least
01:12:48 10% so far in the last 14 months and we
01:12:51 will continue to do so specifically in
01:12:53 Austin Texas all right Mr Zuckerberg my
01:12:56 understanding is there have been layoffs
01:12:57 in that area as well there's added jobs
01:13:01 there at Twitter but uh at Meta have
01:13:03 there been reductions in that there have
01:13:05 been across the board not really focused
01:13:07 on that area I think our our investment
01:13:09 is is relatively consistent over the
01:13:12 last couple of years we we invested
01:13:14 almost five billion dollars in this work
01:13:16 last year and I think this year will be
01:13:18 on the same order of magnitude all right
01:13:20 and another question that's come up is
01:13:22 when it comes to a user of any of
01:13:25 your platforms somebody has an image on
01:13:28 there that's very compromising often of
01:13:31 a sexual nature is there any reason in
01:13:33 the world why a person who wants to take
01:13:37 that down can't have a very simple same
01:13:40 day response to have it taken
01:13:44 down I'll start with Twitter or X
01:13:48 now I'm sorry Senator I was taking notes
01:13:51 could you repeat the question well it
01:13:53 there's a lot of examples of a young
01:13:56 person finding out about an image that
01:14:00 is of them and really compromises them
01:14:02 and actually can create suicidal
01:14:05 thoughts and they want to call up or
01:14:07 they want to send an email and say take
01:14:09 it down I mean why is it not possible
01:14:12 for that to be responded to immediately
01:14:15 well we all strive to take down any type
01:14:18 of uh violative content or disturbing
01:14:21 content immediately at X we have increased
01:14:25 our capabilities with a two-step
01:14:27 reporting process if I'm a parent or I'm
01:14:29 a kid and I want this down shouldn't
01:14:32 there be methods in place where it comes
01:14:36 down you can see what the image is yes
01:14:40 an ecosystem-wide standard would
01:14:43 improve and actually enhance the
01:14:45 experience for users at all our
01:14:47 platforms all right there there actually
01:14:49 is an organization I think a number of
01:14:50 the companies up here are a part of
01:14:52 called take it down it's um some
01:14:54 technology that we and a few others use
01:14:57 you all are in favor of that
01:14:58 because that is going to give some peace
01:15:00 of mind to people all right it really
01:15:02 really matters uh I don't have that much
01:15:05 time so we've talked about the
01:15:07 legislation and uh Senator
01:15:11 Whitehouse had asked you to get back with
01:15:13 your position on Section 230 which I'll
01:15:15 go to in a minute but I would welcome
01:15:18 each of you responding uh as to your
01:15:21 company's position on the bills that are
01:15:24 under consideration in this hearing all
01:15:27 right I'm just asking you to do that a
01:15:30 third the courts this big question of
01:15:34 230 and today uh I'm pretty inspired by
01:15:38 the presence of the parents who have
01:15:41 turned their extraordinary grief into
01:15:44 action and hope that other parents may
01:15:46 not have to suffer what for them is a
01:15:49 devastating for everyone a devastating
01:15:51 loss Senator Whitehouse asked you all
01:15:54 to get back very concretely about
01:15:56 section 230 and your position on that
01:16:01 it's an astonishing benefit that your industry
01:16:04 has that no other industry has they just
01:16:08 don't have to worry about being held
01:16:11 accountable in court if they're
01:16:13 negligent so you've got some explaining
01:16:16 to do and I'm just reinforcing Senator
01:16:21 Whitehouse's request that you get back
01:16:23 specifically about that and then finally
01:16:26 I want to ask about this notion this
01:16:28 idea of a federal agency
01:16:32 that is resourced and whose job is to be
01:16:37 dealing with public interest matters
01:16:39 that are really affected by big Tech
01:16:42 it's extraordinary what has happened in
01:16:44 our economy uh with technology and your
01:16:47 companies represent Innovation and
01:16:50 success uh but just as when the
01:16:53 railroads were ascendant and were in
01:16:55 charge and ripping off Farmers because
01:16:58 of practices they were able to get away
01:17:00 with just as when Wall Street was flying
01:17:02 high but there was no one regulating
01:17:04 Blue Sky laws uh we now have a whole new
01:17:07 world in the economy and Mr Zuckerberg I
01:17:09 remember uh you testifying in the Energy
01:17:12 and Commerce Committee and I asked you
01:17:13 your position on the uh concept of a
01:17:17 federal regulatory agency my
01:17:19 recollection is that you were positive
01:17:21 about that is that still your position
01:17:25 um I I think it it could be a a
01:17:28 reasonable solution there are obviously
01:17:30 pros and cons to doing that versus
01:17:31 through the normal the the the current
01:17:33 structure of having different Regulatory
01:17:35 Agencies focused on specific issues but
01:17:37 because a lot of the things trade off
01:17:39 against each other like one of the
01:17:40 topics that we talked about today is
01:17:41 encryption and that's obviously really
01:17:43 important for privacy and security but
01:17:46 can we just go down the line I'm at the
01:17:47 end but thank you Ms Yaccarino Senator I think
01:17:50 the uh industry initiative to keep those
01:17:53 conversations going would be something X
01:17:56 would be very very proactive about if
01:17:58 you think about our support of the
01:17:59 REPORT Act the SHIELD Act the STOP CSAM
01:18:02 act our support of the Project SAFE
01:18:04 childhood act I think our intentions are
01:18:06 clear to participate and to here yeah
01:18:10 Senator um we support National privacy
01:18:12 legislation for example so that sounds
01:18:14 like a good idea we just need to
01:18:15 understand what it means all right uh Mr
01:18:18 Spiegel Senator we'll continue to work
01:18:20 with your team and we'd certainly be
01:18:21 open to exploring the right regulatory
01:18:23 body for big technology but the idea of
01:18:25 a regulatory body is something that you
01:18:28 can see has merit yes Senator and Mr Citron
01:18:34 yeah we're very open to to working with
01:18:36 with you and our peers and anybody on
01:18:38 helping make the internet a safer place
01:18:40 you know I think you mentioned this is
01:18:42 not a one platform problem right so we
01:18:44 we do look to collaborate with other
01:18:45 companies and with nonprofits in the
01:18:47 government thank you Mr chairman I yield
01:18:50 back thank you Senator Welch well we're
01:18:53 going to conclude this hearing and thank
01:18:54 you all for coming today you probably
01:18:56 have your scorecard out there you've met
01:18:59 at least 20 members of this committee
01:19:00 and have your own impressions of their
01:19:02 questioning and approach and the like
01:19:04 but the one thing I want to make clear
01:19:06 as chairman of this committee for the
01:19:08 last three years is this was an
01:19:11 extraordinary vote on an extraordinary
01:19:13 issue a year ago we passed five bills
01:19:17 unanimously in this committee you heard
01:19:19 all the Senators every spot on the
01:19:21 political Spectrum was
01:19:23 covered every single Senator voted
01:19:26 unanimously in favor of the five pieces
01:19:28 of legislation we've discussed today it
01:19:31 ought to tell everyone who follows
01:19:32 Capitol Hill in Washington a pretty
01:19:35 Stark message we get it and we live it
01:19:39 as parents and grandparents we know what
01:19:42 our daughters and sons and others are
01:19:44 going through they cannot cope they
01:19:48 cannot handle this issue on their own
01:19:50 they're counting on us as much as they
01:19:52 are counting on the industry to do the
01:19:54 responsible thing and some will leave
01:19:57 with impressions of our Witnesses and
01:19:59 the companies they represent that's your
01:20:00 right as an American citizen but you
01:20:03 ought to also leave with the
01:20:04 determination to keep the spotlight on
01:20:06 us to do something not just to hold a
01:20:09 hearing bring out a good strong crowd of
01:20:13 supporters for change but to get
01:20:15 something done no excuses no excuses
01:20:19 we've got to bring this to a vote what I
01:20:21 found in my time in the House and
01:20:23 the Senate is that's the day that's the
01:20:25 moment of Reckoning speeches
01:20:27 notwithstanding press releases and the
01:20:29 like the moment of Reckoning is when we
01:20:31 call a vote on these measures it's time
01:20:33 to do that I don't believe there's ever
01:20:35 been a moment in America's wonderful
01:20:37 history when a business or industry has
01:20:39 stepped up and said regulate us put some
01:20:42 legal limits on us businesses Exist by
01:20:45 and large to be profitable and I think
01:20:47 that we got to get behind that and say
01:20:49 profitability at what cost Senator
01:20:52 Kennedy Republican colleague said is our
01:20:55 technology greater than our Humanity I I
01:20:58 think that is a fundamental question
01:21:00 that he asked what I would add to it is our
01:21:02 politics greater than
01:21:05 technology we're going to find out I
01:21:08 want to thank a few people before we
01:21:09 close up here I've got several staffers
01:21:12 who worked so hard on this Alexander
01:21:14 galber thank you very much Alexander
01:21:16 Jeff Hansen Scott
01:21:23 last point I'll make Mr Zuckerberg is is
01:21:25 just a little advice to you I think your
01:21:28 opening statement on Mental Health needs
01:21:30 to be explained because I don't think it
01:21:33 makes any sense there isn't a parent in
01:21:35 this room who's had a child that's gone
01:21:37 through an emotional experience like
01:21:39 this that wouldn't tell you and me they
01:21:42 changed right in front of my eyes they
01:21:44 changed they holed themselves up in their
01:21:46 room they no longer reached out to
01:21:48 their friends they lost all interest in
01:21:49 school these are mental health
01:21:51 consequences that I think come with the
01:21:53 abuse of this right to have access to
01:21:56 this kind of technology so uh I will
01:21:59 just I see my colleague you want to say
01:22:01 a word uh I think it was a good hearing
01:22:03 I hope something positive comes from it
01:22:05 thank you all for coming the hearing
01:22:07 record is going to remain open for a
01:22:08 week for statements and questions may be
01:22:10 submitted by Senators by 5:00
01:22:13 pm on Wednesday once again thanks to the
01:22:15 witnesses for coming the hearing stands adjourned
01:22:23 just keep a path here
01:22:26 folks good thank you
01:22:39 here just watch your step