Big Tech CEOs Testify About Protecting Children Online

Mar 13, 2024
They found me and contacted me when I was 17 years old. I had to write a victim impact statement after being extorted for four consecutive years. While I was strong enough to resist sending him more photos, there were dozens more who were not. We received a phone call and found out that my son was in his room and was suicidal. He was only 13 years old at the time. He and a friend had been exploited online and trafficked, and when my son reached out on Twitter, now X, the response was: thank you for reaching out, we reviewed the content and found no violation of our policies, so no action will be taken at this time. How many more children, like Matthew, like Olivia, like Riley, how many more children will suffer and die because of social media? Big tech failed to protect my child from sexual exploitation. Big tech failed to protect me from online sexual exploitation, and we need Congress to do something for our children and protect them. It's not too late, it's not too late to do something about it.
Online child sexual exploitation is a crisis in the United States. In 2013, the National Center for Missing and Exploited Children, known as NCMEC, received approximately 1,380 cyber tips per day. By 2023, just 10 years later, that number had increased to 100,000 reports per day: 100,000 daily reports of child sexual abuse material, also known as CSAM. In recent years we have also seen an explosion in so-called financial sextortion, in which a predator uses a fake social media account to trick a minor into sending explicit photos or videos, then threatens to release them unless the victim sends money. In 2021 alone, NCMEC received a total of 1,339 reports of sextortion; in 2023, through the end of October, that number skyrocketed to more than 22,000. More than a dozen children have died by suicide after becoming victims of this crime. This disturbing growth in child sexual exploitation is driven by one thing: changes in technology. In 1996, the world's best-selling cell phone was the Motorola StarTAC. While innovative at the time, the clamshell-style phone was not much different from a traditional phone; it let users make and receive calls and even receive text messages, but that was it. Fast forward to today.
Smartphones are in the pockets of seemingly every man, woman and teenager on the planet. Unlike the StarTAC, today's smartphones allow users not only to make and receive calls and text messages, but also to take photos and videos, support live streaming, and offer countless applications at the touch of a finger. That smartphone that can entertain and inform you can also become a dead end where the lives of children are damaged and destroyed. Apps have changed the way we live, work and play, but as research has detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children.
Their carefully designed algorithms can be a more powerful force in the lives of our children than even the most well-meaning parent. Discord has been used to kidnap and abuse children. Instagram helped connect and promote a network of pedophiles. Snapchat's disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a platform of choice for predators to access, engage and groom children for abuse, and the prevalence of CSAM on X has grown as the company has gutted its trust and safety workforce. Today we will hear from the CEOs of those companies.
It is not just that technology companies have contributed to this crisis; they are responsible for many of the dangers our children face online. Their design choices, their failure to adequately invest in trust and safety, and their constant pursuit of engagement and profit over basic safety have put our children and grandchildren at risk. Coincidentally, several of these companies implemented common-sense child safety improvements in recent days, the week before their CEOs had to justify their inaction before this committee. But the tech industry alone is not to blame for the situation we find ourselves in. Those of us in Congress need to look in the mirror. In 1996, the same year the Motorola StarTAC was flying off the shelves, and years before social media became widespread, we passed Section 230 of the Communications Decency Act.
This law immunized the then-nascent internet platforms from liability for user-generated content. Only one other industry in the United States has immunity from civil liability; we will leave that for another day. For the last 30 years, Section 230 has remained largely unchanged, allowing Big Tech to grow into the most profitable industry in the history of capitalism without fear of liability for unsafe practices. That has to change. Over the past year, this committee unanimously reported five bills that would finally hold technology companies accountable for child sexual exploitation on their platforms. Take a look at the composition and membership of the Senate Judiciary Committee and imagine, if you will, that there is something we can agree on unanimously; these five bills were the subject of that agreement. One of these bills is the STOP CSAM Act. Critically, it would allow victims to sue online providers who promote or aid and abet online child sexual exploitation, or who host or store CSAM. This stand against online child sexual exploitation is bipartisan and absolutely necessary.
Let this hearing be the call to action we need to get child online safety legislation to the president's desk. I now turn to the ranking member, Senator Graham. Thank you, Mr. Chairman. The Republicans will answer the call: all of us, every one of us, are ready to work with you and our Democratic colleagues on this committee to show the American people that, although Washington is certainly broken, there is a ray of hope, and it lies here when it comes to their children. After years of working on this issue with you and others, I have come to conclude the following: social media companies, as currently designed and operated, are dangerous products that are destroying lives and threatening democracy itself.
These companies must be reined in, or the worst is yet to come. Gavin Guffey's father is a Republican state representative from South Carolina, from the Rock Hill area. To all the victims who came and showed us pictures of their loved ones: don't give up, it's working, you're making a difference. Through you, we'll get to where we need to go so other people don't have to hold up a picture of their family member. The damage to your families has already been done; hopefully we can take your pain and turn it into something positive so no one else has to hold up a sign. Gavin connected online through Instagram and was tricked by a group in Nigeria that put up a young girl posing as his girlfriend, and, as things go at that stage of life, he gave her some photos, compromising sexual photos. It turned out she was part of an extortion group in Nigeria that threatened the young man: if you don't give us money, we will expose these photos. He gave them money, but it was not enough; they kept threatening, and he took his own life. They threatened Mr.
Guffey and another son as well. These people are bastards by any known definition. Mr. Zuckerberg, to you and the companies before us: I know it's not your intention, but you have blood on your hands. You have a product that is killing people. When cigarettes were killing people, we did something about it, maybe not enough. You're going to talk about guns; well, we have the ATF. There is nothing here. Nobody can do anything about it. You can't be sued. Now, Senators Blumenthal and Blackburn, who have been like the dynamic duo here, found emails from your company in which you were warned about these things, and you decided not to hire the 45 people who could do a better job keeping an eye on this. So the bottom line is that you can't be sued; you should be able to be, and these emails would be great for punitive damages, but the courtroom is closed to every American abused by every company in front of me. Of all the people in America we could give blanket liability protection to, this would be the last group I would choose. Now it's time to repeal Section 230. This committee is made up of the most ideologically diverse people you can find. We have come together, thanks to your leadership, Mr.
Chairman, to pass five bills to address the problem of child exploitation. I'll talk about them in depth in a moment, but the bottom line is that all of these bills have suffered the same fate: they go nowhere, they leave the committee and they die. Now, there is another approach. What do you do with dangerous products? You either allow lawsuits, or you have statutory protections to protect consumers, or you have a commission of some kind to regulate the industry in question and take away its license, if it has one. None of that exists here. We live in an America, in 2024, where there is no regulatory body dealing with the largest and most profitable companies in the history of the world. You can't be sued, and there isn't a single law on the books that is meaningful in protecting the American consumer. Other than that, we're in a good situation. So this is what I think is going to happen.
I think after this hearing today we're going to put a lot of pressure on our Democratic and Republican Senate leaders to let these bills come to the floor and be voted on, and I'm going to come to the floor in a couple of weeks to ask for unanimous consent on the STOP CSAM Act, the EARN IT Act, your bill, all of them, and you can be famous: come and object. I'm going to give you the chance to be famous. Now, Elizabeth Warren and Lindsey Graham have almost nothing in common. I promised her I would say this publicly: the only thing worse than me doing a bill with Elizabeth Warren is her doing a bill with me.
In some fashion we have found common ground, because Elizabeth and I see an abuse here that needs to be dealt with. Senator Durbin and I have different political philosophies, but I appreciate what he has done on this committee; he has been a great partner. To all of my Democratic colleagues, thank you very much, and to my Republican colleagues, thank you all. Save the applause for when we get a result. Right now it's all talk, but results will come one day if we keep pushing for the right answer for the American people. What is that answer?
Accountability. Now, these products have an upside; they have enriched our lives in many ways. Mr. Zuckerberg, you created a product I use. I think the idea when you first came up with it was to be able to talk to your friends and your family and share your life, so you could have a place where you could talk with your friends and family about the good things happening in life, and I use it, yes, we all use it. Everything here has its upside, but the dark side has not been addressed. Now is the time to address the dark side, because people have taken your idea and turned it into a nightmare for the American people.
They have turned it into a nightmare for the world at large. TikTok: we had a big discussion about how maybe Oracle and Larry Ellison could protect American data from Chinese communist influence, but TikTok's representative in Israel left the company because TikTok is being used in a way that is basically destroying the Jewish state. This is not just about individuals. I am worried that in 2024 our democracy will be attacked again through these platforms by foreign actors. We are exposed, and AI is just getting started. So to my colleagues: we are here for a reason. This committee has a track record of being tough, but also of doing the things that need to be done. This committee has risen to the occasion. There is more we can do, but to the members of this committee, let us insist that our colleagues rise to the occasion as well.
I am sure we have the votes in the 118th Congress to fix this problem. All you can do at the end of the day is cast your vote, but you can prod the system to make others cast theirs. Mr. Chairman, I will continue to work with you and everyone on this committee to have a day of reckoning on the floor of the United States Senate. Thank you. Thank you, Senator Graham. Today we welcome five witnesses, whom I will now introduce: Jason Citron, CEO of Discord Incorporated; Mark Zuckerberg, founder and CEO of Meta; Evan Spiegel, co-founder and CEO of Snap Incorporated; Shou Chew, CEO of TikTok; and Linda Yaccarino, CEO of X Corporation, formerly known as Twitter.
I will note for the record that Mr. Zuckerberg and Mr. Chew are appearing voluntarily. I am disappointed that our other witnesses did not offer the same degree of cooperation. Mr. Citron, Mr. Spiegel and Ms. Yaccarino are here pursuant to subpoenas, and Mr. Citron only accepted service of his subpoena after U.S. Marshals were sent to Discord's headquarters at taxpayer expense. I hope this is not a sign of your commitment, or lack of commitment, to addressing the serious issue before us. After the witnesses are sworn in, each witness will have five minutes to make an opening statement, and then senators will ask questions in an opening round of seven minutes each. I hope to take a short break at some point during questioning to allow the witnesses to stretch their legs.
If anyone needs a break at any time, please let my staff know. Before turning to the witnesses, I would also like to take a moment to acknowledge that this hearing has attracted a lot of attention. As we expected, we have a large audience, the largest I have seen in this room. I want to be clear that, as with other Judiciary Committee hearings, we ask people to behave appropriately. I know there is a lot of emotion in this room, for justifiable reasons, but I ask that you follow the traditions of the committee: no standing, shouting, chanting or applauding.
Interruptions of the witnesses will not be tolerated, and anyone who disrupts the hearing will be asked to leave. The witnesses are here today to address a serious topic, and we want to hear what they have to say. I thank you for your cooperation. Would all the witnesses please stand to be sworn in? Do you affirm that the testimony you are about to give before the committee will be the truth, the whole truth and nothing but the truth, so help you God? Let the record reflect that all of the witnesses have answered in the affirmative. Mr. Citron, please proceed with your opening statement. Good morning. My name is Jason Citron, and I am the co-founder and CEO of Discord.
We are an American company with around 800 employees living and working in 33 states. Today, Discord has grown to more than 150 million monthly active users. Discord is a communications platform where friends hang out and talk online about shared interests, from fantasy sports to writing music to video games. I have been playing video games since I was five years old, and as a kid that is how I had fun and found friendship. Many of my best memories are of playing video games with friends. We built Discord so that anyone can make friends playing video games, from Minecraft to Wordle and everything in between.
Games have always brought us together, and Discord makes that happen today. Discord is one of many services that have revolutionized how we communicate with each other at different moments in our lives: iMessage, Zoom, Gmail, and the list goes on. They enrich our lives, create communities, and accelerate commerce, healthcare and education. As with all technology, though, there are people who exploit and abuse our platforms for immoral and illegal purposes. All of us here on the panel today, and across the tech industry, have a solemn and urgent responsibility to make sure that everyone who uses our platforms is protected from these criminals, both online and off. Discord has a special responsibility to do that, because many of our users are young; more than 60% of our active users are between 13 and 24 years old. That is why safety is built into everything we do. It is essential to our mission and our business, and above all, it is deeply personal.
I am a father of two. I want Discord to be a product they use and love, and I want them to be safe on Discord. I want them to be proud of me for helping to bring this product into the world. That is why I am pleased to be here today to discuss the important topic of the online safety of minors. My written testimony provides a comprehensive overview of our safety programs. Here are a few examples of how we protect and empower young people. First, we have put our money into safety. The tech industry has a reputation for larger companies buying smaller ones to increase user numbers and improve financial results, but the largest acquisition we have made at Discord was a company called Sentropy. It did not help us expand our market share or improve our bottom line; in fact, because it uses artificial intelligence to help us identify, ban and report criminals and bad behavior, it has actually lowered our user count by getting rid of bad actors.
Second, you have heard about end-to-end encryption, which blocks anyone, including the platform itself, from seeing users' communications. It is a feature on dozens of platforms, but not on Discord. That is a choice we have made. We do not believe we can fulfill our safety obligations if teens' text chats are fully encrypted, because encryption blocks our ability to investigate a serious situation and, when appropriate, report it to law enforcement. We have a zero-tolerance policy on child sexual abuse material, or CSAM. We scan images uploaded to Discord to detect and block the sharing of this abhorrent material. We have also built an innovative tool, Teen Safety Assist, that blocks explicit images and helps teens easily report unwanted conversations.
We have also developed a new semantic hashing technology for detecting novel forms of CSAM, called CLIP, and we are sharing this technology with other platforms through the Tech Coalition. Finally, we recognize that improving online safety requires all of us to work together, which is why we partner with nonprofits, law enforcement and our tech industry peers to stay at the forefront of protecting youth online. We want to be the platform that empowers our users to have better online experiences, to build true connections and genuine friendships, and to have fun. Senators, I sincerely hope that today is the beginning of a continued dialogue that results in real improvements to online safety.
I look forward to your questions and to helping the committee learn more about Discord. Thank you, Mr. Citron. Mr. Zuckerberg. Chairman Durbin, Ranking Member Graham, and members of the committee: every day, teens and young adults do amazing things on our services. They use our apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. Overall, teens tell us this is a positive part of their lives, but some face challenges online, so we work hard to provide parents and teens with support and controls to reduce potential harms. Being a parent is one of the hardest jobs in the world. Technology gives us new ways to communicate with our kids and feel connected to their lives, but it can also make parenting more complicated, and it is important to me that our services are positive for everyone who uses them.
We stand with parents everywhere who work hard to raise their kids. Over the past eight years we have built more than 30 different tools, resources and features so that parents can set time limits for their teens' use of our apps, see who they are following, or report someone for bullying their teen. For teens, we have added nudges that remind them when they have been using Instagram for a while, or if it is getting late and they should go to sleep, as well as ways to hide words or people without those people finding out. We put special restrictions on teen accounts on Instagram by default. Accounts for those under 16 are set to private, have the most restrictive content settings, and cannot receive messages from adults they do not follow or from people they are not connected to. With so much of our lives spent on mobile devices and social media, it is important to look at the effects on teen mental health and well-being.
I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. A recent National Academies of Sciences report evaluated more than 300 studies and found that the research, quote, did not support the conclusion that social media causes changes in adolescent mental health at the population level, end quote. It also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore and connect with others. Still, we are going to keep monitoring the research and use it to inform our roadmap.
Keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics, we have to evolve our defenses too. We work closely with law enforcement to find bad actors and help bring them to justice, but the difficult reality is that no matter how much we invest or how effective our tools are, there is always more to learn and more improvements to make. We remain ready to work with members of this committee, industry and parents to make the internet safer for everyone. I am proud of the work our teams do to improve child safety online, on our services and across the internet. We have around 40,000 people overall working on safety and security, and we have invested more than $20 billion in this since 2016, including around $5 billion in the last year alone. We have many dedicated teams focused on child safety and teen well-being, and we lead the industry in many of the areas we are discussing today. We develop technology to tackle the worst online risks and share it to help the whole industry get better, like Project Lantern, which helps companies share data about people who break child safety rules, and we are founding members of Take It Down,
a platform that helps young people prevent their nude images from being spread online. We also go beyond legal requirements and use sophisticated technology to proactively discover abusive material, and as a result we find and report more inappropriate content than anyone else in the industry. As the National Center for Missing and Exploited Children put it this week, Meta goes, quote, above and beyond to make sure that there are no portions of their network where this type of activity occurs, end quote. I hope we can have a substantive discussion today that drives improvements across the industry, including legislation that delivers what parents say they want: a clear system for age verification and control over the apps their children use.
Three out of four parents want app store age verification, and four out of five want parental approval whenever teens download apps. We support this. Parents should have the final say on which apps are appropriate for their children, and they should not have to upload their ID every time; that is what app stores are for. We also support setting industry standards on age-appropriate content and limiting the signals used for advertising to teens to age and location, not behavior. At the end of the day, we want everyone who uses our services to have safe and positive experiences. Before I wrap up,
I want to recognize the families who are here today who have lost a loved one or lived through terrible things that no family should have to endure. These issues are important for every parent and every platform. I am committed to continuing to work in these areas, and I hope we can make progress today. Thank you. Chairman Durbin, Ranking Member Graham, and members of the committee: thank you for convening this hearing and for pushing for important legislation to protect children online. I am Evan Spiegel, co-founder and CEO of Snap. We created Snapchat, an online service used by more than 800 million people all over the world to communicate with their friends and family.
I know many of you have been working to protect children online since before Snapchat was created, and we are grateful for your long-standing dedication to this cause and your willingness to work together to help keep our community safe. I want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a loved one. Words cannot begin to express the profound sorrow I feel that a service we designed to bring people happiness and joy has been abused to cause harm. I want to be clear that we understand our responsibility to keep our community safe.
I also want to recognize the many families who have worked to raise awareness of these issues, pushed for change, and collaborated with legislators on important bills like the Cooper Davis Act, which can help save lives. I started building Snapchat with my co-founder Bobby Murphy when I was 20 years old. We designed Snapchat to solve some of the problems we experienced online as teenagers. We did not have an alternative to social media, where images shared online were permanently public and subject to popularity metrics; that did not seem right to us. We built Snapchat differently, because we wanted a new way to communicate with our friends that was fast, fun and private.
A picture is worth a thousand words, so people communicate with images and videos on Snapchat. We do not have public likes or comments when you share your story with friends. Snapchat is private by default, meaning people must choose to add friends and choose who can contact them. When we created Snapchat, we chose to have the images and videos sent through our service delete by default. Like prior generations that enjoyed the privacy afforded by phone calls, which are not recorded, our generation has benefited from the ability to share moments through Snapchat that may not be picture-perfect but instead convey emotion without permanence.
Even though Snapchat messages are deleted by default, we let everyone know that recipients can save images and videos. When we take action on illegal or potentially harmful content, we also retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable. To help prevent the spread of harmful content on Snapchat, we approve the content that is recommended on our service using a combination of automated processes and human review. We apply our content rules consistently and fairly across all accounts, and we run samples of our enforcement actions through quality assurance to verify that we are getting it right. We also proactively scan for known child sexual abuse material, drug-related content and other types of harmful content; we remove that content, deactivate and device-block offending accounts, preserve the evidence for law enforcement, and report certain content to the relevant authorities for further action. Last year we made 690,000 reports to the National Center for Missing and Exploited Children, which led to more than 1,000 arrests.
We also removed 2.2 million pieces of drug-related content and blocked 75,000 associated accounts. Even with our strict privacy settings, content moderation efforts, proactive detection, and collaboration with law enforcement, bad things can still happen when people use online services. That is why we believe people under the age of 13 are not ready to communicate on Snapchat. We strongly encourage parents to use the device-level parental controls on iPhone and Android; we use them in our own household, and my wife approves every app our 13-year-old son downloads. For parents who want more visibility and control, we built Family Center in Snapchat, where you can see who your teen is talking to, review privacy settings, and set content limits.
We have worked for years with members of this committee on legislation like the Kids Online Safety Act and the Cooper Davis Act, which we are proud to support. I want to encourage broader industry support for legislation protecting children online. No legislation is perfect, but some rules of the road are better than none. Much of the work we do to protect the people who use our service would not be possible without the support of our partners across industry, government, nonprofits, and in particular law enforcement and first responders, who have committed their lives to helping keep people safe.
I am deeply grateful for the extraordinary efforts across our country and around the world to prevent criminals from using online services to commit their crimes. I feel an overwhelming sense of gratitude for the opportunities this country has given me and my family, I feel a deep obligation to give back and to make a positive difference, and I am grateful to be here today as part of this vitally important democratic process. Members of the committee, I promise you that we will be part of the solution for online safety, that we will be honest about our shortcomings, and that we will work continuously to improve. Thank you, and I look forward to answering your questions. Thank you, Mr.
Spiegel. Mr. Chew. Chairman Durbin, Ranking Member Graham, and members of the committee: I appreciate the opportunity to appear before you today. My name is Shou Chew, and I am the CEO of TikTok, an online community of more than a billion people around the world, including more than 170 million Americans who use our app every month to create, share and discover. Although the average age of TikTok users in the US is over 30, we recognize that special safeguards are required to protect minors, especially when it comes to combating all forms of CSAM. As a father of three young children myself, I know that the issues we are discussing today are horrific and every parent's nightmare.
I am proud of our efforts to address threats to young people online, from our commitment to protecting them through industry-leading policies, to our use of innovative technology, to our significant and ongoing investments in trust and safety. To achieve this goal, TikTok is vigilant about enforcing its 13-and-over age policy and offers an experience for teens that is much more restrictive than the one you and I have as adults. We make careful product design choices to help make our app inhospitable to those seeking to harm teens. Let me give you a few examples of long-standing policies that are unique to TikTok; we did not adopt them last week.
First, direct messaging is not available to any user under 16. Second, accounts for people under 16 are automatically set to private, along with their content; in addition, their content cannot be downloaded and will not be recommended to people they do not know. Third, every teen under 18 has a screen time limit automatically set to 60 minutes. And fourth, only people 18 and over can use our livestreaming feature. I am proud to say that TikTok was among the first to let parents supervise their teens on our app with our Family Pairing tools. This includes setting screen time limits, filtering content out of teen feeds, and more.
We make these decisions after consulting with doctors and safety experts who understand the unique stages of adolescent development, to make sure we have appropriate safeguards to prevent harm and minimize risk. Safety is one of the core priorities that defines TikTok under my leadership. We currently have more than 40,000 trust and safety professionals working to protect our community globally, and we expect to invest more than $2 billion in trust and safety efforts this year alone, with a significant portion of that in our US operations. Our robust community guidelines strictly prohibit content or behavior that puts teenagers at risk of exploitation or other harm, and we vigorously enforce them.
Our technology moderates all content uploaded to our app to help quickly identify potential CSAM and other material that violates our rules; it automatically removes the content or escalates it to our safety professionals for further review. We also moderate direct messages for CSAM and related material, and we use third-party tools like PhotoDNA and Take It Down to combat CSAM and prevent that content from being uploaded to our platform. We continually meet with parents, teachers and teenagers; in fact, I sat down with a group just a few days ago. We use their insights to strengthen the protections on our platform, and we also work with leading groups like the Tech Coalition.
The steps we are taking to protect teens are a critical part of our larger trust and safety work, as we continue our voluntary and unprecedented efforts to build a safe and secure data environment for US users, to keep our platform free from outside manipulation, and to implement safeguards in our content moderation and recommendation tools. Keeping teens safe online requires a collaborative effort and collective action. We share the committee's concern and commitment to protecting young people online, and we welcome the opportunity to work with you on legislation to achieve that goal. Our commitment is ongoing and unwavering, because there is no finish line when it comes to protecting teens.
Thank you for your time and consideration today. I am happy to answer your questions. Thank you, Mr. Chew. Ms. Yaccarino. Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss... Thank you very much; please, let me adjust my chair. Apologies, I will start again. Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X's work to protect the safety of minors online. Today's hearing is titled a crisis that demands immediate action. As a mother, this is personal, and I share the sense of urgency. X is an entirely new company, an indispensable platform for the world and for democracy. You have my personal commitment that X will harness the power of the platform to protect people. Before I joined,
I was struck by the leadership steps this new company was taking to protect children. X is not the platform of choice for children and teens. We do not have a line of business dedicated to children. Children under 13 cannot open an account, and less than 1% of X's US users are minors. Our policy is clear: X has zero tolerance for child sexual exploitation, and we have strengthened our enforcement with more tools and technology to prevent bad actors from distributing, searching for or engaging with CSE content. If CSE content is posted on X, we remove it and suspend the accounts involved; in 2023, the number of accounts suspended represented an increase over the 2.3 million accounts removed by Twitter in 2022. In 2023, 850,000 reports were sent to NCMEC, including our first fully automated report; that is eight times what Twitter reported in 2022. We have changed our priorities and restructured our trust and safety teams to remain strong and agile. We are building a trust and safety center of excellence in Austin, Texas, to bring more agents in-house and accelerate our impact. We have applied to the Tech Coalition's Project Lantern to achieve greater progress and impact across the industry.
We have also opened up our algorithms for greater transparency. We want America to lead this solution. X commends the Senate for passing the REPORT Act, and we support the SHIELD Act; it is time for a federal standard to criminalize the sharing of non-consensual intimate material. We need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and are not stepping up. X supports this effort and is committed to it, while ensuring the protection of free speech. There are two additional areas that need everyone's attention. First, as the daughter of a police officer,
law enforcement must have the critical resources to bring these criminals to justice. Second, with artificial intelligence, criminals' tactics will only become more sophisticated and continue to evolve; industry collaboration here is imperative. X believes that freedom of speech and platform safety can and must coexist. We agree that now is the time to act with urgency. Thank you; I look forward to answering your questions. Thank you very much, Ms. Yaccarino. Now we will move to rounds of questions, seven minutes each for the members. I would also like to note from your testimony, Ms. Yaccarino, that I believe you are the first social media company to publicly support the STOP CSAM Act.
That is progress, my friends; thank you for doing that. I am still going to ask some probing questions, but let me get to the bottom line. I am going to focus on my STOP CSAM legislation. What it says is that there is civil liability if you intentionally or knowingly host or store child sexual abuse material or make child sexual abuse material available; or, second, if you intentionally or knowingly promote or aid and abet a violation of child sexual exploitation laws. Is there anyone here who believes you should not be held civilly liable for that type of conduct?
Mr. Citron? Good morning, Chairman. You know, we strongly believe that this content is disgusting, and there are many things about the STOP CSAM bill that I think are very encouraging. We are very supportive of adding more resources to the CyberTipline and modernizing it, along with giving more resources to NCMEC, and we are very open to having conversations with you and your team to talk a little more about the details of the bill. I am sure you would like to do that, because if you intentionally or knowingly host or store CSAM, I think you should at least be held civilly liable.
I cannot imagine anyone disagreeing with that. Yes, it is disgusting content, and that is certainly why we need you to support this legislation. Mr. Spiegel, I want to tell you that I listened carefully to your testimony here, and it has never been a secret that Snapchat was used to send sexually explicit images. In 2013, early in your company's history, you admitted this in an interview. Do you remember that interview? Senator, I do not remember the specific interview. You said that when you were first trying to get people onto the app, you would, quote, go up to people and say, hey, you should try this.
On this app you can send photos that disappear, and they would say, oh, for sexting. Do you remember that interview? Senator, when we first created the app, it was actually called Picaboo, and the idea was that the pictures would disappear. The feedback we got from people using the app was that they were actually using it to communicate, so we changed the name of the app to Snapchat, and we found that people were using it to talk visually. As early as 2017, law enforcement identified Snapchat as the sexual exploitation tool of choice for pedophiles. The case of a 12-year-old girl, identified in court only as L.W., shows the danger: for two and a half years, a predator sexually harassed her, sending her sexually explicit images and videos through Snapchat.
The man admitted that he only used Snapchat with L.W., and not other platforms, because he knew the chats would disappear. Did you and everyone else at Snap really not realize that the platform was a perfect tool for sexual predators? Senator, that behavior is disgusting and reprehensible. We provide in-app reporting tools so that people who are being harassed, or who have had inappropriate sexual content shared with them, can report it. In the case of harassment or sexual content, we typically respond to those reports within 15 minutes so that we can provide help. When L.W., the victim, sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act. Do you have any doubt that, had Snap faced the prospect of civil liability for facilitating sexual exploitation, the company would have implemented even better safeguards?
Senator, we already work extensively to proactively detect this type of behavior. We make it very difficult for predators to find teens on Snapchat: there are no public friend lists or public profile photos, and when we recommend friends for teens, we make sure they have several friends in common before making that recommendation. We believe those safeguards are important to preventing predators from misusing our platform. Mr. Citron, according to Discord's website, it takes a proactive and automated approach to safety only on servers with more than 200 members; smaller servers rely on server owners and community moderators to define and enforce behavior.
So how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report one another for things like grooming, trading CSAM, or sextortion? Chairman, our goal is to remove all of that content from our platform, and ideally to prevent it from showing up in the first place, or to remove the people who engage in these kinds of horrific activities. We deploy a wide range of techniques that work across every surface on Discord. I mentioned that we recently launched something called Teen Safety Assist, which works everywhere and is on by default for teen users. It acts kind of like a buddy that lets them know if they are in a situation or talking to someone that may be inappropriate, so they can report it to us and block that user.
So, Mr. Citron, if that were working, we would not be here today. Mr. Chairman, this is an ongoing challenge for all of us; that is why we are here today. But 15% of our company is focused on trust and safety, which is one of our top priorities; that is more people than we have working on marketing and promoting the company. So we take these issues very seriously, but we know it is an ongoing challenge, and I look forward to working with you and collaborating with our tech peers and nonprofit organizations to improve our approach.
I certainly hope so. Mr. Chew, your platform is among the most popular with children. Can you explain to us what, in particular, you are doing, and whether you have seen any evidence of CSAM on your platform? Yes, Senator. We have a strong commitment to investing in trust and safety. As I said in my opening statement, I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals working on this issue. We have built a specialized child safety team to help us identify specialized, horrific issues like the material you mentioned. If we identify any of it on our platform, or proactively detect it, we remove it and report it to NCMEC and other authorities.
Why does TikTok allow children to be exploited into performing commercialized sex acts? Senator, respectfully, I disagree with that characterization. Our livestreaming product is not for anyone under 18. We have taken action to identify anyone who violates that and to remove them from using that service. At this point I am going to turn to my ranking member, Senator Graham. Thank you. Mr. Citron, you said we need to start a discussion. To be honest with you, we have been having this discussion for a very long time. We need to get a result, not a discussion. Do you agree with that?
Ranking Member, yes; this is a topic we have been very focused on since we started our company in 2015. But this is the first time... Are you familiar with the EARN IT Act, written by me and Senator Blumenthal? A little bit, yes. Okay, do you support it? Um, we... Like, yes or no? Uh, we are not prepared to support it today, but we believe that... Do you support the STOP CSAM Act? Um, we are not prepared to support that. Do you support the SHIELD Act? We, uh, believe that the CyberTipline... Do you support it, yes or no? Uh, we believe that the CyberTipline and NCMEC... I will take that as a no. The Project Safe Childhood Act, do you support it? Um, we believe that... I will take that as a no. The REPORT Act, do you support it?
Ranking Member Graham, we look forward to having discussions with you and your team to pass a bill that will solve this problem. Do you support removing Section 230 liability protections for social media companies? I believe that Section 230 needs to be updated; it is a very old law. Do you support repealing it so that people can sue if they believe they were harmed? I believe that Section 230, as written, while it has many downsides, has enabled innovation on the internet, which I think has largely been... Thank you very much. So here you go: if you are waiting on these guys to solve the problem, we are going to die waiting.
Mr. Zuckerberg, I am trying to be respectful here. The son of Mr. Guffey, the representative from South Carolina, was caught up in a sextortion ring in Nigeria operating through Instagram. They extorted him; he paid money, it was not enough, and he killed himself using Instagram. What would you like to say to him? It is terrible. I mean, no one should have to go through something like that. Do you think he should be allowed to sue you? I think they can sue us. I think he should be able to, and he cannot. So the bottom line here, folks, is that this committee is done talking. We passed bills unanimously that, in their different ways... and look who did this: Graham, Blumenthal, Durbin, Hawley, Klobuchar, Cornyn, Blackburn, and Ossoff. I mean, we have found common ground here that is just astonishing, and we have had hearing after hearing, Mr.
Chairman. And the bottom line is, I have come to conclude, gentlemen, that you are not going to support any of this. Linda, how do you say your last name? Yaccarino. Do you support the EARN IT Act? We strongly support collaboration to raise industry practices... Do you support the EARN IT Act? In plain English, do you support it, yes or no? We do not need to talk twice... We look forward to supporting continued conversations. Okay. The EARN IT Act is important: you can actually lose your liability protections when children are exploited and you did not follow best business practices.
Look, the EARN IT Act means you have to earn liability protection; right now it is given to you no matter what you do. To the members of this committee: the time has come to make sure the people holding the signs can sue on behalf of their loved ones. Nothing will change until the courtroom door is opened to victims of social media. Two billion dollars, Mr. Chew: what percentage is that of what you made last year? Senator, it is a significant and increasing investment... As a private company, we do not share... I mean, is 2% the percentage of your revenue?
Senator, we are not prepared to share our financials publicly. Well, I just think $2 billion sounds like a lot unless you make $100 billion. So the point is, when you tell us you are going to spend $2 billion, great, but how much do you make? You know it is all about eyeballs. Well, our goal is to get your attention here, and it is not just about kids; I am talking about the damage being done generally. Do you realize, Mr. Chew, that your TikTok representative in Israel resigned yesterday? Yes, I am aware. Okay, and he said: I quit TikTok. We are living in a time when our existence as Jews in Israel is under attack and in danger.
Multiple screenshots taken from TikTok's internal employee chat platform, known as Lark, show how TikTok trust and safety officials celebrated the barbaric acts of Hamas and other Iranian-backed terrorist groups, including the Houthis in Yemen. Senator, I need to make it very clear that pro-Hamas content and hate speech are not allowed on our platform. Why did he resign? Why did he resign? Senator, we also do not allow any... Do you know why he resigned? We do not allow this; we will investigate... My question is: why did he resign? I am sure he had a good job. He quit a good job because he believes your platform is being used to help people who want to destroy the Jewish state, and I am not saying you want that. Mr.
Zuckerberg, I am not saying that you as an individual want any of these harms. I am saying that the products you have created, with all their upsides, have a dark side. Mr. Citron, I am tired of talking, I am tired of having discussions. We all know the answer here, and here is the ultimate answer: stand behind your products, go into an American courtroom and defend your practices, open up the courthouse door. Until you do, nothing will change. Until these people can sue you for the damage you are causing, it is all talk. I am a Republican who believes in free enterprise, but I also believe that every American who has been wronged has to have somewhere to go to complain.
There is no commission to go to that can punish you. There is not a single law on the books, because you oppose everything we try to do, and you cannot be sued. That has to stop, folks. How do you expect the people in the audience to believe that we are going to help their families if we do not have some system, or combination of systems, to hold these people accountable? Because for all the upside, the dark side is too great to live with. We do not need to live this way as Americans. Thank you, Senator Graham. Senator Klobuchar is next; she has been quite a leader on this issue for quite some time, on the SHIELD bill and, with Senator Cornyn, on revenge porn legislation. Senator Klobuchar. Thank you very much, Chairman Durbin, and thank you, Ranking Member Graham, for those words; I could not agree more. For too long we have watched social media companies turn a blind eye
as kids have joined these platforms in record numbers. The companies have used algorithms that push harmful content because that content gets popular. They provided a venue, maybe unknowingly at first, for traffickers to sell deadly drugs like fentanyl; the head of our own DEA has said the platforms have basically been captured by the cartels in Mexico and China. So, first of all, I strongly support the STOP CSAM bill. I agree with Senator Graham that nothing is going to change unless we open up the courtroom doors. I think the time for all of this immunity is done, because money talks even louder than we talk up here. Two of the five bills, as noted, are my bills with Senator Cornyn; one actually passed the Senate but is awaiting action in the House, and the other is the SHIELD Act, and I appreciate that X supports that bill. This is about revenge porn. The FBI director testified before this committee that in the last year alone there have been more than 20 suicides of kids attributed to online revenge porn. For those parents and those families, this is about their own child, but it is also about making sure this does not happen to other children.
I know this because I have talked to these parents, parents like Bridgette Norring from Hastings, Minnesota, who is here today. Bridgette lost her teenage son after he took a fentanyl-laced pill he bought on the internet. Amy Neville is also here; her son Alexander was only 14 years old when he died after taking a pill he did not know was actually fentanyl. We are starting a law enforcement campaign, One Pill Kills, in Minnesota, going into the schools with sheriffs and police. But the way to stop this is, yes, at the border and at the points of entry, but we know that 30% of the people who get fentanyl are getting it off the platforms. Meanwhile, social media platforms generated $11 billion in revenue in 2022 from advertising directed at children and teens, including nearly $2 billion in ad revenue derived from users age 12 and under. When a Boeing plane lost a door plug in mid-flight several weeks ago, nobody questioned the decision to ground a fleet of more than 700 planes. So why are we not taking the same kind of decisive action on the danger of these platforms when we know these kids are dying? We have bills that have passed through this committee, a committee that is incredibly diverse when it comes to our political views, and they should go to the floor. We should finally do something about liability, and then we should move on to some of the other issues many of us have worked on, when it comes to app store fees, monopoly behavior and self-preferencing. But I am going to stick with this today. A fact from just today: a third of the fentanyl cases investigated over a five-month period had direct links to social media. That is from the DEA.
The facts: between 2012 and 2022, cyber tips of online child sexual exploitation increased from 415,000 to more than 32 million, and, as I noted, at least 20 victims have died by suicide in sextortion cases. So I am going to start with that, with you, Mr. Citron. My bill with Senator Cornyn, the SHIELD Act, includes a threat provision that would help provide protection and accountability for those who are threatened by these predators. Little kids get a photo and send it; they think they have a new girlfriend or a new boyfriend; it ruins their life, or they think it will be ruined, and they die by suicide. So can you tell me why you do not support the SHIELD Act?
Senator, we believe it is very important that teens have a safe experience on our platform. I think the portion of the bill that strengthens law enforcement's ability to investigate crimes against children and hold bad actors accountable is something we are very open to and supportive of. We would very much like to have conversations with you; we are open to discussing it further, and we welcome legislation and regulation. This is a very important issue for our country, and we have been prioritizing teen safety. From what I just said, I am much more interested in whether you support it, because there has been a lot of talk at these hearings, and popcorn throwing and the like, and I just want to get this done.
I am so tired of this. It has been 28 years since the internet... We have not passed any of these bills because everyone is engaged in double talk. It is time to pass them, and the reason they have not passed is the power of your companies. So let us be very clear about this: whatever you say matters; your words matter. Mr. Chew, I am a co-sponsor of Chairman Durbin's STOP CSAM Act of 2023, along with Senator Hawley, the Republican lead. Among other things, it empowers victims by making it easier for them to ask technology companies to remove the material and related images from their platforms. Why would you not support this bill?
Senator, we largely support it. I think its spirit is very much aligned with what we want to do. There are questions about implementation that companies like ours and some other groups have, and we look forward to working those out, and of course, if this legislation becomes law, we will comply with it. Mr. Spiegel, I know we talked ahead of time. I appreciate your company's support for the Cooper Davis Act, a bill I have with Senators Shaheen and Marshall, which will allow law enforcement to do more when it comes to fentanyl. I think you know what a problem this is. Devin, the teenager from Hastings
whose mother is here, as I mentioned, suffered from dental pain and migraines, so he bought what he thought was a Percocet on Snapchat, but instead he bought a counterfeit pill laced with a lethal dose of fentanyl. His mother, who is here with us today, said that all the hopes and dreams we as parents had for Devin were erased in the blink of an eye, and no mom should have to bury her kid. Talk about why you support the Cooper Davis Act. Senator, thank you. We strongly support the Cooper Davis Act, and we believe it will help go after the cartels and get more dealers off the streets to save more lives.
Okay. Does anyone else support that bill, on the panel? No? Okay. Lastly, Mr. Zuckerberg: in 2021, The Wall Street Journal reported on internal Meta research documents that asked, quote, why do we care about tweens? Those internal documents answered their own question, citing internal Meta emails: they are a valuable but untapped audience. At a Commerce hearing, and I am also on that committee, I asked Meta's head of global safety why children aged 10 to 12 are so valuable to Meta. She responded that we do not knowingly attempt to recruit people who are not old enough to use our apps. When the 42 state attorneys general, Democrats and Republicans, brought their case, they said that statement was inaccurate
Here are some examples. In 2021 there was an email, Ms. Davis, from Instagram's director of research saying that Instagram is investing in experiences that target youth roughly 10 to 12 years old. In a February 2021 instant message, one of your employees wrote that Meta is working to recruit Gen Alpha before they reach their teens. A 2018 email that circulated within Meta says that children under 13 will be instrumental in increasing the acquisition rate when users turn 13. Square that with what I heard in that testimony at the Commerce hearing, that you were not targeting them. And, just asking again what the other witnesses were asked: why doesn't your company support the Stop CSAM Act or the SHIELD Act?
Sure, Senator, I'm happy to talk about both of those. We had internal discussions about whether we should create a kids' version of Instagram, like the kids' versions of YouTube and other services. We haven't actually moved forward with that, and we currently have no plans to, so I can't speak directly to the exact emails you cited, but it sounds to me like they were deliberations about a project that people internally thought was important and that we didn't end up moving forward with. Okay, but what about the bills? What are you going to say about the two bills?
Sure. Generally, my position on the bills is that I agree with the goal of all of them, and there are most things in them that I agree with; there are specific things I would probably do differently. We also have our own legislative proposal for what we think would be most effective in helping keep the internet safe across the various companies and giving parents control over the experience, so I'm happy to go into detail on any of those. But ultimately, I mean, I think, well, I think these parents will tell you that this hasn't worked, just giving parents control. They don't know what to do. It's very, very difficult, and that's why we're coming up with other solutions that we think are much more helpful for law enforcement, but also this idea of finally getting something done on accountability.
I just think that with all the resources you have, you could really do more than what you're doing or these parents wouldn't be after you right now. in this Senate hearing room, thank you, Senator, can I talk about that? Do you want G to come back later? Please go ahead. I don't think parents should have to upload identification or prove that they are parents of a child on every app their children use. I think the right place to do this and a place where it will actually be very easy for it to work is within the app stores, which, as I understand it, are already Apple and Google or at least Apple. already requires parental consent when a child makes a payment with an app, so it should be pretty trivial to pass a law that requires parents to have control every time a child downloads an app and offers consent to that application. and the research that we've done shows that the vast majority of parents want that and I think that's the kind of legislation, plus some of the other ideas that you all have that would make this much easier for parents.
Just to be clear, I remember one mother telling me that with all these things she could maybe do, she can't keep up; it's like a faucet overflowing a sink, and she's there with a mop while her kids are getting hooked on more and more different apps and being exposed to material. We need to make this simpler for parents so they can protect their children, and I don't think that's the way to do it. I think the answer is what Senator Graham has been talking about, which is opening up the courtroom doors. That is what will protect these parents and these children, along with passing some of these laws that make things easier for law enforcement. Thank you, Senator Klobuchar. We are going to try to stick to the seven-minute rule. It didn't work very well, but we are going to try.
I will also try to give the other side additional time. Senator Cornyn. There is no doubt that your platforms are very popular, but we know that while here in the United States we have an open society and the free exchange of information, there are authoritarian governments and there are criminals who will use your platforms for the sale of drugs, for sextortion, and the like. Mr. Chew, I think your company is unique among those represented here today because of its ownership by ByteDance, a Chinese company, and I know there have been some steps you have taken to wall off the data collected here in the United States, but the fact of the matter is that under Chinese law, the Chinese national intelligence laws, all information accumulated by a company in the People's Republic of China must be shared with the Chinese intelligence services. ByteDance's initial launch of TikTok, as I understand it, was in 2016.
The efforts it made with Oracle under the so-called Project Texas to wall off US data came in 2021, and supposedly the data was completely walled off in March of 2023. What happened to all the data that TikTok collected before that? Senator, thank you. On the data from American users: TikTok is owned by ByteDance, which is majority owned by global investors, and we have three Americans on the board of five. You are right to point out that over the last three years we have spent billions of dollars building Project Texas, which is an unprecedented project in our industry, a firewall that protects US data from the rest of our staff. And I'm asking about all the data that you collected before that. Yes, Senator, we have started a data deletion plan.
I talked about this a year ago. We have completed the first phase of deleting data from our data centers outside the Oracle cloud infrastructure, and we are starting phase two, where we will not only delete it from the data centers but also hire a third party to verify that work, and then go into, for example, the laptops employees work on, to delete that data as well. Was all the data collected by TikTok before Project Texas shared with the Chinese government, in accordance with that country's national intelligence laws? Senator, the Chinese government has not asked us for any data, and we have never provided it.
Your company is once again unique among those represented here today because it is currently being reviewed by the Committee on Foreign Investment in the United States. Is that correct? Senator, yes, there are ongoing discussions, and much of our Project Texas work is based on years of discussions with many agencies under the CFIUS umbrella. Well, CFIUS is specifically designed to review foreign investments in the United States for national security risks, right? Yes, it is. And your company is currently being reviewed by this Treasury Department interagency committee for potential national security risks? Senator, this review is about the acquisition of Musical.ly, an acquisition that was made many years ago. I mean, is this a casual conversation, or are you actually providing information to the Treasury Department about how your platform works so it can assess the potential risk to national security?
Senator. It's been a lot of years in two administrations and a lot of discussions about what our plans are like, how our systems work, we have a lot of solid discussions about a lot of details. 63% of teenagers, from what I understand, use Tik Tock. true senator, I can't verify that we know that we are popular among many age groups, the average age in the US today for our user base is over 30, but we are aware that we are popular and you He resides in Singapore with his family, right, yeah. I have uh, I live in Singapore and I work here in the United States as well and your children have access to Tik Tock in Singapore Senator uh, if they lived in the United States, I would give them access to our under 13 experience, my children are below the age of 13 my question is in Singapore, do they have access to Tik Tok or is it restricted by national law?
We don't have an under-13 experience in Singapore. We do in the US because we are considered a mixed-audience app, and we created the under-13 experience in response to... A Wall Street Journal article published yesterday directly contradicts what your company has publicly stated. According to the Journal, Project Texas employees say that user data, including user emails, dates of birth, and IP addresses, continues to be shared with ByteDance staff, again, a company owned in China. Do you dispute that? Yes, Senator, there are many things about that article that are inaccurate. What it does get right is that this is a voluntary project that we built; we spent billions of dollars on it.
There are thousands of employees involved and it is very difficult because it is unprecedented. Why is it important that data collected from us users be stored in the United States? Senator, this was a project that we built in response to some of the concerns that were raised. by members of this committee and others and that was due to concerns that the Chinese Communist Party could access data stored in China, in accordance with National Intelligence laws, Senator correct, we are not the only company doing business, uh uh. You know you have Chinese employees, for example, we're not even the only company in this room that hires Chinese nationals, but to address some of these concerns we've moved the data to Oracle Cloud infrastructure and created a 2,000-person team to Supervise.
We separate the management of the US-based data from the rest of the organization and then open it up to third parties like Oracle, and we will bring in others, to give us third-party validation. This is an unprecedented effort; I think we are unique in taking even more steps to protect user data in the United States. Well, you have questioned the Wall Street Journal story published yesterday. Are you going to do some kind of investigation to see if there is any truth to the allegations made in the article, or are you simply going to dismiss them out of hand? Oh, we're not going to dismiss them. We have ongoing security inspections, not only by our own personnel but also by third parties, to ensure that the system is rigorous and robust. No system anyone can build is perfect, but what we need to do is make sure that we are always improving it and testing it against bad actors who may try to circumvent it, and if someone violates our policies within our organization, we will take disciplinary action against them. Thank you, Senator Cornyn. Senator Coons. Thank you, Chairman Durbin. First, I would like to start by thanking all the families who are here today, all the parents who are here for a child they have lost, all the families who are here because they want us to see you and to hear you. Your concern has reached each of us in our offices, expressing your pain, your loss, your passion, and your concern. The public watching cannot see this, and the companies' witnesses can, but this room is as full as a hearing room can get, and as this hearing began, many of you held up photographs of your beloved and lost children.
I benefit from and participate in social media, as do many committee members, and in our world now a majority of people on Earth participate in and benefit in many ways from one of the platforms that you have launched, lead, or represent, and we have to recognize that there are some real positives to social media: it has transformed modern life. But it has also had enormous impacts on families and children in the United States, and there is a whole series of bills advocated by members of this committee that try to address illicit drug trafficking, trafficking in illicit child sexual abuse material, things that are facilitated on your platforms and that can lead to self-harm or suicide. So we have heard from several of the leaders of this committee, the chairman and senior, very talented and experienced senators. The framework that we are looking at is consumer protection: when there is some new technology, we implement regulations to make sure it is not too harmful. As my friend Senator Klobuchar pointed out, a door flew off a plane, no one was hurt, and yet the entire Boeing fleet of that type of plane was grounded, and a federal agency suited to that purpose did an immediate safety review.
I'm not going to point out the other laws that I think are urgently needed for us to pass and approve, but rather the central issue of transparency if you are a company that makes a product that It is supposedly addictive and harmful. One of the first things we look for is safety information that we try to provide to our constituents, to our consumers, warning labels that help them understand what the consequences of this product are and how to use it safely or not, as they have heard clearly from some of my colleagues. if you sell an addictive, defective, harmful product in this country in violation of regulation and warnings, you will be sued and what sets the platforms apart as an industry is that most of the families that are here are here because there were not enough warnings and They cannot sue effectively.
Let me elaborate for a moment if I can, because each of your companies voluntarily discloses information about content and safety, the investments you make and the actions you take. There was a question, I think asked earlier by Senator Graham, about TikTok. I think, Mr. Chew, you said you invest $2 billion in safety; my background memo says your global revenue is $85 billion. Mr. Zuckerberg, my background memo says that you are investing $5 billion in safety at Meta, and your annual revenue is on the order of $116 billion. So what matters, and you can hear some reaction from the parents in the audience, what matters is both relative numbers and absolute numbers. You are data people; if anyone in this
world understands data, it is you. So I want to examine whether these voluntary measures of disclosing content and harm are enough, because I would argue that part of why we are here is that we lack better information. How can policymakers know whether the protections you have testified about, the new initiatives, the programs, the monitoring and removals, are really working? How can we meaningfully understand how big these problems are without measuring and reporting data? Mr. Zuckerberg, your testimony referenced a National Academies of Sciences study that said at the population level there is no evidence of harm to mental health. Well, it may not show up at the population level, but I'm looking at a room full of hundreds of parents who have lost children, and our challenge is to take the data and make good decisions about protecting families and children from harm. So let me ask what your companies do or don't report. I'm going to focus particularly on your content policies around self-harm and suicide, and I'm just going to ask a series of yes or no questions. What I mean is: do you, Mr.
Zuckerberg, disclose enough about your policies prohibiting content about suicide or self-harm? Do you report an estimate of the total amount of content, not a percentage of the total, not a prevalence number, but the total amount of content on your platform that violates this policy and do you report the total number of views that the content you promote self-harm or suicide that violates this policy comes to your platform? Yes, Senator, we were pioneers in reporting quarterly on the application of our community standards across all of these different categories of harmful content, we focus on the prevalence that you mentioned because what we focus on is what percentage of the content we remove.
Mr. Zuckerberg, I'm going to interrupt; you are very talented and I have very little time left. I'm trying to get an answer to a question: not as a percentage of the total, because remember it's a huge number, so the percentage is small, but do you report the actual amount of content, and the number of views that self-harm content received? I think we focus on prevalence. Right, you don't. Ms. Yaccarino, yes or no, do you report it or not? Senator, as a reminder, less than 1% of our users are between 13 and 17 years old. The absolute number: how many images, how many posts and accounts have you removed? In 2023
we removed almost a million posts related to mental health and self-harm. Mr. Chew, do you disclose the number of such pieces of content and how many are viewed before being removed? Senator, we disclose the number we removed in each violation category, and how many of them were proactively removed before being reported. Mr. Spiegel? Yes, Senator, we disclose. Mr. Citron? We do. I have three more questions that I would love to have answered if I had unlimited time; I will submit them for the record. The most important point is that the platforms need to disclose more about how the algorithms work, what the content does, and what the consequences are, not in the aggregate, not at the population level, but the real number of cases, so that we can understand it. Finally, Mr.
Chairman, I have a bipartisan bill, the Platform Accountability and Transparency Act, co-sponsored by Senators Cornyn, Klobuchar, and Blumenthal of this committee, and Senator Cassidy and others. It is in front of the Commerce Committee, not this committee, but it would establish reasonable standards of disclosure and transparency to make sure that we do our work based on data. Yes, understandably there is a lot of energy in this field, but if we are going to legislate responsibly about how content is handled on your platforms, we need to have better data. Are any of you willing to say now that you support this bill?
Mr. Chairman, let the record show a yawning silence from the leaders of the social media platforms. Thank you. Thank you, Senator Coons. We are in the first of two roll call votes, so please understand that if some of the members leave and come back, it's not disrespect; they're doing their job. Senator Lee. Thank you, Mr. Chairman. Tragically, survivors of sexual abuse are often victimized and revictimized over and over again by having non-consensual images of themselves on social media platforms. A study by NCMEC indicated that there was one case of CSAM that reappeared over 490,000 times after it was reported. After it was reported. So we need tools to deal with this; frankly, we need laws to require standards so that this doesn't happen, so that we have a systematic way to get rid of these things, because there is literally no plausible justification, no way to defend this. One tool that I think would be particularly effective is a bill that I will introduce later today, and I invite all members of the committee to join me. It's called the PROTECT Act. The PROTECT Act would, in relevant part, require websites to verify age and verify that they have received consent from each and every person who appears on their site in pornographic images, and it also requires that platforms have meaningful processes for a person seeking to have images of himself or herself removed in a timely manner. Ms. Yaccarino, based on your understanding of existing law, what would it take for a person to have those images removed, let's say, from X? Senator Lee, thank you. What you are going to introduce, in terms of user consent and the whole ecosystem, sounds exactly like part of the philosophy behind why we support the SHIELD Act, and no one should have to endure non-consensual images being shared online. Yes, and without those laws in place, it's great whenever a company like yours, as you described, wants to follow those steps; it's very helpful. But it can take a lot more time than it should, and sometimes it gets to the point where someone has had images shared 490,000 times after they were reported to the authorities, and that is deeply troubling. But yes, the PROTECT Act would work in conjunction with, and is a good complement to, the SHIELD Act. Mr. Zuckerberg, let's turn to you
next. As you know, I have a strong concern for privacy and believe that one of the best protections for an individual's privacy online is end-to-end encryption. We also know that a lot of CSAM grooming and sharing happens on end-to-end encrypted systems. Tell me, does Meta allow minors' accounts on its platforms to use encrypted messaging services within those apps? Sorry, Senator, what do you mean, under 18? Under 18. We allow people under 18 to use WhatsApp, and we allow that to be encrypted, yes. Do you have a minimum age below which they are not allowed to use it? Yes... no, I don't think we allow people under 13.
Yes. How about you, Mr. Citron, on Discord? Do you allow children to have accounts that can access encrypted messages? Discord cannot be used by children under 13, and we do not use end-to-end encryption for text messages. We think it's very important to be able to respond to valid requests from law enforcement, and we're also working proactively on building technology. We are working with a nonprofit called Thorn to build a grooming classifier, so that our teen safety assist feature can identify these conversations if they might be happening, so we can intervene, give those teens tools to get out of the situation, or even report those conversations and those people to the authorities. And encryption, as useful as it may be in other places, can be especially harmful if you are on a site where you know children are being groomed and exploited; if you allow children to access an app with end-to-end encryption enabled, that can be problematic. Now let's come back to you for a moment, Mr.
Zuckerberg Instagram recently announced that it's going to restrict all teens' access to eating disorder material, suicidality-themed material, self-harm content and that's fantastic, that's cool, what's weird, what I'm trying to understand is why Instagram is um . just restricting is restricting access to explicit sexual content, but only for adolescents aged 13 to 15, why not also restrict it for young people aged 16 and 17? Senator, I understand that we do not allow explicit sexual content on the service for people of any age um, um, how's that going? You know, our uh, our prevalence metric suggests that I think about 99% of the content that we removed was able to be automatically identified using the AI ​​system, so I think our efforts on this, while not perfect, I think they're leading. in the industry.
The other thing you asked about was self-harm content, which is what we recently restricted and made that change. I think the state of science is changing a little bit. We used to believe that when people thought about self-harm it was important that they were able to express it and get support, and now most of the thinking in the field is that it's best not to show it. that content, which is why we recently decided to restrict its appearance for those teenagers. Okay, is there a way for parents to make a request about what their kids can or can't see on your sites?
There are many parental controls. I'm not sure if... I don't think we currently have a control over specific topics, but we do allow parents to control how much time their children spend on the site, and a lot of it is also based on monitoring and understanding what teens are experiencing and what they interact with. Mr. Citron, Discord allows pornography on its site. Reportedly, 17% of minors using Discord have had online sexual interactions on the platform, 17%, and 10% have had those interactions with someone the minor believed was an adult. Do you restrict minors' access to Discord servers that host pornographic material? Senator, yes, we restrict minors' access to content flagged as adult. Discord also does not recommend content to people; Discord is a chat app, and we do not have a feed or an algorithm that amplifies content. We allow adults to share content with other adults in spaces labeled for adults, and we do not allow teens to access that content.
Well, I see my time has expired, thank you. Welcome, everyone. We are here at this hearing because, as a collective, your platforms really suck at policing themselves. We hear about it here in Congress with fentanyl trafficking and other drugs facilitated through the platforms. We see and hear about it here in Congress with the harassment and bullying that take place through your platforms. We see and hear about it here in Congress regarding child pornography, sexual exploitation, and blackmail, and we are sick of it. There seems to me to be a problem with accountability, because these conditions continue to persist. In my opinion, Section 230, which provides immunity from lawsuits, is a very important part of that problem. Look at where bullies have recently been brought to heel: whether it is Dominion finally getting justice against Fox News after a long campaign to discredit the election equipment manufacturer, or the fathers and mothers of the Sandy Hook victims finally obtaining justice against Infowars and its campaign of trying to make people believe that the massacre of their children was a hoax they had put together, or, even more recently, a writer receiving a very significant judgment against Donald Trump after years of bullying and defamation. An honest courtroom has proven to be the place where these things get resolved, and I will describe just one case, if I may. It is Doe v.
Twitter, the plaintiff in that case was blackmailed in 2017 for sexually explicit photos and videos of himself, then 13 or 14 years old. A compilation video of multiple camera videos appeared on Twitter in 2019. A concerned citizen reported that video on December 25, 2019, Christmas Day on Twitter. took no action, the plaintiff then, a minor in high school in 2019, found out about this video from his classmates in January 2020, you are a high school kid and suddenly, that's a day that it is difficult to recover, he finally became suicidal, he and his parents contacted the authorities and Twitter to remove these videos on January 21 and again on January 22, 2020 and Twitter finally removed the video on January 30, 2020 once that federal authorities got involved, it's a pretty unpleasant set of facts and when the family sued Twitter for all those months of refusing to remove this child's explicit video.
Twitter invoked Section 230, and the district court ruled the claim was barred. There is nothing in that set of facts that tells me that Section 230 performed any public service. In that regard, I would like to see very substantial adjustments to Section 230, so that the honest courtroom that brought relief and justice to E. Jean Carroll after months of defamation, that brought silence, peace, and justice to the parents of the Sandy Hook children after months of smears and bullying by Infowars and Alex Jones, and that brought meaningful justice, and an end to the Fox News smear campaign, to a small company that was simply busy making voting machines... So, my time is running out. I'll turn to, I guess,
Senator Cruz next, but I would like each of your companies to put in writing what exemptions from Section 230 protection you would be willing to accept, considering the fact pattern in Doe v. Twitter, considering the enormous damage that was done to that young man and that family by the lack of response from this enormous platform for months and months and months and months. Again, think about what it would be like to be a high school kid and have those things in the public domain, with the company that keeps them in the public domain reacting in such an indifferent manner.
Okay, will you each get that to me in writing? One, two, three, four, five. Yes? Done. Senator Cruz. Thank you, Mr. Chairman. Social media is an incredibly powerful tool, but we are here because every parent I know, and I think every parent in America, is terrified by the garbage that is directed at our children. I have two teenagers at home, and the phones they have are portals to predators, cruelty, harassment, and self-harm, and each of your companies could do much more to prevent it. Mr. Zuckerberg, in June 2023 The Wall Street Journal reported that Instagram's recommendation systems were actively connecting pedophiles to accounts advertising the sale of child sexual abuse material. In many cases these accounts appeared to be run by underage children, often using coded keywords and emojis to advertise the material; in other cases the accounts included indications that the victim was being sex trafficked. Now, I know Instagram has a team working to prevent the abuse and exploitation of children online, but what was particularly troubling about the Wall Street Journal exposé was the degree to which Instagram's own algorithm promoted the discoverability of victims for pedophiles seeking child abuse material.
In other words, this material was not just living in the dark corners of Instagram; Instagram was helping pedophiles find it by promoting graphic hashtags, including terms referencing preteen sex, to potential buyers. Instagram also showed the following warning screen to people searching for child abuse material: "These results may contain images of child sexual abuse," and then gave users two options: "Get resources" or "See results anyway." Mr. Zuckerberg, what the hell were you thinking? Well, Senator, the basic science behind this is that when people search for something that is problematic, it is often helpful, rather than just blocking it, to direct them toward something that might be useful in getting them help. I understand "Get resources," but in what sane universe is there a link for "See results anyway"?
Well, because we might be wrong, we try to trigger this warning or we try um when we think there's some chance that the results are wrong, let me ask you how many times this text screen was shown. I don't know, but he You don't know why you don't know. I don't know the answer to that that comes to mind, but well, you know what, Mr. Zuckerberg, it's interesting, you say that you don't know from the beginning because I asked it in June 2023 in an oversight letter superficial and his company refused to respond. Will you commit right now to answer this question within 5 days for this committee?
We will follow up. I mean, yes... No, "we'll follow up," I know, means the lawyers write statements about why you're not going to answer. Will you tell us how many times this warning screen was shown, yes or no? Senator, I will personally look into it. I'm not sure if we... Okay, so you refuse to answer. Let me ask you this: how many times did an Instagram user who received this warning, that you may be seeing images of child sexual abuse, click on "See results anyway"? Senator, I'm not sure whether we store that, but I will personally look into it and we will follow up afterward. And what follow-up did Instagram do when you had a potential pedophile clicking on "I'd like to see child sexual abuse material"? What did you do next when that happened?
Senator, I think an important part of the context here is that any content that we think is child sexual abuse material... Mr. Zuckerberg, that was not the question. The question was: what did you do next when someone clicked "See results anyway" on what may be images of child sexual abuse? What was your next step? You said you might be wrong. Did anyone examine whether it was in fact child sexual abuse material? Did anyone report that user? Did anyone go and try to protect that child? What did you do next? Senator, we remove everything we believe to be child sexual abuse material.
Senator, I don't know whether we follow up on every single search result, but... Did you report the people who wanted it? Do you want me to answer your question? I want you to answer the question I asked: did you report the people who clicked "See results anyway"? That is probably one of the factors we use in reporting, and overall we have reported more people, and made more reports like this to NCMEC, the National Center for Missing and Exploited Children, than any other company in the industry. We proactively go out of our way across our services to do this, and I believe it's over 26 million reports, which is more than the entire rest of the industry combined.
So, Mr. Zuckerberg, your company, and all the social media companies, need to do much more to protect children. Okay, Mr. Chew, in the minutes remaining I want to turn to you. Are you familiar with China's 2017 National Intelligence Law, which states, quote, all organizations and citizens shall support, assist, and cooperate with national intelligence efforts in accordance with the law and shall protect national intelligence work secrets they are aware of? Yes, I am familiar with this. TikTok is owned by ByteDance. Is ByteDance subject to that law? The Chinese subsidiaries that ByteDance owns, yes, would be subject to this, but TikTok is not available in mainland China, and Senator, as we discussed in your office, we built Project Texas to put this out of their reach. So ByteDance is subject to the law.
Now, under this law that says you will protect the secrets of national intelligence work that you know of, does it compel people subject to the law to lie in order to protect those secrets? Is that correct? I cannot comment on that. What I can say again is that we... Because you have to protect those secrets, Senator? TikTok is not available in mainland China; we have moved the data into an American cloud. Which is controlled by ByteDance, which is subject to this law. Now, you said earlier, and I wrote this down, "we have not been asked for any data by the Chinese government and we have never provided it." I'm going to tell you, and I told you this when you and I met last week in my office,
I don't believe you, and I'll tell you, the American people don't either. If you look at what's on TikTok in China, you are promoting educational science and math videos to kids and limiting the amount of time kids can be on TikTok. In the United States, you are promoting self-harm videos to kids and anti-Israel propaganda. Why is there such a dramatic difference? Senator, that's just not accurate... So there's no difference between what kids see in China and what kids see here? Senator, TikTok is not available in mainland China; it's a separate experience there. That's what I'm saying: you have a company that is essentially the same, except it promotes beneficial material instead of harmful material.
That is not true. We have a lot of science and math content here on TikTok too; there is so much in the STEM feed... Let me make this point, let me point this out, Mr. Chew. There was a report recently that compared hashtags on Instagram to hashtags on TikTok, what trends, and the differences were striking. For something like the Taylor Swift hashtag or the Trump hashtag, the researchers found about two Instagram posts for every one on TikTok; that's not a dramatic difference. That difference jumps to 8 to 1 for the Uyghur hashtag, jumps to 30 to 1 for the Tibet hashtag, jumps to 57 to 1 for the Tiananmen Square hashtag, and jumps to 174 to 1 for the Hong Kong protest hashtag. Why is it that on Instagram people can post a Hong Kong protest hashtag 174 times as often as on TikTok?
What censorship is TikTok doing at the request of the Chinese government? None, Senator. That analysis is flawed; it has been debunked by outside sources such as the Cato Institute. Fundamentally, a few things are going on here. Not all videos carry hashtags, that's the first thing. The second thing is that you cannot selectively pick a few words over a certain time window... Why the difference between Taylor Swift and Tiananmen Square? What happened in Tiananmen Square? Senator, there were massive protests at that time, but what I'm trying to say is that our users can come and post this freely. Why would there be no difference, or a minimal difference, for Taylor Swift, but a big difference for Tiananmen Square or Hong Kong? Senator, could you please wrap up?
Our algorithm does not remove any content simply based on... Just answer the question: why is there a difference? Like I said, I think the analysis is flawed; it selectively picks some words over some periods. We haven't been... There is an obvious 174 to 1 difference for Hong Kong compared to Taylor Swift; it's dramatic. Senator Blumenthal. Thank you, Mr. Chairman. Mr. Zuckerberg... Blumenthal, I'm sorry. Thank you, I know you both; that was close enough. Mr. Zuckerberg, you know who Antigone Davis is, right? Yes. She is one of your top leaders; in September 2021 she was global head of safety, right? Yes. And you know she appeared before a subcommittee of the Commerce Committee that I chaired at that time, the consumer protection subcommittee, right? Yes. And she was testifying on behalf of Facebook, right? Meta, but yes, it was Facebook then and Meta now. And she told us, and I'm quoting, that Facebook is committed to building better products for young people and doing everything possible to protect their privacy, safety, and well-being on our platforms, and she also said that child safety is an area in which we are investing heavily.
End quote. We now know that statement was not true. We know this from an internal email we received, an email written by Nick Clegg. You know who he is, right? Yes. He is Meta's president of global affairs, and he wrote you a memo that you received, correct? It was written to you; I can't see the email, but I'll certainly assume that you got it, right. And it summed up Facebook's problems. He said, quote, we are not on track to succeed on our core well-being topics: problematic use, bullying and harassment, connections, and SSI, which means suicide and self-injury. He also said in another memo, quote, we need to do more and we are being held back by a lack of investment.
This memo is dated August 28, just a few weeks before that Antigone Davis testimony, right? Sorry, Senator, I'm not sure what the date of the testimony was. Those are the dates in the emails. Nick Clegg asked you, pleading for resources, to meet the commitments; Antigone Davis was making promises that Nick Clegg was trying to keep, and you rejected that request for 45 to 84 engineers to work on well-being and safety. We know you rejected it from another memo, from Nick Clegg's team, noting that Nick had emailed Mark, referring to that earlier email, to emphasize his support for the package, but that it lost out to other pressures and priorities. I have made a calculation that those 84 potential engineers would have cost Meta about $50 million in a quarter in which it made $9.2 billion. And yet you failed to meet that commitment in real terms, and you rejected that request due to "other pressures and priorities." It is an example, out of your own internal documents, of knowing and not acting, and it is the reason we can no longer trust Meta, and frankly any of the other social media companies, to grade their own homework.
The public, and in particular the parents in this room, know that they can no longer trust social media to provide the kind of protection that children and parents deserve, and that is why passing the Kids Online Safety Act is so important. Mr. Zuckerberg, do you believe you have a constitutional right to lie to Congress? No, but I mean, a lot of things... I would like the opportunity to clarify. Well, in a lawsuit brought by hundreds of parents, some in this very room, alleging that you made false and misleading statements about the safety of your platform for children, you argued, not just in one pleading but twice, in December and then in January, that you have a constitutional right to lie to Congress. Do you reject that filing in court?
Senator, I don't know what filing you are referring to, but I testified
honestly, sincerely and I. I would also like the opportunity to respond to the previous things you showed. I have a few more questions and let me ask others who are here because I think it's important to be on record as to who will support children's online safety. act yes or no, Mr. quote um, there are parts of the act that we think are great, no, it's a yes or no question. I'll be running out of time, so I guess the answer is no, if you can.
I say yes, uh, we very much believe that the national privacy standard would be excellent, Mr. Seagull, Senator, we strongly support the Children's Online Safety Act and have already implemented many of its key provisions. Thank you. I welcome that support along with the support of Microsoft. Mr. Senator Chu with some changes we can support it in its current form, do you support it yes or no? We are aware that some groups have raised some concerns. It is important to understand it. I'll take it as Noe Mr. Yak Miss yakar Senator we support kosum We will continue to make sure that it accelerates and that it continues to provide a community for teens seeking that voice.
Mr. Zuckerberg? Senator, we support age-appropriate content standards, but... Could you say how, Mr. Zuckerberg? Do you support the Kids Online Safety Act? These are public positions, and I'm just asking whether you will support them or not. I think these are nuanced things. The basic spirit is right; I think the basic ideas in it are right, and there are some ideas I would debate about how best to improve them... Unfortunately, I don't think we can count on social media as a group, or on the big tech companies, to support this measure, and in the past we know it has been opposed by armies of lawyers and lobbyists. We are prepared for this fight, but I am very glad that we have parents here, because tomorrow we will have an advocacy day, and the people who really count, the people in this room who support this measure, will go to their representatives and senators, and their voices and faces will make a difference.
Senator Schumer has committed to working with me to bring this bill to a vote, and then we will have real protection for children and parents online. Thank you, Mr. Chairman. Thank you, Senator Blumenthal. We have a vote going on; Senator Cotton, have you voted? And Senator Hawley hasn't voted yet, so you're next, and I don't know how long the vote will be open, but I will turn it over to you. Thank you, Mr. Chairman. Mr. Zuckerberg, let me start with you. Did I hear you say in your opening statement that there is no link between mental health and social media use?
Senator, what I said. I think it's important to look at the science. I know people talk widely about this as if it's something that's already been proven, and I think most of the scientific evidence doesn't support that. Really let me remind you. part of the science of your own company Instagram studied the effect of your platform on teenagers let me read you some quotes from The Wall Street Journal report on this company. Researchers found that Instagram is harmful for a significant percentage of teenagers, especially teenage girls. a quote from their own study we quote that we make body image problems worse for one in three teens here's another quote that teens blamed Instagram this is their study for increases in the rate of anxiety and depression this reaction was spontaneous and consistent in all groups that's your study Senator, we try to understand the feedback and how people feel about the services.
We can improve... A minute: your own study says it makes life worse for one in three teenage girls, that it increases anxiety and depression; that's what it says. And you are here testifying to us in public that there is absolutely no link. You have been doing this for years. For years you have come out in public and testified under oath that there is absolutely no link, that your product is wonderful, full speed ahead on the science, while internally you know full well your product is a disaster for teenagers, and you keep doing what you're doing. Well, that's not true. That's not true? Let me show you some other facts that I know you know. Wait a minute, wait a minute, that's not a question, that's not a question. Those are facts, Mr.
Zuckerberg. That's not a question, and those are not facts. Then let me show you some more facts. Here is information from a whistleblower who came before the Senate, testified under oath in public, and worked for you as a senior executive; this is what he found when he studied your products. For example, among girls aged 13 to 15, 37% reported that they had been exposed to unwanted nudity on the platform in the last 7 days; 24% said they had experienced unwanted sexual advances, been propositioned, in the last 7 days; 17% said they had encountered self-harm content pushed at them in the last seven days. Now, I know you're familiar with these statistics, because he sent you an email describing all of this.
I mean, we have a copy of it right here. My question is: who did you fire for this? Who got fired because of that? Senator, we study all of this because it's important and we want to improve our services. Well, a second ago you told me you studied it but that there was no link. Who did you fire? Senator, I think you're mischaracterizing... 37% of teenage girls aged 13 to 15 were exposed to unwanted nudity in a week on Instagram. You knew about it. Who did you fire? Senator, that's why we're building all the tools... Senator, I don't think that's... Who did you fire?
I'm not going to answer that. So you didn't fire anyone, right? You won't take any meaningful action; you don't think it's appropriate to talk about individual decisions like that. Do you know who's sitting behind you? You have families from all over the country whose children have been seriously harmed or are gone, and you don't think it's appropriate to talk about the steps you took, the fact that you didn't fire a single person. Let me ask you this, let me ask you this: have you compensated any of the victims? I'm sorry? Have you compensated any of the victims, these girls? Have you compensated them? I don't believe so. So why not? Don't you think they deserve some compensation for what your platform has done?
Help with counseling services, help to deal with the problems your service has caused? Our job is to make sure we build tools to help keep people safe. Will you compensate them? Senator, our job, and what we take seriously, is making sure that we build industry-leading tools to find harmful content, remove it from our services, and build empowering tools for parents. So you didn't take any action. You didn't take any action, you didn't fire anyone, you haven't compensated a single victim. Let me ask you this, let me ask you this: there are families of victims here today. Have you apologized to the victims? I would like you to do so now. Well, they're here. You're on national television. Would you like now to apologize to the victims who have been harmed by your product? Show them the pictures. Would you like to apologize for what you have done to these good people? I'm sorry for everything you have gone through. It's terrible. No one should have to go through the things your families have suffered, and this is why we have invested so much, and will continue to make industry-leading efforts, to make sure that no one has to go through the kinds of things your families have had to endure. Why, Mr. Zuckerberg, why shouldn't your company be sued for this? Why are you able to hide behind a liability shield? You can't be held responsible. Shouldn't you be held personally responsible? Will you take personal responsibility?
Senator, I think I already answered this. I mean, these are... Try again: will you take personal responsibility? Senator, I see my job, and our company's job, as building the best tools we can to keep our community safe. Well, you're failing at that. Senator, we are making an industry-leading effort; we have built AI tools... Nonsense. Your product is killing people. Will you personally commit to compensating the victims? You're a billionaire. Will you commit to compensating the victims? Will you set up a compensation fund with your money? Senator, I think these are... these are complicated... That is not a complicated question; it's yes or no. Will you create a compensation fund for the victims with your money, the money you made off these families sitting behind you, yes or no? Senator, I don't think that's... My job is to make sure we build good tools... Your job is to be responsible for what your company has done.
You have made billions of dollars off the people sitting behind you here. You haven't done anything to help them. You haven't done anything to compensate them. You haven't done anything to make it right. You could do it, here, today, and you should. Mr. Zuckerberg... Before my time expires, Mr. Chew, let me ask you about your platform. Why shouldn't your platform be banned in the United States of America? You are owned by ByteDance, a company based in China; the editor-in-chief of your parent company is a Communist Party secretary; your company has been surveilling Americans for years; according to leaked audio from more than 80 internal TikTok meetings, employees of your China-based parent company have repeatedly accessed non-public data of American citizens; your company has tracked journalists, improperly gaining access to their user data and IP addresses, in an attempt to identify whether they were writing negative stories about you.
Why should your platform basically get to be a spy arm for the Chinese Communist Party? Why shouldn't it be banned in the United States of America? I disagree with your characterization, Senator. We have explained much of what you have said in great detail. TikTok is used by 170 million Americans. And every one of those Americans is in danger from the fact that you track their keystrokes, you track their use of the app, you track their location data, and we know that employees in China, who are subject to the dictates of the Chinese Communist Party, can access all of that information. Why shouldn't it be banned in this country?
Senator, much of what you describe is just not accurate; what we collect... We do not know that it is 100% accurate? Do you deny that ByteDance employees in China have repeatedly accessed US user data? We built a project that, as you know, cost us billions of dollars to stop that, and we have made a lot of progress. And it hasn't stopped, according to yesterday's Wall Street Journal report. Even now, ByteDance workers, without going through official channels, have access to the private information of American citizens. I'm quoting from the article: private information of American citizens, including their date of birth, their IP address, and more. Now, Senator, as we know, the media doesn't always get it right. What about what the Chinese Communist Party does? That's not what I'm saying. What I'm saying is that we have spent billions of dollars to build this project; it's rigorous, it's robust, it's unprecedented, and I'm proud of the work the 2,000 employees are doing to protect US user data. It's not protected, that's the problem, Mr.
Chew, it is not protected at all. You are subject to inspection and review by the Chinese Communist Party, and your app, unlike anyone else's sitting here, and goodness knows I have problems with everybody here, but your app, unlike any of theirs, is subject to the control and inspection of a hostile foreign government that has actively tried to get its hands on the information and whereabouts of every American. Your app should be banned in the United States of America for the security of this country. Thank you, Mr. Chairman. Senator Hirono. Thank you, Mr. Chairman. As we have heard, children face all kinds of dangers when they use social media, from mental health harms to sexual exploitation, even trafficking. Sex trafficking is a serious problem in my home state of Hawaii, especially for Native Hawaiian victims, and the fact that social media platforms are also being used to facilitate this trafficking, as well as the creation and distribution of CSAM, is deeply concerning. But it is happening. For example, several years ago a military police officer stationed in Hawaii was sentenced to 15 years in prison for producing CSAM as part of his online exploitation of a minor. He began communicating with this 12-year-old girl via Instagram, then used Snapchat to send her sexually explicit photos and to request those photos from her.
He later used those photos to blackmail her. And just last month the FBI arrested a neo-Nazi cult leader in Hawaii who lured victims to his Discord server and used that server to share extremely disturbing child sexual abuse material interspersed with neo-Nazi imagery. Members of his child exploitation and hate group are also present on Instagram, Snapchat, X, and TikTok, all of which they use to recruit members and potential victims. In many cases, including the ones I just mentioned, your companies played a role in helping authorities investigate these criminals, but by the time of the investigation so much damage had already been done. This hearing is about how to keep children safe online, and we have heard all of your testimony about seemingly impressive safeguards for young users:
tools to limit the time they spend, parental consent requirements, all these tools. Yet the grooming and exploitation of minors online, on your platforms, remains rampant. Almost all of your companies make money through advertising, specifically by selling your users' attention. Your users are your product. As one Meta product designer wrote in an email, quote, young ones are the best ones; you want to bring people to your service young and early, end quote. In other words, hook them. Preliminary research published last month by the Harvard School of Public Health estimates that Snap generates a staggering 41% of its revenue from advertising targeted at users under 18; for TikTok, it is 35%. Seven of the 10 largest Discord servers that attract the most paying users are for games used primarily by teenagers and children.
All of this is to say that your social media companies and others make money by attracting children to their platforms, but ensuring safety doesn't make money, it costs money if it does. If you are going to continue to attract children to your platforms, you have an obligation to ensure that they are safe on your platforms because the current situation is unsustainable, that is why we are going to have this audience, but to ensure the safety of our children, that costs money, your companies cannot continue. to take advantage of young users only to look the other way when those users, our children, are harmed online, we have received a lot of feedback about section 230 protections and I think we are definitely headed in that direction and some of the five projects of law we have already passed this committee talks about limiting liability protections for you last November this is for Mr.
Zuckerberg: last November, before the privacy and technology subcommittee, Arturo Béjar testified, and in response to one of my questions about how to ensure that social media companies focus more on child safety, he said, and I'm paraphrasing a little, Mr. Béjar said, what will change your behavior is if, when Mark Zuckerberg reports earnings, and those earnings have to be reported to the SEC, he has to say, last quarter we made $34 billion, and the next thing he has to say is how many teens experienced unwanted sexual advances on your platform. Mr.
Zuckerberg, will you commit to reporting measurable child safety data in your quarterly earnings reports and calls? Senator, that is a good question. We actually already put out a quarterly report, and we hold a call to answer questions about how we are doing at enforcing our community standards, and that includes, not just child safety issues... Yes, we have a separate call where we do this, but we have led the... I think, you know, you have to report your earnings to the SEC. Will you report this type of data, and, by the way, in numbers?
Because, as has been said, and others have said it too, percentages don't really tell the whole story. Will you report to the SEC the number of teens, and sometimes you won't even know whether they are teens, because they simply claim to be adults, the number of underage children on your platforms who encounter unwanted CSAM and other kinds of messages that harm them? Will you commit to providing those numbers to the SEC when you do your quarterly report? Senator, I'm not sure it makes much sense to include that in SEC filings, but we put it out publicly for everyone to see, and I would be happy to follow up and talk about what specific metrics... I think the specific things, some of the ones you just mentioned, have to do with underage people on our services. We do not allow people under the age of 13 to access our service, so if we find someone under 13 we remove them. I'm not saying that people don't lie and that no one under 13 is using it, but we are not going to be able to count how many there are, because fundamentally, if we identify that someone is underage, we remove them.
I think it's very important that we get real numbers because these are real human beings, which is why all these parents and other people are here because any time a person, a young person, is exposed to this type of unwanted material and gets hooked, it's a danger to that individual, so I hope you are saying to report this type of information, if not to the SEC, then to which it is made. public I think I'm hearing that yes, it does, yes, Senator, I think we report more publicly on our app than any other company in the industry and we are very supportive of transparency measures.
I'm running out of time, Mr. Zuckerberg, but I will follow up on what exactly you report. Again, Meta automatically places youth accounts, and you attested to this, in the most restrictive privacy and content-sensitivity settings, and yet teens can opt out of these safer settings, isn't that so? They are not required to remain in these settings; they can opt out. Senator, yes, we set teens to a private account by default so that they have a private and restricted experience, but some teens want to be creators and want to share content more widely, and I don't think that's something that should be banned completely. Why not?
I think it should be mandatory that they stay on the most restrictive settings. Senator, I mean, a lot of teenagers create amazing things, and I think that with proper supervision, guidance, and controls, that's a good thing. I don't think it's the kind of thing you want to simply not allow anyone to do. I know you want to run me out of time, but I have to say that you all have an argument against everything we are proposing, and I share the concern about the blanket limitation of liability that we provide to all of you. I think that has to change, and it is up to us in Congress to make that change. Thank you, Mr.
Chairman. Thank you, Senator Hirono. Senator Cotton. Mr. Chairman, let's get right to the point: is TikTok under the influence of the Chinese Communist Party? No, Senator, we are a private company. Okay, but you concede that your parent company, ByteDance, is subject to the 2017 National Security Law, which requires Chinese companies to hand over information to the Chinese government and hide it from the rest of the world; you admit that is correct? Senator, on the China question, any global company doing business in China has to follow its local laws. Doesn't ByteDance also have an internal
Chinese Communist Party committee? As I said, all companies operating in China have to have their own local committee. So your parent company is subject to a national security law that requires it to answer to the party, it has its own internal Chinese Communist Party committee, and you answer to that parent company, yet you expect us to believe that you are not under the influence of the Chinese Communist Party? I understand this concern, Senator; that's why we built... It was a yes or no. Okay. You used to work for ByteDance, didn't you? As CFO of ByteDance, that's right, Senator. In April 2021, while you were CFO, the Chinese Communist Party's China Internet Investment Fund bought a 1% stake in ByteDance's main Chinese subsidiary, ByteDance Technology. In exchange for that so-called 1% golden share, the party took one of the three seats on the board of directors of that subsidiary; that is correct, isn't it? And you were appointed
CEO of TikTok the next day, May 1, 2021. Is this a coincidence? A coincidence that you were the CFO, that the Chinese Communist Party took its golden share and its board seat, and the next day you were appointed CEO of TikTok? That's a big coincidence. It actually is, Senator. Yeah, okay. And before ByteDance, you were at a Chinese company called Xiaomi, is that right? Yes; I used to work all over the world. Where did you live when you worked at Xiaomi? I lived in China, like many... Where exactly? In Beijing. How many years did you live in Beijing? Senator, I worked there for about five years. So you lived there for five years. Yes. Isn't it true that Xiaomi was sanctioned by the United States government in 2021 for being a Communist Chinese military company? I'm here to talk about TikTok; I believe there was later a lawsuit and the designation was overturned, if I remember. Or the Biden administration revoked those sanctions, just as it reversed the terrorist designation of the Houthis in Yemen; how is that working out?
But it was sanctioned as a Communist Chinese military company. So today you have said, as you often say, that you live in Singapore. Of what nation are you a citizen? Singapore. Are you a citizen of any other nation? No, Senator. Have you ever applied for Chinese citizenship? Senator, I served my nation in Singapore; no, I did not. Do you have a Singaporean passport? Yes, and I served in the military for two and a half years. Do you have any other passports from other nations? No. Your wife is a US citizen; your children are American citizens.
That's right. Have you ever applied for US citizenship? No, not yet. Okay. Have you ever been a member of the Chinese Communist Party? Senator, I'm from Singapore. No. Have you ever been associated or affiliated with the Chinese Communist Party? No, Senator; again, I am from Singapore. Let me ask you some, I hope, simple questions. You said earlier, in response to a question about what happened at Tiananmen Square in June 1989, that there was a massive protest. Did anything else happen at Tiananmen Square? Yes, I believe it is well documented that there was a massacre: an indiscriminate slaughter of hundreds or thousands of Chinese citizens. Do you agree with the Trump administration and the Biden administration that the Chinese government is committing genocide against the Uyghur people? Senator,
I have said this before: I think it's really important that anyone who cares about this issue, or any issue, can freely express it. It's a very simple question that unites both parties in our country and governments around the world. Senator, anyone, including you... You know you can come and... I am asking you: you are a worldly, cosmopolitan, well-educated man who has expressed many opinions on many topics. Is the Chinese government committing genocide against the Uyghur people? Senator, I have spoken at length about my company, and I am here to talk about what my company does and does not do. You are here to give truthful, honest, and complete testimony.
Let me ask you this: Joe Biden last year said that Xi Jinping is a dictator. Do you agree with Joe Biden that Xi Jinping is a dictator? Senator, I am not going to comment on any world leader. Why won't you answer these simple questions? Senator, it is not appropriate for me, as a businessman, to comment on world leaders. Are you afraid of losing your job if you say something negative about the Chinese Communist Party? I disagree; you will find content critical of China... Let's continue. Are you afraid that you will be arrested and disappeared the next time you go to mainland China? Senator, you will find content critical of China and of any other country freely on TikTok. Okay, let's look at what TikTok, a tool of the Chinese Communist Party, is doing to the youth of the United States.
Does the name Mason Edens ring a bell? Senator, you may have to give me more details, if you don't mind. Yes: he was 16. After a breakup in 2022, he went on your platform and searched for things like inspirational and positive quotes; instead, you showed him numerous videos that glamorized suicide, until he killed himself with a gun. What about the name Chase Nasca; does that sound familiar? Would you mind giving me more details, please? He was a 16-year-old boy who was shown more than a thousand videos on your platform about violence and suicide, until he took his own life by stepping in front of a train.
Do you know that his parents, Dean and Michelle, are suing TikTok and ByteDance for pushing their son to take his own life? Yes, I'm aware of that. Okay, finally, Mr. Chew: has the Federal Trade Commission sued TikTok during the Biden administration? Senator, I can't speak to whether there are any... Are there any? Is TikTok currently being sued by the Federal Trade Commission? Senator, I can't speak to any potential lawsuits. Not potential, actual: are you being sued by the Federal Trade Commission? Senator, I think I've given you my answer. No.
Ms. Yaccarino's company is being sued. I believe Mr. Zuckerberg's company is being sued. Yet TikTok, the agent of the Chinese Communist Party, is not being sued by the Biden administration. Do you know the name Christina Caffarra? You may have to give me more details. Christina Caffarra was a consultant paid by ByteDance, your communist-influenced parent company. She was then hired by Biden's FTC to advise on suing Mr. Zuckerberg's company. Senator, ByteDance is a global company owned by global investors, not a Chinese communist company. Public reports indicate that your lobbyists visited the White House more than 40 times in 2022. How many times did your company's lobbyists visit the White House last year?
I don't know that, Senator. Are you aware that the Biden campaign and the Democratic National Committee are on your platform, that they have TikTok accounts, even though they don't allow their staff to use it on their personal phones and instead give them separate phones used only for TikTok? Senator, we encourage everyone to join, including you, Senator. So all these companies are being sued by the FTC; you are not. The FTC hired a former paid advisor to your parent company to talk about how they can sue Mr. Zuckerberg's company. Joe Biden's re-election campaign and the Democratic National Committee are on your platform.
Let me ask you: have you or anyone else at TikTok communicated or coordinated with the Biden administration, the Biden campaign, or the Democratic National Committee to influence the flow of information on your platform? We work with anyone, any creator who wants to use our platform; the process is the same. Okay, so what we have here is a company that is a tool of the Chinese Communist Party, that is poisoning the minds of American children, in some cases driving them to suicide, and that the Biden administration, at best, is overlooking and, at worst, may be in collaboration with. Thank you, Mr.
Chew. Thank you, Senator Cotton. We are going to take a break now; we are in the second roll call vote, and members can take advantage of it. We want the break to last about 10 minutes, so please do the best you can to return. The Senate Judiciary Committee will resume. We have nine senators who have not yet asked questions, still in seven-minute rounds, and we will turn to Senator Padilla first. Thank you, Mr. Chairman. Colleagues, as we reconvene, I am proud once again to share that I am one of the few senators with younger children, and I start with that because, as we have this conversation today, it's not lost on me that between my kids, who are now in the teen and tween category, and their friends, I see this issue very up close and personal. In that spirit, I want to take a second to recognize and thank all the parents in the audience today, many of whom have shared their stories with our offices, and I credit them for finding strength through their suffering and their struggle and channeling
it into advocacy that is making a difference. I thank all of you. I have a new personal appreciation for the challenges parents and caregivers face in helping our young people navigate this world of social media and technology in general. The services our children are growing up with give them unparalleled access to information, beyond anything previous generations experienced, and that includes opportunities for learning, socialization, and much more. But we also clearly have a lot of work to do to better protect our children from predators and the predatory behavior these technologies have enabled, and yes, Mr. Zuckerberg, that includes exacerbating the mental health crisis in America. Almost every teenager we know has access to a smartphone and the internet and uses the internet daily, and while guardians have the primary responsibility for caring for our children, the old saying goes that it takes a village, and so society as a whole, including leaders in the tech industry, must prioritize the health and safety of our children. Now let me dig into my questions and get specific, platform by platform, witness by witness, on some of the parental tools each of you has referenced. Mr. Citron, how many minors are there on Discord, and how many of them have caregivers who have adopted your Family Center tool? If you don't have the numbers, say so quickly and send them to our office.
Can you also speak to how you have ensured that young people and their guardians are aware of the tools you offer? We make it very clear to teens on our platform what tools are available, and our Teen Safety Assist is enabled by... What specifically do you do? What may be clear to you is not clear to the general public, so what do you do, in your view, to make it very clear? Our Teen Safety Assist, which is a feature that helps teens stay safe, in addition to blocking and blurring images that may be sent to them,
is enabled by default on teen accounts and cannot be disabled. We also market directly to our teenage users on our platform: when we launched our Family Center, we created a promotional video that we put directly into our product, so that when a teen, in fact every user, opened the app, they got an alert saying, hey, Discord has this, we want you to use it. Thank you; we look forward to the data we are requesting. Mr. Zuckerberg, across all Meta services, Instagram, Facebook, Messenger,
Sorry, I can go on with specific statistics on that. Well, it would be very useful not only for us. I know, but just so you know as a leader of your company, and the same question, how do you make sure that young people and your gardeners are aware of the tools that you offer? We carry out quite extensive advertising campaigns both on our platforms and outside of them. We work with creators and organizations like Girl Scouts to make sure that this is, generally speaking, there's a broad understanding of the tools, okay? Mr. Spel, how many minors use Snapchat and of those minors, how many have caregivers registered at their family center?
Senator, I think. There are approximately 20 million teenage Snapchat users in the United States. I think about 200,000 parents use Family Center and about 400,000 teens have linked their account to their parents using Family Center, so 200,000 400,000 sounds like a big number but small in percentage. of minors using SnapChat, what are you doing to ensure that the young people in your Guardians are aware of the tools you offer? Senator, we created a banner for Family Center in the user profile so that account, we believe, is as old as they might be. Parents can easily see the entry point to the Family Center, okay Mr.
Chew, how many minors are there on TikTok, and how many of them have a caregiver who uses your family tools? Senator, I will need to get back to you on the specific numbers, but we were one of the first platforms to provide what we call Family Pairing. As a parent, you go into settings and pull up a QR code; your teen scans your QR code and theirs, and what it allows you to do is set screen time limits, filter some keywords, and turn on a more restricted mode. And we are always talking to parents.
I met with a group of parents, teens, and high school teachers last week to talk about what else we can offer in Family Pairing. Ms. Yaccarino, how many minors use X, and are you planning to implement safety measures or guidance for caregivers like your peer companies? I appreciate it, Senator. Less than 1% of all our users are between 13 and 17, less than 1% of 90 million US users. Okay, so still hundreds of thousands. Yes, and every single one is very important. Being a 14-month-old company, we have reprioritized safety and child protection measures, and we are just beginning to talk about and discuss how we can improve those with parental controls. Okay, let me continue with a follow-up question for Mr.
Citron. In addition to keeping parents informed about the nature of various internet services, there is obviously much more we must do. For today's purposes, while many companies offer a wide range of quote-unquote user empowerment tools, it's helpful to understand whether young people actually find these tools useful. So I appreciate you sharing your Teen Safety Assist tools and how you advertise them, but have you done any assessment of how these features are impacting minors' use of your platform? Our intention is to give teens tools and capabilities they can use to keep themselves safe, and also so that our teams can help keep teens safe.
We launched Teen Safety Assist just last year, and I don't have a study in mind, but we'd be happy to follow up with you. Okay, I'm out of time. I will have follow-up questions for each of you in the second round, or through statements for the record, about similar assessments of the tools you have described. Thank you, Mr. Chairman. Thank you, Senator Padilla. Senator Kennedy. Thank you all for being here. Mr. Spiegel, I see you hiding down there. What does 'yada yada yada' mean? I'm not familiar with the term, Senator. Very unappealing. Can we agree that what you do, not what you say, what you do is what you believe, and everything else is just cottage cheese?
Yes, Senator. You agree with that? Speak up, don't be shy. I have listened to you today; I have heard a lot of yada yada yada-ing. I have heard you talk about the reforms you have made, and I thank you for them, and I have heard you talk about the reforms you are going to make, but I do not think you are going to solve the problem. I think Congress is going to have to help you. I think the reforms you're talking about, to some extent, are going to be like putting paint on rotten wood, and I'm not sure you're going to support this legislation.
I am not. The fact is that you and some of your internet colleagues who are not here are not companies; you are countries. You are very, very powerful, and you and some of your colleagues who are not here have blocked everything we have tried to do in terms of reasonable regulation, from privacy to child exploitation. In fact, we have a new definition of recession: we know we're in a recession when Google has to lay off 25 members of Congress. That's what we've come down to. We've also come down to this fact: your platforms are harming children.
I'm not saying you're not doing some good things, but you are hurting children, and I know how to count votes. If this bill makes it to the floor of the United States Senate, it will pass. What we are going to have to do, and I say this with all the respect I can muster, is convince my good friend Senator Schumer to go to Amazon, buy a spine online, and bring this bill to the Senate floor, and the House will pass it. That's one person's opinion. I may be wrong, but I doubt it. Mr.
Zuckerberg, let me ask you a couple of questions. I may be getting a little philosophical here. I have to hand it to you: you have convinced over two billion people to give up all of their personal information, every bit of it, in exchange for getting to see what their high school friends had for dinner Saturday night. That's pretty much your business model, isn't it? That's not how I would characterize it. We give people the ability to connect with the people they care about and to engage with the topics they're interested in. And you take this information, this mountain of personal information, and then you develop algorithms that target people's hot buttons and send them information that pushes those hot buttons again and again and again, to keep them coming back and staying longer. As a result, your users see only one side of an issue, and to some extent your platform has become a killing field
for the truth, hasn't it? I mean, Senator, I disagree with that characterization. You know, we build ranking and recommendations because people have a lot of friends and a lot of interests, and we want to make sure they see the content that's relevant to them. We are trying to build a product that's useful to people and to make our services as useful as possible for people to connect with the people they care about and the interests they care about. But it doesn't show both sides; it doesn't give them balanced information; it just keeps pushing their hot buttons. You don't show them balanced information so people can discern the truth for themselves, and you rev them up so much that very often your platform and others become cesspools of snark where nobody learns anything,
don't they? Okay, Senator, I disagree with that. I think people can engage with the things they're interested in and learn quite a bit about them. We've done a handful of different experiments and things in the past around news and trying to show content from, you know, a diverse set of perspectives. I think there's more that needs to be explored there, but I don't think we can figure that out on our own. One of the things I... Sorry to interrupt you, Mr. Zuckerberg. Mr. Chairman, I'm going to run out of time.
Do you think your users really understand what they're giving you, all their personal information, and how you process it and monetize it? Do you think people really understand? Senator, I think people understand the basic terms. I mean, I think... In fact, I think a lot of people... It's been a couple of years since we talked about this. Do your user agreements still suck? I'm not sure how to answer that, Senator. Can you still hide a dead body in all that legalese where no one can find it? Senator, no. I'm not really sure what you mean, but I think people understand the basic deal: it's a free service that you use to connect with the people you care about.
If you share something with other people, other people will be able to see your information; it's inherent that if you're posting something, whether publicly or with a private group of people, you know you're posting it, so I think people understand that. But Mr. Zuckerberg, you're in the foothills of creepy. You track people who aren't even Facebook users. You track your own people, your own users, who are your product, even when they're not on Facebook. I mean, I'm going to land this plane pretty quickly, Mr. Chairman.
I mean, it's creepy, and I understand you make a lot of money doing it, but I wonder if our technology is greater than our humanity. Let me ask you this final question: Instagram is harmful to young people, isn't it? Senator, I disagree with that. That's not what the research shows on balance. That doesn't mean individual people don't have problems, or that there aren't things we need to do to provide the right tools for people, but across all the research we've done internally... I mean, this is the survey the senator cited previously.
You know, there are 12 or 15 different categories of harm that we asked teens about, whether they felt Instagram made them worse or better, and on all of them except the one that Senator Hawley cited, more teens said that using Instagram made the problems they face better rather than worse. Let me... We're just going to have to agree to disagree. If you believe, and I'm not saying it's intentional, but if you believe that Instagram is not harming millions of our young people, particularly young teenagers, particularly young women, you shouldn't be driving. It is. Thank you. Senator Butler. Thank you, Mr.
Chairman, and thank you to our panelists who have come to have an important conversation with us. Most importantly, I want to thank the families who have come forward to continue to be remarkable champions for their children and their loved ones. In particular, two families from California that I was just able to talk with, the families of Sammy Chapman of Los Angeles and Daniel Puerta of Santa Clarita, are here today and doing incredible work, not only protecting the memory and legacy of their children, but doing work that is going to protect my nine-year-old, and that is, in fact, the reason we are here.
There are a couple of questions I want to ask a few of you; let me start with a question for each of you. Mr. Citron, have you ever sat down with a family and talked about their experience and what they need from your product? Yes or no. Yes, I have talked with parents about how we can build tools to help them. Mr. Spiegel, have you sat down with families and young people to talk about your products and what they need from your product? Yes, Senator. Mr. Chew? Yes, I did, two weeks ago, for example. I don't want to know what you did to prepare for this hearing, Mr.
Chew; I just want to know, in terms of how you've designed the product you are building. Mr. Zuckerberg, have you sat down with parents and young people to talk about how you design your product for your consumers? Yes; over the years I've had many conversations with parents. You know, that's interesting, Mr. Zuckerberg, because we talked about this last night and you gave me a very different answer. I asked you this same question. Well, what I told you was that I didn't know what specific processes our company had. No, Mr. Zuckerberg, you told me that
you had not. If so, I must have misspoken. I want to give you the room to clarify, Mr. Zuckerberg, but I asked you this very question. I asked everyone this question, and you gave me a very different answer when we spoke, but I'm not going to dwell on it. Several of you have talked about... Sorry. Ms. Yaccarino, as the new leader of X, have you spoken directly to parents and young people about your product design? Yes, I have; I have spoken with them about behavioral patterns, because less than 1% of our users are in that age group, but yes, I have spoken with them. Thank you, ma'am. Mr.
Spiegel, there are several parents whose children were able to access illegal drugs on your platform. What do you say to those parents? Well, Senator, we are devastated that we could not... To the parents: what do you say to those parents, Mr. Spiegel? I am so sorry that we were not able to prevent these tragedies. We work very hard to block all drug-related search terms on our platform; we proactively search for and detect drug-related content, remove it from our platform, preserve it as evidence, and refer it to law enforcement for action. We have worked with nonprofit organizations and with families on education campaigns, because the scale of the fentanyl epidemic is extraordinary; more than 100,000 people lost their lives last year, and we believe people need to know that one pill can kill. That campaign was viewed more than 260 million times on Snapchat. Mr.
Chairman, there are two parents in this room whose children, 16 years old, were able to get those pills from Snapchat. I know there are statistics, and I know there are good efforts; none of those efforts kept their children from getting access to those drugs on your platform. As a California company, I've talked with all of you about what it means to be a good neighbor and what California families and American families should expect from you. You owe them more than a set of statistics, and I look forward to all of you supporting every piece of this legislation to keep our children safe. Mr.
Zuckerberg, I want to come back to you. I talked with you about being the parent of a young child who doesn't have a phone and isn't on social media at all, and one of the things that worries me deeply, as the parent of a young Black child, is the use of filters on your platform that suggest to the young women who use it that they are not good enough as they are. I want to ask more specifically, and refer to some unredacted court documents, which reveal that your own researchers concluded that these plastic surgery-mimicking face filters negatively impact young people's mental health and well-being. So why should we believe
that you are going to do more to protect young women and girls when you give them the tools to affirm the self-hate that gets spewed on your platforms? Why should we believe you are committed to doing anything more to keep our children safe? Senator, there's a lot to unpack there. There are tools to express yourself in different ways, and people use face filters and different tools to create media, photos, and videos that are fun or interesting across many different products. Are plastic-surgery filters good tools for expressing creativity, Senator? I'm not talking about skin-lightening tools.
Are those tools for expressing creativity? That is what I am asking directly. Senator, I'm not speaking to any specific one; I think the ability to filter and edit images is generally a useful expression tool. As to that specifically, I'm not familiar with the study you're referring to, but we made it so that we don't recommend this type of content to teens. I made no reference to a study; I referred to court documents that reveal your knowledge of the impact of these types of filters on young people in general, and girls in particular. I disagree with that characterization. I believe those documents have been mischaracterized.
I haven't seen any document that says that. Okay, Mr. Zuckerberg, my time is up. I hope you hear what is being offered and are ready to step up and do better. I know this Senate committee is going to do our job to hold you accountable. Thank you, Mr. Chairman. Senator Tillis. Thank you, Mr. Chairman, and thank you all for being here. I don't feel like I'm going to have the opportunity to ask many questions, so I'll reserve the right to send some for the record. But I have been in the Senate for nine years, and I have heard hearings like this before; I have heard horrible stories about people who have died, committed suicide, or been embarrassed. Every year we have an annual flogging, and what has materially happened in the last nine years?
Here is a question: are any of you involved in an industry consortium trying to make this fundamentally safe across all platforms? Yes or no. Mr. Zuckerberg? There are a variety of organizations we participate in. And which organizations do you participate in? Does anyone here not participate in an industry effort? I really think it would be immoral for any of you to treat safety as a strategic advantage, to keep proprietary something that would protect all of these platforms. I understand anyone would say they want users on their platform because theirs is the most secure and the others haven't figured out the secret, but I want you as an industry to recognize that this is an existential threat to all of you if you don't get it right. I mean, you have to protect your platforms, you have to deal with this; don't you have an inherent mandate to do this? Because it seems to me that if you don't, you will cease to exist.
I mean, we could regulate you out of business if we wanted to, and the reason I say that, and it may sound like a criticism, it's not a criticism, is that I think we have to understand there should be an inherent motivation for you to get this right, or Congress will make a decision that could potentially put you out of business. Here is why I'm worried, though: I went online while listening intently to the other members talking and found a dozen different platforms outside the United States, 10 of which are in China and two of which are in Russia.
Their average daily subscriptions or active members run into the billions. People say the Chinese version of TikTok can't be accessed here; I did a quick search on my favorite search engine to find out exactly how I could get an account on that platform today. And the other thing we need to keep in mind: I come from technology. I could figure out, ladies and gentlemen, how to influence your kid without them ever being on a social media platform. I can send random text messages, get a bite, and then find an email address and get compromising information. It's horrible to hear some of these stories, and I've shared them;
I've had these stories in my hometown in North Carolina. But if we just come here and make a point today and don't start focusing on making a difference, which requires people to stop yelling and start listening and to pass legislation, the bad actors are simply going to move offshore. I have another question for all of you: approximately how many people, and if you don't know the exact numbers that's okay, approximately how many people do you have watching these horrible images 24 hours a day and filtering them? Just go down the line quickly. It's the majority of the 40,000 people who work on safety and security. And again, we have 2,300 people around the world.
We have 40,000 trust and safety professionals around the world. We have approximately 2,000 people dedicated to trust, safety, and content moderation. Our platform is much smaller than these; we have hundreds of people, and looking at content is about half of our work. I have mentioned that these people have a horrible job; many of them have to seek counseling for the things they see. We have evil people out there, and we are not going to solve this by yelling at or talking past each other. We are going to solve it by all of you being at the table and, I hope, getting behind a lot of good bills, like the one I hope Senator Blackburn brings up when she gets a chance to talk. But folks, if you're not at the table securing these platforms, you're going to be on the menu. And the reason I don't want that outcome is that if we ultimately destroy your ability to create value and put you out of business, the evil people will find another way to get to these children. And I have to admit, I don't think my mom is watching this, but there is something good here we can't ignore. My mom, who lives in Nashville, Tennessee, and I talked yesterday about a Facebook post she made a couple of days ago.
Let's not lose sight of the thing that connects my 92-year-old mother with her grandchildren and great-grandchildren, and that allows a child who may feel out of place at school to join a group of people and interact with them. Let's not rule out the good because we have not all focused together on eradicating the bad. Now, I guarantee you I could go through some of your governing documents and find a reason to blame each of you for not putting enough emphasis on this, and I think you should, but at the end of the day I find it hard to believe that any of you started these businesses, some of you in your college dorm rooms, for the purpose of creating the evil that is being perpetrated on your platforms. I hope that every waking hour you are doing everything you can to reduce it; you will not be able to eliminate it. And I hope some of the entrepreneurial young technology people out there today will go to parents and say: ladies and gentlemen, your children have a deadly weapon, a potentially deadly weapon, whether it's a phone or a tablet, and you have to secure it. You can't
assume they are going to be honest and say they are 16 when they are really 12. We all have to recognize that we have a role to play, and you are at the tip of the spear. So I hope we can get to a point where we are moving these bills. If you have a problem with them, state your problem and let's fix it; no is not an answer. And know that I want America to be the beacon for innovation and the beacon for safety, and to prevent people from using other options, which have existed as long as the internet has, to exploit people. Count on me as someone who will try to help. Thank you, Mr.
Chairman. Thank you, Senator Tillis. Senator Ossoff is next. Thank you, Mr. Chairman, and thank you to our witnesses today. Mr. Zuckerberg, I want to start by asking a simple question: do you want kids to use your platform more or less? Well, we do not want people under 13 using it. Do you want teenagers 13 and up to use your platform more or less? Well, we would like to build a product that is useful and that people want to use more. My time is going to be limited, so do you want them to use it more or less?
Teens aged 13 to 17: do you want them to use Meta products more or less? I would like our products to be useful enough that they want to use them more. You want them to use them more. I think we have one of the fundamental challenges here. In fact, you have a fiduciary obligation, do you not, to try to get kids to use your platform more? It depends on how you define that; we obviously are a business. Sir, Mr. Zuckerberg, it's just that our time is limited. Is it not self-evident that you have a fiduciary obligation to get your users, including those under 18, to use and engage with your platform more? Over the long term, yes, but in the short term we often take actions that cut against that.
We made a change to show fewer videos on the platform, which reduced the amount of time spent by more than 50 million hours. But if your shareholders ask you, Mark, and I wouldn't speak to you that way, Mr. Zuckerberg, but your shareholders may be on a first-name basis with you: Mark, are you trying to get kids to use Meta products more or less? I would say that over the long term we are trying to... Let's look at the 10-K you file with the SEC. A few things
I want to point out. Here are some quotes, and this is a filing that you sign, correct? Yes. Quote: our financial performance has been and will continue to be significantly determined by our success in retaining and engaging active users. Here is another quote: if our users decrease their level of engagement with our products, our revenue, financial results, and business may be significantly harmed. Here is another: we believe that some users, particularly younger users, are aware of and actively engaging with other products and services similar to, or as a substitute for, ours, and in the event that users increasingly engage with other products and services, we may experience a decline in use and engagement in key demographics or more broadly, in which case our business would likely be harmed. You, as CEO, have an obligation to encourage your team to get kids to use your platform more.
Senator, I... Is it not obvious that you have a fiduciary obligation to your shareholders to get kids to use your platform more? I think what you're saying is not intuitive; the direction to the team is to make the products as useful as possible. We don't give the team that runs the Instagram feed or the Facebook feed a goal of increasing the amount of time people spend. Yes, but you don't dispute, and your 10-K makes clear, that you want your users to engage more and use the platform more, and I think this gets to the root of the challenge, because it is the overwhelming view of the public, certainly in my home state of Georgia, and we've had some discussions about the underlying science, that these platforms are harmful to children.
I mean, you're familiar, and not just with your platform but with social media in general, with the Surgeon General's 2023 report on the impact of social media on children's mental health, which cited evidence that children who spend more than three hours a day on social media have double the risk of poor mental health outcomes, including depression and anxiety. Are you familiar with the Surgeon General's report and the underlying study? I read the report, yes. Do you dispute it? No, but I think it's important to characterize it correctly. I think what it was pointing out is that there seems to be a correlation, and obviously the mental health issue is very important, so it is something that needs to be studied.
The thing is, everyone knows there is a correlation; everyone knows that kids who spend too much time on your platforms are at risk, and it's not just mental health issues. Let me ask you another question: is your platform safe for kids? I believe it is, but there is an important difference between correlation and causation. We are not going to be able to get anywhere. We want to work productively, openly, honestly, and collaboratively with the private sector to pass legislation that protects Americans, that protects American children above all, and that allows businesses to thrive in this country.
If we don't start with an open, honest, candid, and realistic assessment of the problems, we can't do that. The first point is that you want kids to use your platform more; in fact, you have an obligation to do so. But if you are not willing to acknowledge that it is a dangerous place for children... The internet is a dangerous place for children, not just your platform. Isn't the internet a dangerous place for children? I think it can be. Yes. There are wonderful things people can do, and there are harms we have to work on too. It's a dangerous place for children; there are families here who have lost their children.