
JUST IN: Mark Zuckerberg, Social Media CEOs Face Epic Grilling By Senate Judiciary Committee | Full

Mar 08, 2024
In 2013, the National Center for Missing and Exploited Children, known as NCMEC, received approximately 1,380 CyberTipline reports per day. By 2023, just 10 years later, the number had increased to 100,000 reports per day; that is, 100,000 daily reports of child sexual abuse material, also known as CSAM. In recent years we have also seen an explosion in so-called financial sextortion, in which a predator uses a fake social media account to trick a minor into sending explicit photos or videos and then threatens to post them unless the victim sends money. In 2021, NCMEC received a total of 139 reports of financial sextortion. In 2023, through the end of October alone, that number skyrocketed to more than 22,000. More than a dozen children have died by suicide after becoming victims of this disturbing crime. The growth in child sexual exploitation is driven by one thing: changes in technology. In 1996, the world's best-selling cell phone was the Motorola StarTAC. Although it was innovative at the time, the clamshell-style phone was not much different from a traditional telephone: it allowed users to make and receive calls, and even receive text messages, but that was it. Today, smartphones are in the pockets of seemingly every man, woman, and teenager on the planet. Like the StarTAC, today's smartphones let users make and receive calls and text messages, but they can also send and receive photos and videos, support livestreaming, and offer countless applications at the touch of a finger.
That same smartphone that can entertain and inform can also become a dead end where your children's lives are damaged and destroyed. These applications have changed the way we live, work, and play. But as research has detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children. Their carefully crafted algorithms may be a more powerful force in the lives of our children than even the most well-intentioned parent. Discord has been used to groom and abduct children. Instagram helped connect and promote a network of pedophiles. Snapchat's disappearing messages have been co-opted by criminals who financially sextort young victims.
TikTok has become a platform of choice for predators to access, engage, and groom children for abuse, and the prevalence of CSAM on X has grown as the company has gutted its trust and safety workforce. Today, we will hear from the CEOs of those companies. It is not just that technology companies have contributed to this crisis; they are responsible for many of the dangers our children face online. Their design choices, their failures to adequately invest in trust and safety, and their constant pursuit of engagement and profit over basic safety have put our children and grandchildren at risk. Coincidentally, several of these companies implemented common-sense child safety improvements within the last week, just days before their CEOs would have to justify their lack of action before this committee.
But the tech industry alone is not to blame for the situation we are in; those of us in Congress need to look in the mirror. In 1996, the same year the Motorola StarTAC was flying off the shelves and years before social media went mainstream, we passed Section 230 of the Communications Decency Act. This law immunized the then-nascent internet platforms from liability for user-generated content. Interestingly, only one other industry in America enjoys similar immunity from civil liability; we'll leave that for another day. For the past 30 years, Section 230 has remained virtually unchanged, allowing Big Tech to grow into the most profitable industry in the history of capitalism without fear of liability for unsafe practices. That has to change. Over the last year, this committee unanimously reported five bills that would finally hold tech companies accountable for child sexual exploitation on their platforms.
Take a look at the composition and membership of the Senate Judiciary Committee and imagine, if you will, anything we could unanimously agree on. These five bills were. One of them, the STOP CSAM Act, would allow victims to sue online providers that promote or aid and abet online child sexual exploitation, or that host or store CSAM. This stand against online child sexual exploitation is bipartisan and absolutely necessary. Let this hearing be a call to action: we need to get kids' online safety legislation to the President's desk.
I now turn to the Ranking Member, Senator Graham. Thank you, Mr. Chairman. The Republicans will answer the call. All of us, every one of us, is ready to work with you and our Democratic colleagues on this committee to prove to the American people that while Washington is certainly broken, there is a ray of hope, and it lies here with your children. After years of working on this issue with you and others, I have come to conclude the following: social media companies, as they are currently designed and operated, are dangerous products. They are destroying lives and threatening democracy itself. These companies must be reined in, or the worst is yet to come.
Representative Guffey is a Republican from South Carolina, in the Rock Hill area. To all the victims who came and showed us photos of your loved ones: don't give up. It is working. You are making a difference. Through you, we will get to where we need to go, so that other people don't have to hold up a photo of their family member. The damage to your families has already been done. Hopefully we can take your pain and turn it into something positive, so that no one else has to hold up a sign. Gavin Jr. got onto Instagram and was contacted by a group in Nigeria posing as a young girl pretending to be his girlfriend, and, as things go at that stage of life, he sent some compromising sexual photos. It turned out she was part of an extortion ring in Nigeria that threatened the young man: give us money or we will expose these photos. He gave them money, but it wasn't enough; they kept threatening, and he took his own life. They then threatened Mr. Guffey and another son. These people are bastards by any known definition. Mr. Zuckerberg, you and the companies before us, I know you don't mean it to be so, but you have blood on your hands. You have a product that is killing people. When we had cigarettes killing people, we did something about it, maybe not enough. You want to talk about guns, we have the ATF. There is nothing here. Nobody can do anything about it; they can't sue you right now. Senators Blumenthal and Blackburn, who have been like the dynamic duo here, found emails from your company warning you about this stuff, and you decided not to hire the 45 people who could do a better job policing this. So the bottom line is you can't be sued; you should be able to be, and these emails would be great for punitive damages. But the courthouse door is shut to every abused American. Of all the companies before me, of all the people in America we could give blanket liability protection to, this would be the last group I would pick.
Now is the time to repeal Section 230. This committee is made up of the most ideologically diverse people you can find, and we have come together, thanks to your leadership, Mr. Chairman, to pass five bills that address the problem of child exploitation. I'll talk about them in depth in a moment. The bottom line is that all of these bills have met the same fate: they go nowhere. They leave the committee and they die. Now, there is another approach. What do we do with dangerous products? We either allow lawsuits, pass statutory protections for consumers, or set up some kind of commission to regulate the industry in question and take away its license, if it has one. None of that exists here. We live in an America in 2024 where there is no regulatory body dealing with the largest, most profitable companies in the history of the world. They cannot be sued, and there is not a single law on the books that is meaningful in terms of protecting the American consumer. Other than that, we're in a good spot. So here's what I think is going to happen.
I think after this hearing today, we're going to put a lot of pressure on our Democratic and Republican Senate leaders to let these bills come to the floor for a vote, and I'm going to come to the floor in a couple of weeks to make a unanimous consent request: take up the STOP CSAM Act, take up the EARN IT Act, take up your bill, take up all the bills, and you can come and object. I'm going to give you a chance to be famous. Now, Elizabeth Warren and Lindsey Graham have almost nothing in common, and I promised her I would say publicly that the only thing worse than me doing a bill with Elizabeth Warren is her doing a bill with me. We part ways in some respects, but Elizabeth and I see an abuse here that needs to be dealt with. Senator Durbin and I have different political philosophies, but I appreciate what you have done on this committee; you have been a great partner. To all my Democratic colleagues, thank you very much. To my Republican colleagues, thank you all very much. Save the applause for when we get a result; right now it's all talk. But one day it will come, if we keep pushing for the right answer for the American people.
What is that answer? Accountability. Now, these products have an upside; they have enriched our lives in many ways. Mr. Zuckerberg, you created a product I use. The idea, I think, when it first came to you, was that people could talk to their friends and family and share their lives, that you could have a place to tell your friends and family about the good things happening in your life. I use it; we all use it. There are upsides here. But the dark side has not been dealt with, and it is now time to deal with the dark side, because people have taken your idea and turned it into a nightmare for the American people, and a nightmare for the world at large.
TikTok: we had a big discussion about whether Larry Ellison at Oracle could protect American data from Chinese communist influence, but TikTok's representative in Israel left the company because TikTok is being used in a way that would basically destroy the Jewish state. This is not just about individuals. I worry that in 2024 our democracy will be attacked again through these platforms by foreign actors. We are exposed, and AI is just getting started. So, to my colleagues: we are here for a reason. This committee has a history of being tough, but also of getting done the things that must be done. This committee has risen to the occasion. There is more we can do, but as members of this committee, let us insist that our colleagues rise to the occasion.
Let's also make sure that in the 118th Congress we have votes that will fix this problem. At the end of the day, all you can do is cast your vote, but you can push the system to require others to cast theirs. Mr. Chairman, I will continue to work with you and all members of this committee to get a day of reckoning on the floor of the United States Senate. Thank you. Thank you, Senator Graham. Today we welcome five witnesses, whom I will now introduce: Jason Citron, the CEO of Discord Inc.; Mark Zuckerberg, the founder and CEO of Meta; Evan Spiegel, the co-founder and CEO of Snap Inc.; Shou Chew, the CEO of TikTok; and Linda Yaccarino, the CEO of X Corp., formerly known as Twitter.
I will note for the record that Mr. Zuckerberg and Mr. Chew are appearing voluntarily. I am disappointed that our other witnesses did not offer the same degree of cooperation. Mr. Citron, Mr. Spiegel, and Ms. Yaccarino are here under subpoena, and Mr. Citron only accepted service of his subpoena after U.S. Marshals were dispatched to Discord's headquarters at taxpayer expense. I hope this is not a sign of your commitment, or lack of commitment, to addressing the serious issue before us. After I swear in the witnesses, each witness will have five minutes to make an opening statement; then senators will ask questions in an initial round of seven minutes each.
I expect to take a short break at some point during questioning to allow the witnesses to stretch their legs. If anyone needs a break at any point, please let my staff know. Before turning to the witnesses, I would also like to take a moment to acknowledge that this hearing has attracted a great deal of attention. As we anticipated, we have a large audience, the largest I have seen in this room. I want to make clear that, as at other Judiciary Committee hearings, we expect people to conduct themselves appropriately. I know there is a lot of emotion in this room, for good reason, but I ask that you follow the traditions of the committee: that means no standing, shouting, chanting, or applauding.
Interruptions of the witnesses will not be tolerated, and anyone who disrupts the hearing will be asked to leave. The witnesses are here today to address a serious issue, and we want to hear what they have to say. I thank you for your cooperation. Would all the witnesses please rise to take the oath? Do you affirm that the testimony you are about to give before this committee will be the truth, the whole truth, and nothing but the truth, so help you God? Let the record reflect that all of the witnesses have answered in the affirmative. Mr. Citron, please proceed with your opening statement. Good morning. My name is Jason Citron, and I am the co-founder and CEO of Discord.
We are an American company with around 800 employees who live and work in 33 states. Today, Discord has grown to more than 150 million monthly active users. Discord is a communications platform where friends gather and talk online about shared interests, from fantasy sports to writing music to video games. I have been playing video games since I was five years old, and as a kid, that is how I had fun and found friendship. Many of my fondest memories are of playing video games with friends. We built Discord so that anyone could build friendships playing video games, Minecraft and everything in between. Gaming has always brought us together, and Discord makes that happen today.
Discord is one of many services that have revolutionized how we communicate with each other at different moments in our lives, alongside iMessage, Zoom, and Gmail, and in doing so enrich our lives, create communities, and accelerate commerce, healthcare, and education. As with all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes. All of us here on the panel today, and across the tech industry, have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals, both online and offline. Discord has a special responsibility to do so, because many of our users are young: more than 60% of our active users are between the ages of 13 and 24. That is why safety is built into everything we do.
It is essential to our mission and our business, and above all, this is deeply personal. I am a father of two. I want Discord to be a product they use and love, and I want them to be safe on Discord. I want them to be proud of me for helping bring this product to the world. That is why I am pleased to be here today to discuss the important topic of minors' online safety. My written testimony provides a comprehensive overview of our safety programs, but here are some examples of how we protect and empower young people. First, we have invested our money in safety. The tech sector has a reputation for larger companies buying smaller ones to grow user numbers and boost financial results, but the biggest acquisition we have made at Discord was a company called Sentropy. It didn't help us expand our market share or improve our bottom line; in fact, because it uses artificial intelligence to help us identify, ban, and report criminals and bad behavior, it has actually reduced our user count by getting rid of bad actors.
You have heard of end-to-end encryption, which blocks anyone, including the platform itself, from seeing users' communications. It is a feature on dozens of platforms, but not on Discord. That is a choice we have made. We do not believe we can fulfill our safety obligations if teens' text messages are fully encrypted, because encryption would block our ability to investigate a serious situation and, when appropriate, report to law enforcement. We have a zero-tolerance policy on child sexual abuse material, or CSAM. We scan images uploaded to Discord to detect and block the sharing of this abhorrent material. We have also built an innovative tool, Teen Safety Assist, that blocks explicit images and helps teens easily report unwanted conversations.
We have also developed a new semantic hashing technology for detecting novel forms of CSAM, called CLIP, and we are sharing this technology with other platforms through the Tech Coalition. Ultimately, we recognize that improving online safety requires all of us working together, which is why we partner with nonprofits, law enforcement, and our tech colleagues to stay at the forefront of protecting young people online. We want to be the platform that gives our users a better online experience, where they can build true connections and genuine friendships and have fun. Senators, I sincerely hope that today is the beginning of an ongoing dialogue that results in real improvements in online safety.
I look forward to your questions and to helping the committee learn more about Discord. Thank you. Thank you, Mr. Citron. Mr. Zuckerberg. Chairman Durbin, Ranking Member Graham, and members of the committee: every day, teens and young adults do amazing things on our services. They use our apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. Overall, teens tell us this is a positive part of their lives, but some face challenges online, so we work hard to provide parents and teens with support and controls to reduce potential harms.
Being a parent is one of the hardest jobs in the world. Technology gives us new ways to communicate with our kids and to feel connected to their lives, but it can also make parenting more complicated, and it is important to me that our services are positive for everyone who uses them. We are on the side of parents everywhere who are working hard to raise their kids. Over the past eight years, we have built more than 30 different tools, resources, and features so that parents can set time limits for their teens using our apps, see who they follow, or know if they report someone for bullying.
For teens, we have added nudges that remind them when they have been using Instagram for a while, or when it is getting late and they should go to sleep, as well as ways to hide words or people without those people finding out. We place special restrictions on teen accounts on Instagram: by default, accounts for people under 16 are set to private, have the most restrictive content settings, and cannot be messaged by adults they do not follow or by people they are not connected to. With so much of our lives spent on mobile devices and social media, it is important to look at the effects on teen mental health and well-being.
I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. A recent National Academies of Sciences report evaluated more than 300 studies and found, quote, that the research did not support the conclusion that social media causes changes in adolescent mental health at the population level, end quote. It also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore, and connect with others. Still, we will continue to monitor the research and use it to inform our roadmap.
Keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics, we have to evolve our defenses too. We work closely with law enforcement to find bad actors and help bring them to justice. But the difficult reality is that no matter how much we invest or how effective our tools are, there is always more to learn and more improvements to make. We remain ready to work with the members of this committee, the industry, and parents to make the internet safer for everyone. I am proud of the work our teams do to improve child safety, not just on our services but across the entire internet.
We have around 40,000 people overall working on safety and security, and we have invested more than $20 billion in this since 2016, including around $5 billion in the last year alone. We have many teams dedicated to child safety and teen well-being, and we lead the industry in many of the areas we are discussing today. We build technology to tackle the worst online risks and share it to help the whole industry get better, like Project Lantern, which helps companies share data about people who violate child safety rules, and we are founding members of Take It Down, a platform that helps young people prevent their nude images from being spread online.
We also go beyond legal requirements and use sophisticated technology to proactively discover abusive material, and as a result we find and report more inappropriate content than anyone else in the industry. As the National Center for Missing and Exploited Children put it this week, Meta goes, quote, above and beyond to make sure that there are no portions of their network where this type of activity occurs, end quote. I hope we can have a substantive discussion today that drives improvements across the industry, including legislation that delivers what parents say they want: a clear system for age verification and control over the apps their kids are using.
Three out of four parents want app store age verification, and four out of five want parental approval whenever teens download apps. We support this. Parents should have the final say on which apps are appropriate for their children and shouldn't have to upload their ID every time; that's what app stores are for. We also support setting industry standards on age-appropriate content and limiting the signals used for advertising to teens to age and location rather than behavior. At the end of the day, we want everyone who uses our services to have safe and positive experiences. Before I wrap up,
I want to recognize the families who are here today who have lost a loved one or lived through terrible things that no family should have to endure. These issues are important for every parent and every platform. I am committed to continuing to work in these areas, and I hope we can make progress today. Thank you. Thank you, Mr. Zuckerberg. Mr. Spiegel. Chairman Durbin, Ranking Member Graham, and members of the committee, thank you for convening this hearing and for pushing for important legislation to protect children online. I am Evan Spiegel, the co-founder and CEO of Snap. We created Snapchat, an online service used by more than 800 million people around the world to communicate with their friends and family.
I know many of you have been working to protect children online since before Snapchat was created, and we are grateful for your long-standing dedication to this cause and your willingness to work together to help keep our community safe. I want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a loved one. Words cannot begin to express the profound sorrow I feel that a service we designed to bring people happiness and joy has been abused to cause harm. I want to be clear that we understand our responsibility to keep our community safe.
I also want to recognize the many families who have worked to raise awareness of these issues, pushed for change, and collaborated with lawmakers on important legislation like the Cooper Davis Act, which can help save lives. I started building Snapchat with my co-founder Bobby Murphy when I was 20 years old. We designed Snapchat to solve some of the problems we experienced online as teenagers. We didn't have an alternative to social media, which meant images shared online were permanently public and subject to popularity metrics; that didn't feel great. We built Snapchat differently because we wanted a new way to communicate with our friends that was fast, fun, and private. A picture is worth a thousand words, so people communicate with images and videos on Snapchat. We don't have public likes or comments when you share your story with friends. Snapchat is private by default, meaning people must choose to add friends and choose who can contact them. When we created Snapchat, we chose to have the images and videos sent through our service delete by default. Unlike previous generations, who enjoyed the privacy that comes with unrecorded phone calls,
our generation has benefited from the ability to share moments through Snapchat that may not be picture-perfect but that convey emotion without permanence. Although Snapchat messages are deleted by default, we let everyone know that recipients can save images and videos. When we take action on illegal or potentially harmful content, we retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable. To help prevent the spread of harmful content on Snapchat, we approve the content that is recommended on our service using a combination of automated processes and human review. We apply our content rules consistently and fairly across all accounts.
We run samples of our enforcement actions through quality assurance to verify that we are getting it right. We also proactively scan for known child sexual abuse material, drug-related content, and other types of harmful content; we remove that content, deactivate and device-block offending accounts, preserve the evidence for law enforcement, and report certain content to the relevant authorities for further action. Last year we made 690,000 reports to the National Center for Missing and Exploited Children, leading to more than 1,000 arrests. We also removed 2.2 million pieces of drug-related content and blocked 75,000 associated accounts. Even with our strict privacy settings, content moderation efforts, proactive detection, and collaboration with law enforcement, bad things can still happen when people use online services. That is why we believe that people under the age of 13 are not ready to communicate on Snapchat. We strongly encourage parents to use the device-level parental controls on iPhone and Android.
We use them in our own household, and my wife approves every app our 13-year-old downloads. For parents who want more visibility and control, we built Family Center in Snapchat, where you can see who your teen is talking to, review their privacy settings, and set content limits. We have worked for years with members of this committee on legislation like the Kids Online Safety Act and the Cooper Davis Act, which we are proud to support. I want to encourage broader industry support for legislation that protects children online. No legislation is perfect, but some rules of the road are better than none.
Much of the work we do to protect the people who use our service would not be possible without the support of our partners across the industry, government, nonprofits, and NGOs, and in particular law enforcement and first responders who have dedicated their lives to helping keep people safe. I am deeply grateful for the extraordinary efforts across our country and around the world to prevent criminals from using online services to perpetrate their crimes. I feel an overwhelming sense of gratitude for the opportunities this country has provided me and my family, and a deep obligation to give back and to make a positive difference. I am grateful to be here today as part of this vitally important democratic process. Members of the committee, I give you my commitment that we will be part of the solution for online safety. We will be honest about our shortcomings, and we will work continuously to improve. Thank you, and I look forward to answering your questions. Thank you, Mr. Spiegel. Mr. Chew. Chairman Durbin, Ranking Member Graham, and members of the committee:
I appreciate the opportunity to appear before you today. My name is Shou Chew, and I am the CEO of TikTok, an online community of more than a billion people worldwide, including over 170 million Americans who use our app every month to create, share, and discover. Now, although the average age of users on TikTok in the U.S. is over 30, we recognize that special safeguards are required to protect minors, especially when it comes to combating all forms of CSAM. As a father of three young children myself, I know that the issues we are discussing today are horrific and every parent's nightmare.
I am proud of our efforts to address the threats young people face online, from our commitment to protection and our industry-leading policies to our use of innovative technology and our significant ongoing investments in trust and safety to achieve this goal. TikTok is vigilant about enforcing its policy that users must be 13 or older, and it offers an experience for teens that is much more restrictive than the one you and I would have as adults. We make careful product design choices to help make our app inhospitable to those seeking to harm teens. Let me give you a few examples of long-standing policies that are unique to TikTok; these are not things we did just last week.
First, direct messaging is not available to users under 16. Second, accounts for people under 16 are automatically set to private, along with their content; in addition, their content is not downloadable and will not be recommended to people they do not know. Third, every teen under 18 has a screen time limit automatically set to 60 minutes. And fourth, only people 18 and over can use our livestreaming feature. I am proud to say that TikTok was among the first to give parents the ability to supervise their teens on our app with our Family Pairing tools. This includes setting screen time limits and filtering content out of teens' feeds, among other things. We make these decisions after consulting with doctors and safety experts who understand the unique stages of adolescent development, to ensure that we have appropriate safeguards to prevent harm and minimize risk. Safety is one of the core priorities that defines TikTok under my leadership.
We currently have more than 40,000 trust and safety professionals working to protect our community globally, and we expect to invest more than two billion dollars in trust and safety efforts this year alone, with a significant portion of that in our U.S. operations. Our robust community guidelines strictly prohibit content or behavior that puts teens at risk of exploitation or other harm, and we vigorously enforce them. Our technology moderates all content uploaded to our app to help quickly identify potential CSAM and other material that violates our rules, and it automatically removes that content or escalates it to our safety professionals for further review.
We also moderate direct messages for CSAM and related material, and we use third-party tools like PhotoDNA to combat CSAM and prevent such content from being uploaded to our platform. We continually meet with parents, teachers, and teens; in fact, I sat down with a group just a few days ago. We use their insights to strengthen the protections on our platform, and we also work with leading groups like the Tech Coalition. The measures we are taking to protect teens are a critical part of our broader trust and safety work, as we continue our voluntary and unprecedented efforts to build a safe and secure data environment for U.S. users, to keep our platform free from outside manipulation, and to implement safeguards in our recommendation and content moderation tools. Keeping teens safe online requires collaboration as well as collective action. We share this committee's concern and commitment to protecting young people online, and we welcome the opportunity to work with you on legislation to achieve this goal. Our commitment is ongoing and unwavering, because there is no finish line when it comes to protecting teens.
Thank you for your time and consideration today. I am happy to answer your questions. Thank you, Mr. Chew. Ms. Yaccarino. Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X's work to protect... Ms. Yaccarino, could you check to see whether your microphone is on? My talk button is on. How is that? Thank you very much. Let me adjust my chair. Apologies; let me start again. Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X's work to protect the safety of minors online. Today's hearing is titled a crisis that demands immediate action. As a mother, this is personal, and I share the sense of urgency. X is part of this solution. While I only joined X in June of 2023, I have seen the measures this new company is taking to protect children. Fewer than 1% of X's U.S. users are minors, and X has made material changes to protect them. Our policy is clear: grooming and the sexual exploitation of minors are prohibited, as is identifying alleged victims of child sexual exploitation. We have also strengthened our enforcement with more tools and technology to prevent bad actors from distributing, searching for, and engaging with child sexual exploitation content. If such content is posted on X, we remove it, and now we also remove any account that engages with it, whether the content is real or computer-generated. Last year, 850,000 reports were sent to NCMEC, including our first fully automated reports; that is eight times more than Twitter reported in 2022. We have changed our priorities and restructured our trust and safety teams to remain strong and agile. We are building a Trust and Safety Center of Excellence in Austin, Texas, to bring more agents in-house and accelerate our impact, and we have applied to the Tech Coalition's Project Lantern to make further progress and have a greater impact across the industry.
We have also opened up our algorithms for increased transparency. We want America to lead this solution. X commends the Senate for passing the REPORT Act, and we support the SHIELD Act; it is time for a federal standard to criminalize the sharing of non-consensual intimate material. We need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and are not stepping up. X supports the STOP CSAM Act; the Kids Online Safety Act should continue to progress, and we will continue to engage with it to ensure that it protects free expression.
There are two additional areas that require everyone's attention. First, as the daughter of a police officer, I know law enforcement must have the critical resources to bring these bad actors to justice. Second, with artificial intelligence, offenders' tactics will only continue to grow more sophisticated and evolve; industry collaboration here is imperative. X believes that freedom of speech and platform safety can and must coexist. We agree that now is the time to act with urgency. Thank you; I look forward to answering your questions. Thank you very much, Ms. Yaccarino. We will now go to rounds of questions, seven minutes each, for the members. I would also like to note from your testimony, Ms. Yaccarino, that I believe you are the first social media company to publicly support the STOP CSAM Act. That is progress, my friends. Thank you for doing that.
I am still going to ask some probing questions, but let me get down to the bottom line. I am going to focus on my legislation, the STOP CSAM Act, which creates civil liability if you intentionally or knowingly host or store child sexual abuse material, or make child sexual abuse material available, and, secondly, if you promote or aid and abet an intentional or knowing violation of child sexual exploitation laws. Is there anyone here who believes you should not be held civilly liable for that type of conduct? Mr. Citron? Good morning, Chair. We very much believe that this content is disgusting, and there are many things about the STOP CSAM bill that I think are very encouraging. We are very supportive of adding more resources for the CyberTipline and modernizing it, along with giving NCMEC more resources, and we are very open to having conversations with you and your team to talk a little more about the details of the bill.
I am sure you would like to do that, but if you intentionally or knowingly host or store CSAM, I think you ought to at least be held civilly liable. I cannot imagine anyone disagreeing with that. Yeah, it is disgusting content. Which is certainly why we need you to support this legislation. Mr. Spiegel, let me tell you, I listened closely to your testimony here, and it has never been a secret that Snapchat was used to send sexually explicit images. In 2013, early in your company's history, you admitted this yourself in an interview. Do you remember that interview?
Senator, I don't recall the specific interview. You said that when you were first trying to get people onto the app, quote, you would go up to people and say, hey, you should try this application, you can send photos that disappear, and they would say, oh, for sexting. Do you remember that? Senator, when we first created the application, it was actually called Picaboo, and the idea was that the pictures would disappear. The feedback we got from people using the app was that they were actually using it to communicate, so we changed the name of the application to Snapchat, and we found that people were using it to talk visually. As early as 2017, law enforcement identified Snapchat as the sexual exploitation tool of choice for predators. The case of a 12-year-old girl, identified in court only as L.W., shows the danger: over two and a half years, a predator sexually groomed her, sending her sexually explicit images and videos through Snapchat.
The man admitted that he only used Snapchat with L.W., and not other platforms, because he knew the chats would disappear. Did you and everyone else at Snap really fail to see that the platform was the perfect tool for sexual predators? Senator, that behavior is disgusting and reprehensible. We provide reporting tools within the app so that people who are being harassed, or who know that inappropriate sexual content has been shared, can report it; in cases of harassment or sexual content, we typically respond to those reports within 15 minutes so that we can provide help. When L.W., the victim, sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act. Do you have any doubt that if Snap had faced the prospect of civil liability for facilitating sexual exploitation, the company would have implemented even better safeguards?
Senator, we already work extensively to proactively detect this type of behavior. We make it very difficult for predators to find teens on Snapchat: there are no public friend lists or public profile photos, and when we recommend friends for teens, we make sure they have several mutual friends in common before making that recommendation. We believe those safeguards are important to preventing predators from misusing our platform. Mr. Citron, according to Discord's website, a proactive and automated approach to safety applies only to servers with more than 200 members; smaller servers rely on server owners and community moderators to define and enforce norms of behavior.
So how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report one another for things like grooming, trading CSAM, or sextortion? Our goal is to remove all of that content from our platform, and ideally to prevent it from showing up in the first place, or to keep out the people who engage in these kinds of horrific activities. We deploy a wide array of techniques that work across every surface on Discord. I mentioned that we recently launched something called Teen Safety Assist, which works everywhere and is on by default for teen users. It acts like a buddy that lets them know if they are in a situation or talking to someone who may be inappropriate, so they can report that to us and block that user. Mr. Citron, if that were working, we would not be here today. Senator, this is an ongoing challenge for all of us; that is why we are here today. But 15% of our company is focused on trust and safety, of which this is one of our top issues; that is more people than we have working on marketing and promoting the company. So we take these issues very seriously, but we know it is an ongoing challenge, and I look forward to working with you and collaborating with our tech peers and nonprofits to improve our approach. I certainly hope so. Mr. Chew, your platform is among the most popular with children. Can you explain to us what, in particular, you are doing, and whether you have seen any evidence of CSAM on your platform? Yes, Senator. We have a strong commitment to investing in trust and safety, and as I said in my opening statement, I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals working on this issue.
We have built a specialized child safety team to help us identify specialized issues, horrific material like the kind you have mentioned. If we identify any of it on our platform, or proactively detect it, we remove it and report it to NCMEC and other authorities. Why, then, is TikTok allowing children to be exploited into performing commercialized sexual acts? Senator, I respectfully disagree with that characterization. Our livestreaming product is not for anyone under 18. We have taken action to identify anyone who violates that, and we remove them from using that service. At this point I am going to turn to my Ranking Member, Senator Graham.
Thank you. Mr. Citron, you said we need to start a discussion. To be honest with you, we have been having this discussion for a very long time. We need to get a result, not a discussion. Do you agree with that? Ranking Member, I agree; this is an issue we have also been very focused on since we started our company in 2015. Are you familiar with the EARN IT Act, written by myself and Senator Blumenthal? A little bit, yes. Okay, do you support it? Um, we... Yes or no? We are not prepared to support it today, but we believe that the... Do you support the STOP CSAM Act? We are not prepared to support it today, but we... Do you support the SHIELD Act? We believe that the CyberTipline... Yes or no? We believe that the CyberTipline and NCMEC... I'll take that as no. The Project Safe Childhood Act, do you support it? Um, we believe that... I'll take it as no. The REPORT Act,
do you support it? Ranking Member Graham, we look forward to having conversations with you and your team... I am hoping to pass a bill that solves the problem, not to have a conversation. Do you support removing Section 230 liability protections for social media companies? I believe that Section 230 needs to be updated; it is a very old law. Do you support repealing it so that people can sue if they believe they were harmed? I think Section 230, as written, while it has a lot of downsides, has enabled innovation on the internet, which I think has largely been a... Thank you very much. So here you go, folks: if you are waiting on these guys to solve the problem, we are going to die waiting.
Mr. Zuckerberg, I am trying to be respectful here. The representative from South Carolina, Mr. Guffey's son, got caught up in a sextortion ring in Nigeria, using Instagram. They shook him down, he paid money, it wasn't enough, and he killed himself, using Instagram. What would you like to say to him? It's terrible. I mean, no one should have to go through something like that. Do you think he should be allowed to sue you? I think they can sue us. Well, I think he should be able to, and he can't. So the bottom line here, folks, is that this committee is done talking. We passed five bills unanimously that, in their different ways... and look at who did this: Graham, Blumenthal, Durbin, Hawley, Klobuchar, Cornyn, Blackburn, and Ossoff. I mean, we found common ground here; that is astounding. And we have had hearing after hearing,
Mr. Chairman, and the bottom line is that I have come to conclude, gentlemen, that you are not going to support any of this. Ms. Yaccarino, how do you say your last name? Yaccarino. Do you support the EARN IT Act? We strongly support collaboration to raise industry practices... Do you support the EARN IT Act, yes or no? You don't need to say anything more. We look forward to supporting and continuing our conversations. Okay. The reason the EARN IT Act is important is that you can actually lose your liability protections when children are exploited and you did not follow best business practices.
That is the EARN IT Act: it means you have to earn your liability protection, rather than having it given to you no matter what you do. To the members of this committee: the time has come to make sure that the people holding up the signs can sue on behalf of their loved ones. Nothing will change until the courtroom door is opened to victims of social media. The $2 billion, Mr. Chew: what percentage is that of what you made last year? Senator, it is a significant and increasing investment. As a private company, we are not sharing... I mean, what percentage of your revenue is it?
Senator, we are not prepared to share our financials publicly. Well, I just think two billion dollars sounds like a lot, unless you make a hundred billion. So the point is, when you tell us you're going to spend $2 billion, great, but how much do you make? You know it's all about eyeballs; the goal is to get people's attention. And it's not just about kids; I'm talking about the broader damage being done. Do you realize, Mr. Chew, that your TikTok representative in Israel resigned yesterday? Yes, I'm aware. Okay, and he said: I resigned from TikTok.
We are living in a time in which our existence as Jews in Israel, and Israel itself, is under attack and in danger. Multiple screenshots taken from TikTok's internal employee chat platform, known as Lark, show TikTok trust and safety officers celebrating the barbaric acts of Hamas and other Iranian-backed terrorist groups, including the Houthis in Yemen. Senator, I need to make it very clear that pro-Hamas content and hate speech are not allowed on our... Why did he quit? Why did he quit? Senator, we do not allow any... Why did he quit? Do you know why he quit? We don't allow this; we will investigate... Answer my question: why did he quit?
I'm sure he had a good job. He quit a good job because he believes your platform is being used to help people who want to destroy the Jewish state, and I'm not saying you want that. Mr. Zuckerberg, I'm not saying that you as an individual want any of this harm. I am saying that the products you have created, with all their upside, have a dark side. Mr. Citron, I am tired of talking; I am tired of having discussions. We all know the answer here, and here is the ultimate answer: stand behind your products, go into American courtrooms and defend your practices, open up the courtroom door. Until that happens, nothing will change; until these people can be sued for the damage they are causing, nothing will change. I am a Republican who believes in free enterprise, but I also believe that every American who has been wronged has to have somebody to go to and complain to.
There is no commission to go to that can punish you. There is not one law on the books, because they oppose everything we do, and you can't sue them. That has to stop, folks. How do you expect the people in the audience to believe that we are going to help their families if we don't have some system, or combination of systems, to hold these people accountable? Because for all the upside, the dark side is too great to live with. We do not have to live this way as Americans. Thank you, Senator Graham. Senator Klobuchar is next. She has been quite a leader on this subject for a long time, on the SHIELD bill and, with Senator Cornyn, on the revenge porn legislation. Senator Klobuchar. Thank you very much, Chairman Durbin, and thank you, Ranking Member Graham, for those words. I couldn't agree more. For too long, we have been seeing social media companies turn a blind eye as kids have joined these platforms in record numbers.
They have used algorithms that push harmful content because that content got popular. They provided a venue, perhaps not knowingly at first, for traffickers selling deadly drugs like fentanyl; the head of our DEA has said the platforms have basically been captured by the cartels in Mexico and China. So I strongly support, first of all, the STOP CSAM bill. I agree with Senator Graham that nothing is going to change unless we open up the courtroom doors. I think the time for all of this immunity is done, because money talks even louder than we talk up here. Two of the five bills, as noted, are bills of mine with Senator Cornyn. One actually passed the Senate but is awaiting action in the House; the other is the SHIELD Act, and I appreciate X's support of that bill. This is about revenge porn. The FBI director testified before this committee that there were over 20 suicides of kids attributed to online revenge porn in just the last year. For those parents and those families, this is about their own child, but it is also about making sure this does not happen to other children.
I know this because I have talked to these parents, parents like Bridget Norring of Hastings, Minnesota, who is here today. Bridget lost her teenage son after he took a fentanyl-laced pill that he bought over the internet. Amy Neville is also here; her son Alexander got a pill on a platform. He was only 14 when he died after taking a pill he did not know was actually fentanyl. We are running a law enforcement campaign in Minnesota, One Pill Kills, going into schools with sheriffs and police. The way to stop this is, yes, at the border and at points of entry, but we also know that about 30% of the people who get fentanyl are getting it off the platforms. Meanwhile, social media platforms generated $11 billion in revenue in 2022 from advertising directed at children and teenagers, including nearly $2 billion in ad revenue derived from users age 12 and under. When a Boeing plane lost a door in mid-flight several weeks ago, nobody questioned the decision to ground a fleet of more than 700 planes. So why aren't we taking the same kind of decisive action on the danger of these platforms when we know these kids are dying? We have bills that have passed through this committee, a committee that is incredibly diverse when it comes to our political views, and they should go to the floor. We should finally do something about liability, and then we should turn to some of the other issues a number of us have worked on when it comes to app store charges, monopoly conduct, and self-preferencing. But I will stick with this today. Here is a fact: one third of the fentanyl cases investigated over a five-month period had direct ties to social media; that is from the DEA. Between 2012 and 2022, CyberTipline reports of online child sexual exploitation increased from 415,000 to more than 32 million, and, as I noted, at least 20 victims died by suicide in cases of sexual exploitation. So I am going to start with you, Mr. Citron.
My bill with Senator Cornyn, the SHIELD Act, includes a threat provision that would help protect and hold accountable those who threaten these young people. Predators get kids to send a photo, the kids think they have a new girlfriend or boyfriend, and then their life is ruined, or they think it will be, and they take their own lives. So can you tell me why you are not supporting the SHIELD Act? Senator, we think it is very important that teens have a safe experience on our platforms. The portion about strengthening law enforcement's ability to investigate crimes against children and hold bad actors accountable is something we are open to and supportive of.
We would very much like to have conversations with you; we are open to discussing further. We welcome legislation and regulation. This is a very important issue for our country, and we have been prioritizing safety. Thank you. I am much more interested in whether you support it, because there has been a lot of talk at these hearings, and popcorn thrown, and all of that, and I just want to get this done. I am so tired of this. It has been 28 years since the internet, and we have not passed any of these bills, because everyone double-talks. It is time to actually pass them, and the reason they have not passed is the power of your companies. So let's be really clear about this: what you say matters. Your words matter.
Mr. Chew, I am a co-sponsor of Chairman Durbin's STOP CSAM Act of 2023, along with Senator Hawley, who I believe is the lead Republican, which, among other things, empowers victims by making it easier for them to ask tech companies to remove the material and related imagery from their platforms. Why would you not support this bill? Senator, we largely support it. I think the spirit of it is very aligned with what we want to do. There are questions about implementation that I think companies like ours and some other groups have raised, and we look forward to asking them, and of course, if this legislation becomes law, we will comply with it.
Mr. Spiegel, I know we spoke ahead of time, and I appreciate your company's support of the Cooper Davis Act, finally a bill, with Senators Shaheen and Marshall, which will allow law enforcement to do more when it comes to fentanyl. I think you know what a problem this is. Devin, the Hastings teen I mentioned, whose mother is here, suffered from dental pain and migraines, so he bought what he thought was a Percocet; instead, he got a counterfeit pill laced with a lethal dose of fentanyl. As his mother, who is with us today, has said, all the hopes and dreams we as parents had for Devin were erased in the blink of an eye, and no mom should have to bury her kid. Could you talk about why you support the Cooper Davis Act? Senator, thank you. We strongly support the Cooper Davis Act, and we believe it will help law enforcement go after the cartels and get more drug dealers off the streets to save more lives. Okay. Are there others who support that bill on this panel?
No? Okay. Last, Mr. Zuckerberg: in 2021, The Wall Street Journal reported on internal Meta research documents that asked, why do we care about tweens, and answered their own question by citing Meta's internal emails: they are a valuable but untapped audience. At a Commerce hearing, and I am also on that committee, I asked Meta's head of global safety why children aged 10 to 12 are so valuable to Meta. She responded that we do not knowingly attempt to recruit people who are not old enough to use our apps. Well, when the state attorneys general, Democrat and Republican, brought their case, they said this statement was inaccurate.
Some examples: in 2021, Ms. Davis received an email from Instagram's research director saying that Instagram is investing in experiments targeting young users, roughly 10 to 12 years old. In a February 2021 instant message, one of your employees wrote that Meta is working to recruit Gen Alpha before they reach their teens. A 2018 email that circulated inside Meta says that children under 13 will be critical to increasing the rate of acquisition when users turn 13. Square that with what I heard in that testimony at the Commerce hearing, that they were not being targeted. And then I simply ask again, as the other witnesses were asked:
does not support the STOP CSAM Act or the SHIELD Act. I'm happy to talk about both. We had conversations internally about whether we should create a kids version of Instagram, like the kids versions of YouTube and other services. We never actually moved forward with that and we currently have no plans to do so, so I can't speak directly to the exact emails that you mentioned, but it sounds to me like there were deliberations about a project that people internally thought was important and that we did not end up moving forward with. Okay, but what about the bills? What are you going to say about the two bills?
In general, I want to say that my position on the bills is that I agree with the objective of all of them, there are most of the things that I agree with within them, there are specific things that I would probably do differently We also have our own legislative proposal for what we think would be most effective in terms of helping the Internet across the various companies, giving parents control over the experience, so I'm happy to go into detail on any of those, but in Ultimately, I mean, I think I'll say this again, well, I think these parents will tell you that things haven't worked out, just give control to the parents, they don't know what to do, it's very, very difficult and that's why We are coming up with other solutions that we believe. they're much more useful to law enforcement, but also this idea of ​​finally getting something done about accountability because I just think that with all the resources that you have, you could really do more than what you're doing or these parents wouldn't be sitting behind you now. same in this Senate hearing room, thank you Senator, talk about that or do you want G to come back later, go ahead, I don't think parents should carry ID or prove that they are the parents. of a child on every app your children use.
I think the right place to do this and a place where it would be really very easy for it to work is within the app stores, which, as I understand it, are already Apple and Google. At least Apple already requires parental consent when a child makes a payment with an app, so it should be pretty trivial to pass a law requiring them to make parents have control every time a child downloads an app and offers consent. of that and the research that we've done shows that the vast majority of parents want that and I think that's the kind of legislation in addition to some of the other ideas that you all have.
This would make this a lot easier for me, to be clear, I remember a mother telling me that with all these things she could maybe do, she can't understand that it's like a faucet overflowing in a sink and she's out there with a mop while your kids are getting addicted to more and more different apps and being exposed to material. We need to simplify this for parents so they can protect their children and I don't think this is the way to do it. I think the answer is what Senator Graham has been talking about, which is to open up the hallways of the courtroom, so that it's up to you to protect these parents and protect these children and then also pass some of these laws. that make things easier.
for law enforcement. Thank you, Senator Klobuchar. We are going to try to stick to the seven-minute rule; it didn't work very well, but we are going to try to give the other side additional time as well. Senator Cornyn. There is no question that your platforms are very popular, but we know that while here in the United States we have an open society and a free exchange of information, there are authoritarian governments and there are criminals who will use your platforms for selling drugs, for sextortion and things like that. Mr. Chew, I think your company is unique among those represented here today because of its ownership by ByteDance, a Chinese company, and I know there have been some steps that you have taken to
block data collected here in the United States. But the fact of the matter is that under Chinese law and China's national intelligence laws, all information accumulated by companies in the People's Republic of China must be shared with the Chinese intelligence services. The initial launch of TikTok, as I understand it, was in 2016; these efforts you made with Oracle under the so-called Project Texas to wall off US data were in 2021, and supposedly the data was completely blocked off in March of 2023. What happened to all the data that TikTok collected before that? Senator, thank you. On the data of American users: first, TikTok is owned by ByteDance, which is majority owned by global investors, and we have three Americans on a board of five.
You are right to point out that over the last three years we have spent billions of dollars building Project Texas, which is an unprecedented project in our industry, to firewall protected US data off from the rest of our staff. I am asking about all of the data that you collected before that effort. Yes, Senator, we have initiated a data deletion plan; I talked about this a year ago. We have completed the first phase of deleting data across our data centers outside of the Oracle Cloud infrastructure and we are beginning phase two, where we will not only delete data from the data centers but will hire a third party to verify that work, and then we will go into, for example, employees' work laptops to delete data as well. Was any of the data collected by TikTok before Project Texas shared with the Chinese government in accordance with the national intelligence laws of that country? Senator, the Chinese government has not asked us for any data and we have never provided it to them.
Your company is once again unique among those represented here today because it is currently being reviewed by the Committee on Foreign Investment in the United States. Is that correct? Senator, yes, there are ongoing discussions, and much of our Project Texas work is based on years of discussions with many agencies under the CFIUS umbrella. Well, CFIUS is specifically designed to review foreign investments in the United States for national security risks, right? Yes. And your company is currently being reviewed by this Treasury Department interagency committee for potential national security risks.
Senator, this review is about the Musical.ly acquisition, which was made many years ago. I mean, is this an informal conversation, or are you actually providing information to the Treasury Department about how your platform operates so it can evaluate a potential national security risk? Senator, it has been many years, across two administrations, and a lot of discussions about what our plans are and how our systems work; we have had a lot of robust discussions about a lot of details. I understand that 63% of teenagers use TikTok. Does that sound right? Senator, I can't verify that. We know that we are popular with many age groups.
The average age of our user base in the US today is over 30, but we are aware that we are popular. And you reside in Singapore with your family? Yes, I reside in Singapore and I work here in the United States as well. Do your children have access to TikTok in Singapore? Senator, if they lived in the United States, I would give them access to our under-13 experience; my children are under 13 years old. My question is, in Singapore, do they have access to TikTok or is it restricted by national law? We do not have the under-13 experience in Singapore; we have one in the US because we are considered a mixed-audience app there, and we created that experience in response to that. The Wall Street Journal article published yesterday directly contradicts what your company has stated publicly. According to the Journal, Project Texas employees say that US user data, including users' emails, dates of birth and IP addresses, continues to be shared with ByteDance staff, again owned by a Chinese company. Do you dispute that? Senator, there are many things about that article that are inaccurate. What it gets right is that this is a voluntary project that we built and spent billions of dollars on; there are thousands of employees involved and it is very difficult because it is unprecedented. Why is it important that the data collected from US users be stored in the United States? Senator, this was a project that we built in response to some of the concerns that were raised by members of this committee and others. And those were concerns that the Chinese Communist Party could access data stored in China in accordance with national intelligence laws? Senator, correct. We are not the only company that does business in China; we are not even the only company in this room that hires Chinese nationals. But to address some of these concerns we moved the data into the Oracle Cloud infrastructure, we built a 2,000-person team based here to oversee the management of the data, we separated it from the rest of the organization, and then we opened it up to third parties like Oracle, and we will bring in others to provide third-party validation.
This is an unprecedented effort. I think we are unique in taking even more steps to protect user data in the United States. Well, you have questioned the Wall Street Journal story published yesterday. Are you going to do some kind of investigation to see whether there is any truth to the allegations made in the article, or are you just going to dismiss them completely? We are not going to dismiss them, which is why we have ongoing security inspections not only by our own staff but also by third parties to make sure the system is rigorous and robust. No system that any one of us can build is perfect, but what we need to do is make sure we are always improving it and testing it against bad actors who may try to circumvent it, and if anyone violates our policies within our organization we will take disciplinary action against them. Thank you, Senator Cornyn. Senator Coons. Thank you, Chairman Durbin. First I would like to start by thanking all the families that are here today, all the parents who are here for a child that they have lost, all the families who are here because they love and worry about a child.
They have contacted each of us in our offices expressing their pain, their loss, their passion and their concern. The audience watching cannot see this, but the witnesses from the companies can see you: this room is packed as far as the eye can see, and as this hearing began many of you held up photographs of your beloved and lost children. I benefit from and participate in social media, as do many members of this committee and our nation and our world; a majority of people on Earth now participate in, and in many ways benefit from, one of the platforms that you have launched, lead or represent, and we have to recognize that there are some real positives to social media. They have transformed modern life, but they have also had enormous impacts on families and children in our nation, and there is a whole series of bills advocated by members of this committee that try to address illicit drug trafficking, trafficking in child sexual abuse material, and things facilitated on your platforms that can lead to self-harm or suicide. As we have heard from several of the leaders of this committee, the chairman and very talented and experienced senators, the framework we are looking at is consumer protection: when there is any new technology, we put regulations in place to make sure it is not too damaging. As my friend Senator Klobuchar pointed out, a door flew off a plane, no one was hurt, and yet the entire Boeing fleet of that type of plane was grounded and the appropriate federal agency did an immediate safety review.
I'm not going to point out the other laws that I think it is urgent that we pass and pass, but rather the central issue of transparency if you are a company that makes a product that is supposedly addictive. and harmful, one of the first things we look for is safety information that we try to provide to our constituents, to our consumers, warning labels that help them understand what the consequences of this product are and how to use it safely or not. As some of my colleagues have heard clearly, if you sell an addictive, defective and harmful product in this country in violation of regulations and warnings, you will be sued and what is distinctive about platforms as an industry is that most families that are here are here because there weren't enough warnings and they can't effectively sue you so let me dig in for a moment if I can because each of your companies voluntarily discloses information about content and security.
The investments you make and the actions you take. There was a question, I think from Senator Graham, earlier about TikTok: I believe, Mr. Chew, you said you invest $2 billion in trust and safety. My background memo says your global revenue is $85 billion. Mr. Zuckerberg, my background memo says that you are investing $5 billion in safety at Meta and that your annual revenue is on the order of $116 billion. So what matters, and you can hear some reactions from the parents in the audience, what matters are the relative numbers and the absolute numbers. You are data people; if there is anyone in this world who understands data, it is you. So I want to dig into whether these voluntary measures of disclosing content and harm are sufficient, because I would say that we are here in part because we lack better information.
How can policymakers know if the protections that you have attested to about new initiatives, initial programs, monitoring and removals are actually working, how can we meaningfully understand how big these problems are without measuring and reporting? data? Mr. Zuckerberg, your testimony referenced a National Academy of Sciences study that said at the population level there is no evidence about harm to mental health, well, it may not be at the population level, but I'm looking. a room full of hundreds of parents who have lost their children and our challenge is to take the data and make good decisions to protect families and children from harm, so let me ask you what your companies report or do not report and I will especially focus on its content. policy on self-harm and suicide and I'm just going to ask a series of yes or no questions and what I mean is: Do you, Mr.
Zuckerberg, sufficiently disclose your policies prohibiting content about suicide or self-harm? Do you provide an estimate of the total amount of content, not a percentage of the total, not a prevalence number, but the total amount of content on your platform that violates this policy, and do you report the total number of views that content promoting self-harm or suicide in violation of this policy gets on your platform? Yes, Senator, we pioneered quarterly reporting on our community standards enforcement across all of these different categories of harmful content. We focus on the prevalence that you mentioned, because what we focus on is the percentage of the content we take down that we identify. You're very talented; I have very little time left.
I'm trying to get an answer to a question: not as a percentage of the total, because remember it's a huge number so the percentage is small, but do you report the actual amount of content and the number of views self-harm content received? I believe we focus on prevalence. Right, you don't. Ms. Yaccarino, yes or no, do you report it or not? Senator, as a reminder, we have less than 1% of our users who are between the ages of 13 and 17. Do you report the absolute number of how many images and how many posts and accounts you have deleted? In 2023, yes, we deleted over a million posts related to mental health and self-harm. Mr. Chew, do you disclose the number of occurrences of this type of content and how many are viewed before they are removed? Senator, we disclose the number we removed in each violation category and how many of them were proactively removed before being reported. Mr. Spiegel? Yes, Senator, we disclose that. Mr. Citron? Yes, we do. I have three more questions I would love to have answered; since I do not have unlimited time I will submit them for the record. The most important point is that platforms need to disclose more about how the algorithms work, what the content does and what the consequences are, not at the aggregate level, not at the population level, but the real number of cases, so that we can understand the content. In closing, Mr.
Chairman, I have a bipartisan bill, the Platform Accountability and Transparency Act, co-sponsored by Senators Cornyn, Klobuchar and Blumenthal on this committee and Senator Cassidy and others. It is before the Commerce Committee, not this committee, but it would establish reasonable standards for disclosure and transparency to make sure we are doing our work based on data. Yes, there is understandably a lot of energy in this field, but if we are going to legislate responsibly around how content is managed on your platforms, we need better data. Are any of you willing to say now that you support this bill? Mr.
Chairman, let the record reflect a yawning silence from the leaders of the social media platforms. Thank you. Thank you, Senator Coons. We are on the first of two roll-call votes, so please understand that if some of the members leave and come back, it is not disrespect; they are doing their job. Senator Lee. Thank you, Mr. Chairman. Survivors of child sexual abuse are often, tragically, victimized and revictimized over and over again by having non-consensual images of themselves circulate on social media platforms. There is a NCMEC study that noted one CSAM case in which the material reappeared more than 490,000 times after it was reported, after it was reported. So we need tools to deal with this; we need, frankly, laws to require standards, so that this does not happen and so that we have a systematic way to get rid of these things, because there is literally no plausible justification, no way to defend this. One tool that I think would be particularly effective is a bill that I will introduce later today, and I invite all members of the committee to join me.
It is called the PROTECT Act. The PROTECT Act would, in pertinent part, require websites to verify the age of, and verify that they have received consent from, each and every person who appears on their site in pornographic images, and it also requires that platforms have meaningful processes for a person seeking to have images of themselves removed in a timely manner. Ms. Yaccarino, based on your understanding of existing law, what does it take for a person to have those images removed, say, on X? Senator Lee, thank you. What you are going to introduce, in terms of user consent and the entire ecosystem, sounds exactly like part of the philosophy of why we support the SHIELD Act, and no one should have to endure non-consensual image sharing online. Yes, and without laws in place, it is great any time a company, as you described with yours, wants to take those actions, but it can take a lot longer than it should, sometimes to the point where an image is shared 490,000 times after being reported to the authorities, and that is deeply disturbing. But yes, the PROTECT Act would work well as a complement to the SHIELD
Act. Mr. Zuckerberg, let's move on to you next. As you know, I have a strong concern for privacy and believe that one of the best protections for an individual's privacy online is end-to-end encryption. We also know that a lot of CSAM grooming and sharing happens in end-to-end encrypted systems. Tell me, does Meta allow youth accounts on your platforms to use encrypted messaging services within those apps? Sorry, Senator, what do you mean, under 18? We allow people under 18 to use WhatsApp and we allow it to be encrypted. Yes. Do you have a minimum age below which they are not allowed to use it?
Yeah, we don't allow people under 13. Okay. How about you, Mr. Citron, on Discord? Do you allow children to have accounts with access to encrypted messages? Discord cannot be used by children under 13, and we do not use end-to-end encryption for text messages. We think it is very important to be able to respond to law enforcement requests appropriately, and we are also working proactively on building technology: we are working with a non-profit called Thorn to build a grooming classifier to help keep our teens safe, so it can identify these conversations if they might be happening, so we can intervene and give those teens tools to get out of that situation or potentially even report those conversations and those people to the authorities. And encryption, as much as
it can be useful in other places, it can be harmful, especially if you are on a site where you know that children are being groomed and exploited. If you allow children to access an app that is enabled for end-to-end encryption, that can be problematic. Now, we'll come back to you in a moment. Mr. Zuckerberg, Instagram recently announced that it will restrict all teens' access to eating disorder material, material about suicidal ideation, and self-harm content, and that's great. What I'm trying to understand is why Instagram is restricting access to sexually explicit content only for teenagers 13 to 15 years old. Why not also restrict it for 16- and 17-year-olds?
Senator, my understanding is that we don't allow sexually explicit content on the service for people of any age. How is that going? You know, our prevalence metrics suggest that about 99% of the content that we remove we identify automatically using AI systems, so while they are not perfect, I think our efforts are leading in the industry. The other thing you asked about was self-harm content, which is what we recently restricted, and we made that change because the state of the science is shifting a bit. We used to believe that when people were thinking about self-harm it was important for them to be able to express it and get support, and now more of the thinking in the field is that it is simply better not to show that content at all, which is why we recently moved to restrict it so that it is not shown to those teenagers.
Okay. Is there a way for parents to make a request about what their kids can or cannot see? There are a lot of parental controls on your sites; I'm not sure whether those exist. I don't think we currently have a control over specific topics, but we do allow parents to control the time their children spend on the app, and a lot of this is based on monitoring and understanding what the teen's experience is and what they interact with. Mr. Citron, Discord allows pornography on its site. Now, 17% of minors who use Discord have reportedly had online sexual interactions on your platform, 17%, and 10% had those interactions with someone the minor believed was an adult. Do you restrict minors from accessing Discord servers that host pornographic material?
Senator, yes, we restrict minors from accessing content that is marked for adults. Discord also does not recommend content to people: Discord is a chat app, we do not have a feed or an algorithm that pushes content, so we allow adults to share content with other adults in spaces labeled for adults, and we do not allow teenagers to access that content. Okay, I see my time is up. Thank you. Welcome, everyone. We are here at this hearing because, as a collective, your platforms really suck at policing themselves. We hear about it here in Congress with the fentanyl and other drug trafficking facilitated through your platforms; we see it and hear it here in Congress with the harassment and bullying that takes place through your platforms; we see it and hear it here in Congress regarding child pornography, sexual exploitation and blackmail, and we are tired of it. It seems to me that there is a liability problem, because in my view these conditions continue to persist.
Section 230, which provides immunity from lawsuits, is a very important part of that problem. Look at where bullies have recently been brought to heel: whether it is Dominion finally getting justice against Fox News after a long campaign to discredit the election equipment manufacturer, or the fathers and mothers of the Sandy Hook victims finally getting justice against Infowars and its campaign to make people believe that the massacre of their children was a hoax, or, even more recently, a writer winning a very significant judgment against Donald Trump after years of intimidation and defamation. An honest courtroom has proven to be the place where these things get resolved, and I will describe just one case if I may.
It is called Doe v. Twitter. The plaintiff in that case was extorted in 2017 for sexually explicit photos and videos of himself when he was 13 to 14 years old. A compilation of multiple CSAM videos appeared on Twitter in 2019. A concerned citizen reported that video on December 25, 2019, on Christmas Day; Twitter took no action. The plaintiff, by then a minor in high school, found out about this video from a classmate in January 2020. You are a high school kid and suddenly this happens; that is a day that is hard to recover from. He eventually became suicidal. He and his parents contacted the authorities and Twitter to remove these videos on January 21 and again on January 22, 2020, and Twitter finally removed the video on January 30, 2020, once federal authorities got involved. It is a pretty nasty set of facts, and when the family sued Twitter for all those months of refusing to remove this boy's explicit video, Twitter invoked Section 230 and the district court ruled the claim was barred. There is nothing in that set of facts that tells me that Section 230 performed any public service in that regard. I would like to see very substantial adjustments to Section 230 so that the honest courtroom that brought relief and justice to E. Jean Carroll after months of defamation, that brought peace and justice to the parents of the Sandy Hook children after months of smears and intimidation by Infowars and Alex Jones, and that brought significant justice and an end to Fox News's smear campaign against a small business that was busy making election machines, is open to these families as well. My time is running out.
I guess Senator Cruz will be next, but I would like each of your companies to put in writing what exemptions from Section 230 protection you would be willing to accept, bearing in mind the factual situation in Doe v. Twitter, bearing in mind the enormous damage that was done to that young man and that family because of the lack of response from this enormous platform for months and months and months, and think again about what it is like to be a high school kid and have that material in the public domain while the company keeping it in the public domain reacts in such a disinterested way. Okay, will you put that in writing for me? One, two, three, four, five, yes. Done. Senator Cruz. Thank you, Mr.
Chairman. Social media is an incredibly powerful tool, but we are here because every parent I know, and I believe every parent in America, is terrified by the garbage that is directed at our children. I have two teenagers at home, and the phones they have are portals to predators, to cruelty, to bullying, to self-harm, and every one of your companies could do a lot more to prevent it. Mr. Zuckerberg, in June 2023 The Wall Street Journal reported that Instagram's recommendation systems were actively connecting pedophiles with accounts that advertised the sale of child sexual abuse material. In many cases those accounts appeared to be run by underage children, often using keywords and emojis to advertise illicit material; in other cases the accounts included indications that the victim was being sex trafficked. Now, I know that Instagram has a team that works to prevent the abuse and exploitation of children online, but what was particularly troubling about the Wall Street Journal exposé was the extent to which Instagram's own algorithm promoted the discoverability of victims for pedophiles searching for child abuse material.
In other words, this material was not just living in the dark corners of Instagram; Instagram was helping pedophiles find it by promoting graphic hashtags, including ones referencing preteen sex, to potential buyers. Instagram also displayed the following warning screen to people searching for child abuse material: these results may contain images of child sexual abuse, and then gave users two options, get resources or see results anyway. Mr. Zuckerberg, what the hell were you thinking? Well, Senator, the basic science behind this is that when people search for something that is problematic, it is often helpful, rather than just blocking it, to help direct them toward something that might be helpful in getting them help. I understand get resources, but in what sane universe is there a link for see results anyway?
Well, because we might be wrong. We try to trigger this warning when we think there is some chance that the results may be okay. You may be wrong? Let me ask you, how many times was this warning screen displayed? I don't know. But why don't you know? I don't know the answer to that off the top of my head. Well, you know what, Mr. Zuckerberg, it's interesting that you say you don't know, because I asked this question in June 2023 in an oversight letter, and your company declined to answer.
Will you commit right now to answering this question for this committee within five days, not the way lawyers write statements saying we are not going to respond? Will you tell us how many times this warning screen was shown, yes or no? Senator, I will personally look into it. Okay, so you refuse to answer that. Then I ask you this: how many times has an Instagram user who received this warning, that they may be viewing images of child sexual abuse, clicked see results anyway? I want to know that. Senator, I'm not sure whether we store that, but I will personally look into this and we will follow up afterwards. And what follow-up did Instagram do when a potential pedophile clicked, in effect, I'd like to view child sexual abuse material? What did you do next when that happened? Senator, I think an important part of the context here is that anything we think is child sexual abuse material
Mr. Zuckerberg, I asked you a question: what did you do next when someone clicked see results anyway? You said you might be wrong. Did anyone examine whether this was actually child sexual abuse material? Did anyone report that user? Did anyone go and try to protect that child? What did you do next? Senator, we remove everything we believe to be child sexual abuse material on the service. And did anyone check whether it was actually child sexual abuse material? Senator, I don't know the handling of every single search we tracked. But did you report the people who viewed it?
Senator, do you want me to answer your question? Answer the question I am asking: did you report the people who clicked see results anyway? That is probably one of the factors we use in reporting, and overall we have reported more people and made more reports like this to the National Center for Missing and Exploited Children than any other company in the industry; we proactively go out of our way on our services to do this, and I believe it is over 26 million reports, which is more than the entire rest of the industry combined. So, Mr. Zuckerberg, your company and all social media companies must do much more to protect children.
Okay, Mr. Chew, in the minutes I have remaining I want to turn to you. Are you familiar with China's 2017 National Intelligence Law, which states, quote, all organizations and citizens shall support, assist and cooperate with national intelligence efforts in accordance with the law and shall protect national intelligence work secrets they are aware of? Yes, I am familiar with this. TikTok is owned by ByteDance. Is ByteDance subject to that law? For the Chinese businesses that ByteDance owns, yes, they would be subject to it, but TikTok is not available in mainland China, and Senator, as we discussed in your office, we built Project Texas to put this out of reach. So ByteDance is subject to the law. Now, under this law that says they will protect national intelligence work secrets they are aware of, does it compel the people subject to the law to lie in order to protect those secrets? Is that correct? I cannot comment on that. Because they have to protect those secrets, no?
Senator, TikTok is not available in mainland China; we have moved the data into a US cloud. Which is controlled by ByteDance, which is subject to this law. Now, you said before, and I wrote this down, the Chinese government has not asked us for any data and we have never provided it. I am going to tell you, and I told you this when you and I met last week in my office, I don't believe you, and I will tell you the American people don't believe it either. If you look at what is on TikTok in China, you are marketing science and math videos to kids, educational videos, and you are limiting the amount of time kids can be on TikTok; in the United States you are promoting self-harm videos and anti-Israel propaganda to children.
Why is there such a dramatic difference? Senator, that is just not accurate. There is a difference between what children see in China and what children see here. Senator, TikTok is not available in China; it is a separate experience. What I'm saying is that you have a company that is essentially the same, except it promotes beneficial material instead of harmful material. That is not true. We have a lot of science and math content here on TikTok; there is plenty of it in the feed. Let me point out, let me point this out to you, Mr.
Chew. There was a report recently comparing hashtags on Instagram to hashtags on TikTok, which ones trend, and the differences were striking. For something like the Taylor Swift hashtag or the Trump hashtag, the researchers found about two Instagram posts for every one on TikTok; that is not a dramatic difference. That ratio jumps to 8 to 1 for the Uyghur hashtag, jumps to 30 to 1 for the Tibet hashtag, jumps to 57 to 1 for the Tiananmen Square hashtag, and jumps to 174 to 1 for the Hong Kong protest hashtag. Why is it that on Instagram people can post about the Hong Kong protests 174 times more often than on TikTok? What censorship is TikTok doing at the request of the Chinese government? None, Senator. That analysis is flawed, as has been found by other outside sources such as the Cato Institute. Fundamentally, a few things are going on here: not all videos carry hashtags, that is the first thing, and the second thing is that you cannot selectively pick a few words within a certain time window. Why the difference between Taylor Swift and Tiananmen Square? What happened in Tiananmen Square?
Senator, there was a massive outcry at that time, but what I am trying to say is that our users can come and post this freely. Why would there be no difference, or only a minimal difference, for Taylor Swift, but such a big difference for Tiananmen Square or Hong Kong? Senator, could you please conclude. Senator, our algorithm does not suppress any content based on, just answer the question, why is there a difference? As I said, I think this analysis is flawed; it cherry-picks a few words over certain periods. We have not been around as long. A difference of 174 to 1 for Hong Kong compared to Taylor Swift is dramatic.
Senator Blumire, Blumenthal, I'm sorry; I know you both, that was close enough. Thank you, Mr. Chairman. Mr. Zuckerberg, you know who Antigone Davis is, right? Yes. She is one of your top leaders; in September 2021 she was global head of safety, right? Yes. And you know she appeared before a subcommittee of the Commerce Committee that I chaired at that time, the consumer protection subcommittee, right? Yes. And she was testifying on behalf of Facebook, right? Meta, but yes, it was Facebook then, Meta now. And she told us, and I am quoting, that Facebook is committed to building better products for young people and doing everything possible to protect their safety, privacy and well-being on our platforms, and she also said that child safety is an area in which we are investing heavily.
End of quote. We now know that statement was not true. We know it from an internal email we obtained, an email written by Nick Clegg. You know who he is, correct? Yes. You named him president of global affairs, and he wrote you a memo that you received, correct? I can't see the email, but I will assume your characterization is right. It summarized Facebook's problems; he said, quote, we are not on track to succeed on our core well-being topics of problematic use, bullying and harassment, connections, and SSI, which stands for suicide and self-injury. He also said in another memo, quote, we need to do more and we are being held back by a lack of investment. That memo is dated August 28, just a few weeks before Antigone Davis's testimony, correct? I'm sorry, Senator, I'm not sure what the date of the testimony was. Well, those are the dates on Nick Clegg's emails to you, pleading for resources to support the narrative and meet current commitments.
Antigone Davis was making promises that Nick Clegg was trying to keep, and you rejected that request for 45 to 84 engineers to work on well-being and safety. We know you rejected it from another memo, from Nick Clegg's assistant, noting that Nick emailed Mark referring to that earlier email to emphasize his support for the package, but it lost out to other pressures and priorities. We have done a calculation that those potentially 84 engineers would have cost Meta about $50 million in a quarter in which it made $9.2 billion, and yet you did not meet that commitment in real terms and you rejected that request due to other pressures and priorities. That is an example of failing to act from your
own internal documents, and it is the reason we can no longer trust Meta, and frankly any of the other social media companies, to grade their own homework. The public, and in particular the parents in this room, know that we can no longer trust social media to provide the kinds of safeguards that children and parents deserve, and that is why passing the Kids Online Safety Act is so important. Mr. Zuckerberg, do you believe you have a constitutional right to lie to Congress? Senator, no, but, I mean, there is one thing I would like the opportunity to clarify. In a lawsuit filed by hundreds of parents, some in this very room, alleging that you made false and misleading statements about the safety of your platform for children, you argued, not in just one pleading but twice, in December and then in January, that you have a constitutional right to lie to Congress. Do you disavow that filing in court?
Senator, I'm not sure what filing you are referring to, but I testify honestly and truthfully, and I would like the opportunity to respond to the earlier materials you showed as well. I have a few more questions, so let me ask the others who are here, because I think it's important to put on the record who will support the Kids Online Safety Act, yes or no. Mr. Citron? There are parts of the act that we think are great. No, it's a yes or no question. I'm going to run out of time, so I guess the answer is no if you can't answer yes.
We strongly believe that a national privacy standard would be excellent. Mr. Spiegel? Senator, we strongly support the Kids Online Safety Act and have already implemented many of its core provisions. Thank you; I appreciate that support, along with the support of Microsoft. Mr. Chew? Senator, there are some things we can support in its current form. Do you support it, yes or no? We are aware that some groups have raised some concerns, and it is important to understand them. I'll take that as a no. Ms. Yaccarino? Senator, we support KOSA, and we will continue to make sure it accelerates and continues to offer a community for teenagers who are seeking that voice. Mr.
Zuckerberg? Senator, we support age-appropriate content standards, but, and you have suggested to some, Mr. Zuckerberg, that you support the Kids Online Safety Act; that is public, and I am just asking whether you will support it or not. These things have nuances. I think the basic spirit is right, I think the basic ideas in it are right, and there are some specific things I would probably do differently. Unfortunately, I don't think we can count on social media companies as a group, or big tech, to support this measure, and in the past we know that armies of lawyers and lobbyists have opposed it.
We are prepared for this fight, but I am very glad that we have parents here, because tomorrow we will have an advocacy day, and the people who really count, the people in this room who support this measure, will go to their representatives and senators, and their voices and faces will make a difference. Senator Schumer has committed to working with me to bring this bill to a vote, and then we will have real protections for children and parents online. Thank you, Mr. Chairman. Thank you, Senator Blumenthal. We have a vote going on, the first of two roll calls. Senator Cotton, have you voted? And Senator Hawley, you haven't voted yet; you're next. I don't know how long the vote will be open, but I'll turn it over to you. Thank you, Mr.
Chairman. Mr. Zuckerberg, let me start with you. Did I hear you say in your opening statement that there is no link between mental health and social media use? Senator, what I said is that I think it's important to look at the science. I know people talk about this widely as if it were already proven, and I think most of the scientific evidence does not support that. Well, let me remind you of some of the science from your own company. Instagram studied the effect of its platform on teenagers. Let me read you some quotes from The Wall Street Journal's reporting on this: the company's own researchers found that Instagram is harmful for a significant percentage of teenagers, especially teenage girls.
Here is a quote from your own study: quote, we make body image issues worse for one in three teen girls. Here is another quote: teens blame Instagram, this is your study, for increases in the rates of anxiety and depression; this reaction was unprompted and consistent across all groups. That is your study. Senator, we try to understand the feedback and how people feel about the services so we can improve. Wait a minute. Your own study says you make life worse for one in three teenage girls, that you increase anxiety and depression. That is what it says, and you are here testifying to us in public that there is no link. You have been doing this for years. For years you have come out in public and testified under oath that there is absolutely no link, your product is wonderful, full speed ahead, while internally you know full well that your product is a disaster for teenagers, and you keep doing what you are doing. Well, that's not true, that's not true. Let me show you some other facts that I know you know. Wait a minute, wait a minute. That's not a question, those are facts, Mr.
Zuckerberg. Those are not facts. Let me show you some more facts, some more data. Here is information from a whistleblower who appeared before the Senate and testified under oath in public; he worked for you, he was a senior executive, and this is what he found when he studied your products. For example, among girls between the ages of 13 and 15, 37% reported that they had been exposed to unwanted nudity on the platform in the last seven days; 24% said they had experienced unwanted sexual advances, that they had been propositioned, in the last seven days; 17% said they had encountered self-harm content in the last seven days.
Now, I know you are familiar with these statistics because he sent you an email laying it all out; we have a copy right here. My question is, who did you fire for this? Who did you fire? Senator, we study all of this because it is important and we want to improve our services. Well, you just told me a moment ago that you studied it but there was no link. Who did you fire? Yes, I told you, you mischaracterized it. Thirty-seven percent of teenage girls between 13 and 15 were exposed to unwanted nudity in one week on Instagram. You knew about it. Who did you fire?
Senator, that is why we are building all these tools. I don't think that's, who did you fire? I'm not going to answer that. Because you didn't fire anyone, right? You didn't take any meaningful action. It's appropriate to talk about decisions like, do you know who is sitting behind you? You have families from all over the country whose children have been seriously harmed or are gone, and you don't think it's appropriate to talk about the actions you took, the fact that you didn't fire anyone. Let me ask you this, let me ask you this: have you compensated any of the victims? I'm sorry? Have you compensated any of the victims?
Have you compensated these girls? I don't believe so. So why not? Don't you think they deserve some compensation for what your platform has done? Help with counseling services, help dealing with the problems that your services caused? Our job is to make sure we build tools to help keep people safe. Are you going to compensate them? Senator, our job, and what we take seriously, is to make sure we build industry-leading tools to find harmful content and take it off our services, and to build tools that empower parents. So you didn't take any action. You didn't fire anyone. You haven't compensated a single victim. Let me
You didn't fire anyone. You haven't compensated a single victim. Let me ask you this. Let me. I ask you this: there are families of victims here today. Have you apologized to the victims? I would like to do it now. Well, they are here. You're on National Television. Would you like to apologize now to the victims who have been harmed by your product? them the photos you would like to apologize for what you have done to these good people I am sorry for everything you have been through terrible, no one should have to go through the things that their families have suffered and that is why we invested a lot and we are going to continue doing industry-leading efforts to make sure no one has to go through the kinds of things their families have had to endure.
You know what, Mr. Zuckerberg, why shouldn't your company be sued? Why do you get to hide behind a liability shield so that you can't be held responsible? Shouldn't you be held personally responsible? Will you take personal responsibility? Senator, I think I've already answered this. Answer this. Will you take personal responsibility? I see it as my job, and our company's job, to build the best tools we can to keep our community safe. Well, you are failing at that. Senator, we are making industry-leading efforts; we build AI tools. Your product is killing people. Will you personally commit to compensating the victims? You are a billionaire. Will you commit to compensating the victims? Will you set up a compensation fund with your money? I think these are, these are, Senator, complicated. No, that is not a complicated question. Yes or no: will you set up a compensation fund for the victims with your money, the money you made from the families sitting behind you? Senator, I don't think that's, my job is to make sure we build good tools. It's your job to be responsible for what your company has done. You have made billions of dollars off the people sitting behind you here. You have done nothing to help them, you have done nothing to compensate them, you have done nothing to make it right.
Well, you could do it here today, and you should. Mr. Zuckerberg, before my time expires: Mr. Chew, let me ask you about your platform. Why shouldn't your platform be banned in the United States of America? You are owned by ByteDance, a company based in China; the editor-in-chief of your parent company is a Communist Party secretary; your company has been surveilling Americans for years. According to leaked audio from more than 80 internal TikTok meetings, employees of your China-based parent company have repeatedly accessed non-public data of US citizens. Your company has tracked journalists, improperly gaining access to their user data, including their IP addresses, in an attempt to identify whether they were writing negative stories about you.
Why shouldn't your platform, which is basically a spy arm for the Chinese Communist Party, be banned in the United States of America? Senator, I disagree with your characterization; we have explained much of what you said in great detail. TikTok is used by 170 million Americans. And every single one of those Americans is at risk from the fact that you track their keystrokes, you track their app usage, you track their location data, and we know that Chinese employees who are subject to the dictates of the Chinese Communist Party can access all of that information. Why shouldn't you be banned in this country?
Senator, that is not accurate; much of what you describe about what we collect is not 100% accurate. Do you deny that ByteDance employees have repeatedly accessed US user data in China? We built a project that, you know, cost us billions of dollars to stop that, and we have made a lot of progress. And it has not stopped, according to the Wall Street Journal report yesterday: even now ByteDance workers, without going through official channels, have access to the private information of American citizens, quoting from the article, including their dates of birth, their IP addresses and more. Senator, as we know, the media does not always get it right. What I am saying is that we have spent billions of dollars to build this project; it is rigorous, it is robust, it is unprecedented, and I am proud of the work that the 2,000 employees are doing to protect that data. That is the problem, Mr.
Chew: it is not protected at all. It is subject to inspection and review by the Chinese Communist Party, and your application, unlike anyone else's sitting here, and God knows I have problems with everyone here, but your application, unlike any of theirs, is subject to the control and inspection of a hostile foreign government that has actively sought to track the information and whereabouts of every American it can get its hands on. Your application should be banned in the United States of America for the safety of this country. Thank you, Mr. Chairman. Senator Hirono. Thank you, Mr. Chairman. As we have heard, children face all kinds of dangers when using social media, from mental health harms to sexual exploitation, even trafficking.
Sex trafficking is a serious problem in my home state of Hawaii, especially for Native Hawaiian victims. That social media platforms are facilitating this trafficking, as well as the creation and distribution of CSAM, is deeply concerning, but it is happening. For example, several years ago a military police officer stationed in Hawaii was sentenced to 15 years in prison for producing CSAM as part of his online exploitation of a minor. He began communicating with a 12-year-old girl through Instagram, then used Snapchat to send her sexually explicit photos and to solicit such photos from her. He later used these photos to blackmail her. And just last month the FBI arrested a neo-Nazi cult leader in Hawaii who lured victims to his Discord server.
He used that server to share extremely disturbing child sexual abuse material interspersed with neo-Nazi imagery. Members of his hate and child exploitation group are also present on Instagram and Snapchat. I commend the authorities for investigating these criminals, but by the time of the investigation so much damage had already been done. This hearing is about keeping kids safe online, and we have heard all of your testimony about seemingly impressive safeguards for young users: trying to limit the time they spend, requiring parental consent, all of these tools. But attempts to exploit minors online and on your platforms continue to proliferate.
Almost all of your companies make money through advertising, specifically by selling your users' attention. Your users are your product. As one Meta product designer wrote in an email, quote, young ones are the best ones; you want to bring people to your service young and early, end quote. In other words, hook them early. Research published last month by the Harvard School of Public Health estimates that Snap makes a staggering 41% of its revenue from advertising targeted at users under 18, and TikTok 35%. Seven of the ten largest Discord servers that attract paying users are for games used primarily by teenagers and children.
All of this is to say that social media companies, yours and others, make money by attracting children to their platforms, but ensuring safety does not make money; it costs money. If you are going to continue attracting children to your platforms, you have an obligation to make sure they are safe there, because the current situation is unsustainable. That is why we are holding this hearing. Ensuring the safety of our children costs money, and your companies cannot continue to profit from young users only to look the other way when those users, our children, are harmed online.
We have heard a lot about Section 230 protections, and I think we are definitely headed in that direction; the five bills we have already passed out of this committee speak to limiting your liability protections. Last November, and this is for Mr. Zuckerberg, the privacy and technology subcommittee heard testimony from Arturo Béjar. In response to one of my questions about how to make social media companies focus more on child safety, Mr. Béjar said, and I am paraphrasing a little, that what will change your behavior is the moment Mark Zuckerberg declares earnings, the earnings that have to be reported to the SEC, and has to say, last quarter we made $34 billion, and the next thing he has to say is how many teens experienced unwanted sexual advances on his platform. Mr.
Zuckerberg, will you commit to reporting measurable child safety data in your quarterly earnings reports and calls? Senator, that's a good question. We actually already issue a quarterly report, and hold a call to answer questions, on how we are enforcing our community standards, which includes child safety issues and metrics among others. So we have a separate call where we do this, and we have led the industry on it. I think, you know, you have to report your earnings to the SEC. Will you report this kind of data to them, and by the way, in raw numbers, because, as I and others have said, percentages don't really tell the whole story?
Will you report the number of teenagers, and sometimes you don't even know whether they are teenagers or not because they just claim to be adults, will you report the number of underage children on your platforms experiencing unwanted solicitations and other kinds of messages that harm them? Will you commit to citing those numbers to the SEC when you make your quarterly report? Well, Senator, I'm not sure it makes much sense for that to be in an SEC filing, but we are committed to doing it publicly so that everyone can see it, and I would be happy to follow up and talk about the specific metrics. Some of what you just mentioned involves underage people on our services: we don't allow people under 13 on our service, so if we find anyone under 13 we remove them.
Now, I'm not saying that people don't lie and that there is no one under 13 using it, but I'm not going to be able to count how many there are, because fundamentally if we identify that someone is underage we remove them from the service. I think it's very important that we get real numbers, because these are real human beings. All these parents and other people are here because any time a young person is exposed to this kind of unwanted material and becomes hooked, it is a danger to that individual. So I hope you are saying that you will report this kind of information, if not to the SEC, then publicly.
I think I heard that yes, yes, senator. I think we report more publicly on our app than any other company in the industry and we are very supportive of transparency measures. I'm running out of time, Mr. Zuckerberg. , but I will continue with what exactly you report again when meta automatically places youth accounts and you testify about this in the most restrictive privacy and content sensitivity sessions and yet teens can opt out of these safe cars, right? It's not mandatory for them to stay in these settings, they can opt out Senator, yes, we, we teenagers default to a private account so that that way they have a private experience. and restricted, but some teens want to be creators and they want to have content that they share more widely and I don't think that's something that should be banned completely, so why not?
I think it should be mandatory that they remain in the most restrictive settings. Senator, I think there are a lot of teenagers who create amazing things, and with proper supervision, nurturing, and controls, I don't think that's the kind of thing you want to simply prohibit anyone from being able to do. My time is up, but I have to say there is an argument for everything we are proposing, and I share the concern about the broad limitation of liability that we afford to all of you; I think that has to change, and it is up to us in Congress to make that change. Thank you, Mr.
President. Thank you, Senator Hirono. Senator Cotton. Mr. Chew, let's get straight to the point: is TikTok under the influence of the Chinese Communist Party? No, Senator, we are a private company. Okay, so you concede that your parent company ByteDance is subject to the 2017 national security law that requires Chinese companies to hand over information to the Chinese government and hide it from the rest of the world. You admit that, right? Senator, on the Chinese business, as you noted earlier, any company that does global business, that does business in China, has to follow its local laws. Isn't it the case that ByteDance
has an internal Chinese Communist Party committee? As I said, all companies operating in China follow local law. So your parent company is subject to the national security law that requires it to answer to the party; it has its own internal Chinese Communist Party committee; you answer to that parent company; and yet you expect us to believe that you are not under the influence of the Chinese Communist Party? I understand this concern, Senator, and that's why we built... It was a yes or no question. Okay. But you used to work for ByteDance; weren't you the CFO of ByteDance?
That's right. In April 2021, while you were CFO, the Chinese Communist Party's China Internet Investment Fund bought a 1% stake in ByteDance's main Chinese subsidiary, Beijing ByteDance Technology. In exchange for that so-called 1% golden share, the party took one of the three seats on the board of that subsidiary. That's right, isn't it? That's for the Chinese business. That's right, for the Chinese business. That deal closed on April 30, 2021. Isn't it true that you were named CEO of TikTok the next day, May 1, 2021? Well, this was a coincidence. It's a coincidence that while you were CFO the Chinese Communist Party took its golden share and its board seat, and the next day you were named CEO of TikTok? It's quite a coincidence. It really is, Senator. Yeah, okay. And before ByteDance you were at a Chinese company called Xiaomi, is that right? Yes. Where did you live when you worked at Xiaomi? I lived in China, like many... Where exactly? In Beijing, in China. How many years did you live in Beijing? Senator, I worked there for about five years. So you lived there for five years. Yes. Isn't it true that Xiaomi was sanctioned by the US government in 2021 for being a Communist Chinese military company?
I'm here to talk about TikTok. I think there was a lawsuit then, and it was dropped; I don't remember. The Biden administration revoked those sanctions, just like they reversed the terrorist designation against the Houthis in Yemen; how is that working out? But it was sanctioned as a Communist Chinese military company. So, you said today, as you usually say, that you live in Singapore. Of which nation are you a citizen? Singapore. Are you a citizen of any other nation? No, Senator. Have you ever applied for Chinese citizenship? Senator, I served my nation in Singapore; no, I did not.
Do you have a Singaporean passport? Yes, and I served in the military for two and a half years in Singapore. Do you have any other passports from other nations? No. Your wife is a US citizen; your children are American citizens? That's right. Have you ever applied for US citizenship? No, not yet. Okay. Have you ever been a member of the Chinese Communist Party? Senator, I am Singaporean. No. Have you ever been associated or affiliated with the Chinese Communist Party? No, Senator; again, I'm Singaporean. Let me ask some hopefully simple questions. You said earlier, in response to a question, that what happened in Tiananmen Square in June 1989 was a mass protest.
Did something else happen at Tiananmen Square? Yes, I think it is well documented that there was a massacre. There was an indiscriminate slaughter of hundreds or thousands of Chinese citizens. Do you agree with the Trump administration and the Biden administration that the Chinese government is committing genocide against the Uyghur people? Senator, I have said this before: I think it's really important that anyone who cares about this issue, or any issue, can freely... Put it simply; it's a very simple question that unites both parties in our country and governments around the world: is the Chinese government committing genocide against the Uyghur people?
Senator, anyone, including you, can come on TikTok and talk about this. I'm asking you; you are a worldly, cosmopolitan, well-educated man who has expressed many opinions on many topics: is the Chinese government committing genocide against the Uyghur people? Senator, I mainly speak about my company, and I'm here to talk about what it does. Yes or no? You are here to give testimony that is truthful, honest, and complete. Let me ask you this: Joe Biden last year said that Xi Jinping was a dictator. Do you agree with Joe Biden? Is Xi Jinping a dictator? Senator, I'm not going to comment
on any world leader. Why won't you answer these simple questions? Senator, it is not appropriate for me, as a businessman, to comment on world leaders. Are you afraid of losing your job if you say something negative about the Chinese Communist Party? I don't agree with that; you will find content that is critical of China... Are you afraid of being arrested and disappearing the next time you go to mainland China? Senator, you will find content that is critical of China, and of every other country, freely on TikTok. Okay, okay, let's move on to what TikTok, a tool of the Chinese Communist Party, is doing to America's youth.
Does the name Mason Edens ring a bell? Senator, you may have to give me more details, if you don't mind. Yes, he was a 16-year-old Arkansan. After a breakup in 2022, he went on your platform and searched for things like inspirational quotes and positive affirmations; instead he was shown numerous videos glamorizing suicide, until he killed himself with a gun. What about the name Chase Nasca? Does it sound familiar? Would you mind giving me more details, please? He was a 16-year-old who watched more than a thousand videos on your platform about violence and suicide, until he took his own life by stepping in front of a train. Are you aware that his parents, Dean and Michelle, are suing TikTok and ByteDance for pushing their son to take his own life?
Yes, I am aware of that. Well, finally, Mr. Chew, has the Federal Trade Commission sued TikTok during the Biden administration? Senator, I can't speak to whether there is any... Are you currently being sued by the Federal Trade Commission? Senator, I can't speak to any potential lawsuits. Not potential, actual: are you being sued by the federal government? Senator, I think I have given you my answer. Ms. Yaccarino's company is being sued, I believe; Mr. Zuckerberg's company is being sued, I believe; yet TikTok, the agent of the Chinese Communist Party, is still not being sued by the Biden administration.
Are you familiar with the name Christina Kafara? You may have to give me more details. Ms. Kafara was a consultant paid by ByteDance, your Communist Party-influenced parent company; she was then hired by Biden's FTC to advise on how to sue Mr. Zuckerberg's company. Senator, ByteDance is a global company, not a Chinese communist company. Public reports indicate that your lobbyists visited the White House more than 40 times in 2022. How many times did your company's lobbyists visit the White House last year? I don't know, sir. Are you aware that the Biden campaign and the Democratic National Committee are on your platform?
They have TikTok accounts. Senator, we encourage people to come on... They won't let their staff use TikTok on their personal phones; they are given separate phones where they use only TikTok. We encourage everyone to join, including yourself, Senator. So all these other companies are being sued by the FTC, and you are not. The FTC has a former paid advisor of your parent company advising on how they can sue Mr. Zuckerberg's company. Joe Biden's re-election campaign and the Democratic National Committee are on your platform. Let me ask you: have you or anyone else at TikTok communicated or coordinated with the Biden administration, the Biden campaign, or the Democratic National Committee to influence the flow of information on your platform?
We work with anyone, any creator that wants to use our campaign, it's the same process that we have, so what we have here is a company that is a tool of the Chinese Communist Party that is poisoning the minds of American children in some cases. driving them to suicide and that, at best, the Biden administration is overlooking, at worst, it may be in collaboration with thank you, Mr. Chu, thank you, Senator Cotton, so let's take a break Now that we are at the second roll call. You can take advantage, you want the break to last about 10 minutes, please do your best to come back, guys.
The Senate Judiciary Committee will resume. We have nine senators who have not yet asked questions, in seven-minute rounds, and we will give the floor to Senator Padilla first. Thank you, Mr. President. Colleagues, as we reconvene, I am proud once again to share that I am one of the few senators with younger children, and I say this because, as we have this conversation today, it is not lost on me that among my children, who are now in the teen and tween category, and their friends, I see this issue very up close and personal. In that spirit, I want to take a second to recognize and thank all the parents who are in the audience today, many of whom have shared their stories with our offices, and I give them credit for finding strength through their suffering and their struggle and channeling it into advocacy that is making a difference.
I thank you all, now I thank you. I again personally appreciate the challenges that parents and caregivers, school staff and others face, in helping our young people navigate this world of social media and technology in general, now that the services our children are using growing up gives them unparalleled access to information, I mean. This goes beyond what previous generations have experienced and that includes opportunities for learning, socialization and much more, but we also have a lot of work to do to better protect our children from predators and the predatory behavior that these technologies have enabled and Yes, Mr.
Zuckerberg, that includes exacerbating the mental health crisis in America. Almost every teenager we know has access to a smartphone and the Internet and uses the Internet daily, and while guardians have the primary responsibility of caring for our children, the old saying goes that it takes a village; therefore society as a whole, including technology industry leaders, must prioritize the health and safety of our children. Now let me get into my questions and get specific, platform by platform, witness by witness, on the topic of some of the parental tools each of you has referenced. Mr. Citron, how many minors are there on Discord, and how many of them have caregivers who have adopted your Family Center? If you don't have the numbers, say so quickly and send them to our office.
We can follow up with you on that. How have you made sure that young people and their guardians know about the tools you offer? We make it very clear to teens on our platform what tools are available, and our teen safety features are enabled by default. What do you do specifically? What may be clear to you is not clear to the general public, so what, in your view, makes it very clear? Our Teen Safety Assist, which is a feature that helps teens stay safe, as well as blocking and blurring images that may be sent to them.
That is on by default for teen accounts and cannot be disabled. We also do marketing to our teen users directly on our platform: we launched our Family Center, created a promotional video, and placed it directly in our products, so that when teens opened the app, in fact every user who opened the app got an alert: hey, Discord has this, we want you to use it. Thank you; we look forward to the data we are requesting. Mr. Zuckerberg, across all of Meta's services, Instagram, Facebook, Messenger, and Horizon, how many minors use your apps, and of those minors, how many have a caregiver who has adopted the parental supervision tools you offer?
Sorry, I can follow up with specific statistics on that. Well, it would be very useful if not just we knew, but you knew as well, as a leader of your company. And the same question: how do you make sure that young people and their guardians are aware of the tools you offer? We run quite extensive ad campaigns, both on our platforms and off them, and we work with creators and organizations like the Girl Scouts to make sure there is broad awareness of the tools. Okay. Mr. Spiegel, how many minors use Snapchat, and of those minors, how many have caregivers who are registered with your Family Center?
Senator, I believe that in the United States there are approximately 20 million teenage Snapchat users. I believe about 200,000 parents use Family Center, and about 400,000 teens have linked their account to their parents using Family Center. So 200,000 and 400,000: those sound like big numbers, but they are a small percentage of the kids using Snapchat. What are you doing to ensure that young people and their guardians know about the tools you offer? We created a banner for Family Center on users' profiles, so that accounts of an age where they could be parents can easily see the entry point to Family Center. Okay, Mr.
Chew, how many minors are there on TikTok, and how many of them have a caregiver who uses your family tools? Senator, I need to get back to you on the specific numbers, but we were one of the first platforms to provide what we call Family Pairing: a parent goes to settings and turns on a QR code, you and your teen scan your QR codes, and it allows you to set screen time limits, filter some keywords, and turn on the more restricted mode. And we are always talking to parents; I met with a group of parents, teens, and high school teachers last week to talk about what else we can offer in Family Pairing. Ms. Yaccarino, how many minors use X, and are you planning to implement safety measures or guidance for caregivers as your peer companies have? I appreciate the question, Senator.
Less than 1% of all our users are between 13 and 17 years old. Less than 1% of 90 million American users is still hundreds of thousands. Yes, and each one is very important. As a 14-month-old company, we have reprioritized safety and child protection measures, and we have only just started talking about and discussing how we can improve those with parental controls. Let me continue with a follow-up question for Mr. Citron. In addition to keeping parents informed about the nature of various Internet services, there is much more that all of us must do. For today's purposes, while many companies offer a wide range of user empowerment tools, it is useful to understand whether young people find these tools useful. So I appreciate you sharing your Teen Safety Assist tools and how you advertise them, but have you done any evaluation of how
These features are affecting minor usage of your platform. Our intention is to provide teens with tool capabilities that they can use to keep themselves safe and also so that our teams can help keep teens safe. We recently launched teen safety assistance last year and I. I don't have a study in mind, but we'd be happy to follow up with you. Okay, I'm out of time. I will have follow-up questions for each of you, either in the second round or through statements for the record of a similar evaluation of the tools that you have proposed thank you Mr.
President. Thank you, Senator Padilla. Senator Kennedy. Thank you all for being here. Mr. Spiegel, I see you hiding down there. What does "yada yada yada" mean? I am not familiar with the term, Senator. Very unattractive. Can we agree that what you do, not what you say, what you do, is what you believe, and everything else is just cottage cheese? Yes, Senator. You agree with that; speak up, don't be shy. I've listened to you today. I've listened to a lot of yada yada-ing, and I've heard you talk about the reforms you've made, and I appreciate them, and I've heard you talk about the reforms you're going to make, but I don't think you're going to solve the problem.
I think Congress is going to have to help you. I think the reforms you're talking about, to some extent, will be like putting paint on rotten wood, and I'm not sure you'll support this legislation. The fact is that you and some of your Internet colleagues who are not here are no longer companies; you're countries. You are very powerful, and you and some of your colleagues who are not here have blocked everything we have tried to do in terms of reasonable regulation, from privacy to child exploitation. In fact, we have a new definition of a recession: we know we're in a recession when Google has to lay off 25 members of Congress. That's what we've come down to. We've also come down to this fact: your platforms are harming children.
I'm not saying you aren't doing some good things, but you are hurting children, and I know how to count votes; if this bill reaches the floor of the United States Senate, it will pass. What we are going to have to do, and I say this with all the respect I can muster, is convince my good friend Senator Schumer to go to Amazon, buy a spine online, and bring this bill to the Senate floor, and then the House will pass it. That's one person's opinion; I may be wrong, but I doubt it. Mr. Zuckerberg, let me ask you a couple of questions that might get a little philosophical here. I have to admit, you've convinced over two billion people to reveal all of their personal information in exchange for getting to see what their high school friends had for dinner on Saturday night. That's pretty much your business model, right?
That's not how I would characterize it and we give people the chance to connect with the people they care about and, um, and interact with people. topics that interest them and you and you take this information, this abundance of personal information and then develop algorithms to press the hot buttons of the people who send them and target information that presses their hot buttons over and over again to keep them coming back. go back and make them stay longer and as a result your users only see one side of an issue and to some extent your platform has become a killing field for the truth, hasn't it?
I mean, senator, I don't agree with that characterization, um, you. We know we create rankings and recommendations because people have many friends and many interests and they want to make sure they see content that is relevant to them. We're trying to create a product that's useful to people and make our services as useful as possible for people to connect with the people they care about and the interests they care about, but it doesn't show them both sides, no. it gives them balanced information, you just keep pressing your hot buttons, pressing your hot buttons. he doesn't show them balanced so that people can discern the truth for themselves and he speeds them up so much that very often his platform and that of others become puddles of sarcasm where no one learns anything, right?
Senator, I do not agree. With that, I think people can participate in the things that interest them and learn a little about them. We've done a number of different experiments and things in the past around news and trying to show content in, you know, a diverse set of perspectives, I think there's more things that need to be explored there, but I don't think we can figure that out for us. themselves. One of the things I saw in him. I'm sorry to interrupt you, Mr. President, but I'm going to run out of time. Do you think your users really understand what they're giving you, all their personal information and how you process it and how you monetize it?
Do you think people really understand? Senator, I think people understand the basic terms. I mean, I think... In fact, let me put it another way: it's been a couple of years since we talked about this; does your user agreement still suck? I'm not sure how to answer that, Senator. Could you still hide a dead body in all that legalese where no one can find it? Senator, no. I'm not quite sure what you mean, but I think people understand the basic deal: it's a free service that you are using to connect with the people you care about.
If you share something with other people, other people will be able to see your information. It is inherent to you to know if you are posting something to share. publicly, um, or with a private group of people, you know inherently you're putting it out there, so I think people understand that basic part of how this, but Mr. Zuckerberg, you're in the hills of creepy, you track, you track, you track people. who aren't even Facebook users, you track your own people, your own users, who your product is even when they're not on Facebook. I, I mean, I'm going to land this plane pretty quickly, Mr.
President. I mean, it's creepy, and I understand you make a lot of money doing it, but I wonder if our technology is greater than our humanity. Let me ask you this last question: Instagram is harmful to young people, isn't it? Senator, I don't agree with that. That's not what the research shows on balance. That doesn't mean individual people don't have problems, or that there aren't things we should do to help provide the right tools for people, but across the research we've done internally, I mean, this is the survey the senator cited earlier, there are 12 or 15 different categories of harm that we asked teenagers whether they felt Instagram made worse or better, and on all of them except the one Senator Hawley cited, more people said that using Instagram was positive. Well, we'll just have to agree to disagree. If you believe, and I'm not saying it's intentional, but if you believe that Instagram is not harming millions of our young people, particularly young teenagers, particularly young women, you shouldn't be driving. Thank you. Senator Butler. Thank you, Mr.
President, and thank you to our panelists who have come to uh have an important conversation with us, most importantly, I want to thank the families, uh, who have come together to continue to be remarkable champions for their children and their loved ones, for being here, in particular, two families from California, um, that I could. To speak to the families of Sammy Chapman of Los Angeles and Daniel Perera of Santa Clarita, they are here today and they are doing incredible work to not only protect the memory and legacy of their children, but the work that they are doing is going to protect to my 9 year old son and that is, in fact, the reason we are here.
There are a couple of questions I want to ask a few of you; let me start with a question for each of you. Mr. Citron, have you ever sat down with a family and talked about their experience and what they need from your product, yes or no? Yes, I have talked to parents about how we can build tools to help them. Mr. Spiegel, have you sat down with families and young people to talk about your products and what they need from your product? Yes, Senator. Mr. Chew? Yes, I did, two weeks ago for example. I don't want to know what you did in preparation for this hearing, Mr.
Chew; I just wanted to know whether you have examined anything in terms of designing the product you're creating. Mr. Zuckerberg, have you sat down with parents and young people to talk about how you design the product for your consumers? Yes, over the years I've had a lot of conversations with parents. You know, that's interesting, Mr. Zuckerberg, because we talked about this last night and you gave me a very different answer. I asked you this same question, and you told me no, that you didn't know what specific processes your company had. Mr. Zuckerberg, you told me you didn't.
I must have misspoken. I want to give you a chance to respond, Mr. Zuckerberg, but I asked you this very question, I asked all of you this question, and you gave me a very different answer when we spoke. But I'm not going to press it. Several of you have talked about... sorry. Ms. Yaccarino, have you spoken directly with parents and young people about designing your product? As a new leader at X, yes, I have spoken with them. Thank you. Mr. Spiegel, there are several parents whose children have been able to access illegal drugs on your platform; what do you say to those parents?
Well, Senator, we are devastated, because we... What do you say to those parents, Mr. Spiegel? I am so sorry that we have not been able to prevent these tragedies. We work very hard to block all drug-related search terms on our platform. We proactively search for and detect drug-related content, we remove it from our platform, we preserve it as evidence, and then we refer it to law enforcement for action. We have worked alongside nonprofits and families on educational campaigns, because the scale of the fentanyl epidemic is extraordinary: more than 100,000 people lost their lives last year, and we think people need to know that one pill can kill. That campaign went far.
That campaign went further. out of 200 was viewed over 260 million times on Snapchat us too Mr. President there are two parents in this room who lost their children they are 16 years old their children were able to get those Snapchat pills I know there are statistics and I Know there are good efforts, none of those efforts prevent our children from having access to those medications on their platform. As a California company, y'all. I've talked to you about what it means to be a good neighbor and what California families do. and American families should expect from you, you owe them more than just a set of statistics, uh, and I hope you show up in every part of this legislation, you all show up in every part of the legislation to keep our children safe. , Mr.
Zuckerberg, I want to come back to you. I talked to you about being a parent of a young child who doesn't have a phone and isn't on social media at all, and one of the things I'm deeply concerned about, as the parent of a young black girl, is the use of filters on your platform that suggest to the young women using it that they are not good enough as they are. I want to ask more specifically, and refer to some unredacted court documents, which reveal that your own researchers concluded that these plastic-surgery-mimicking face filters negatively impact the mental health and well-being of young people. Why should we believe
Why should we believe that? Because you are going to do more. to protect young women and girls when you give them the tools to affirm the self-hatred that is spewed on your platforms, why should we believe that you are committed to doing anything more to keep our children safe? There's a lot to unpack, people have tools to express themselves in different ways and people use face filters and different tools to create media, photos and videos that are funny or interesting in many of the different products that are plastic surgery pins. good tools to express creativity um Senator, I'm not talking about skin lightening tools being tools to express creativity.
This is the direct thing I'm asking. I'm not defending any specific ones. I think the ability to filter and edit images is generally a useful tool for expression, specifically for that. I'm not familiar with the study you're referring to, but we made it so we don't recommend this type of content. team I did not make any reference to a study to court documents that revealed their knowledge of the impact of these types of filters on young people, in general, girls and in particular, I do not agree with that characterization. I think there have been document hiccups.
I haven't seen any document that says that. Okay, Mr. Zuckerberg, I'm out of time. I hope you listen to what's being offered and are prepared to step up and do better. I know this Senate committee is going to do its job to hold you to greater account. Thank you, Mr. President. Senator Tillis. Thank you, Mr. President, and thank you all for being here. I don't feel like I'm going to have the opportunity to ask many questions, so I'm going to reserve the right to send some for the record. But I have heard... we have had hearings like this before; in my nine years in the Senate,
I have heard hearings like this before. I have heard horrible stories about people who have died, who have committed suicide, who have. been embarrassed, every year we have an annual flogging every year and what has materially happened in the last nine years, are any of you, just no question, are any of you involved in an industrial consortium that is trying to make this happen? fundamentally safe on all platforms, yes or no, Mr. Zuber, there are a variety of organizations that you are involved in, what organizations should I say, is anyone here not involved in an industry?
Yes, I really think it would be immoral for all of you to consider it a strategic advantage. secure or to keep something private that would protect all of these platforms to prevent the kind of things that are you all okay with that? Anyone would say they want ours because ours is the safest and they haven't discovered the secret. The industry realizes that this is an existential threat to all of you if we don't get this right. I mean, they have to protect their platforms, they have to deal with this, don't they have an inherent mandate to do this because it would seem?
For me, if you don't do it, you will cease to exist. I mean, we could regulate you out of business if we wanted to and the reason I say it may sound like a criticism, it's not a criticism. I think we have to do it. I understand that there must be an inherent motivation for you to do this well or Congress will make a decision that could potentially put you out of business. This is the reason. I have concern with that, although I just connected to the Internet while listening carefully. all the other members talked and I found a dozen different platforms outside of the United States, 10 of which are in China, two of which are in Russia, their average daily subscribers or active membership numbers are in the billions, Well, people say you can't get it.
On the Chinese version of Tik Tok, I did a quick search on my favorite search engine to find out exactly how I could get an account on this platform today. And the other thing we must take into account. I come from technology. You could discover, ladies and gentlemen, you could discover how to influence your children without them being on a social media platform. I can send random text messages and get a bite and then find an email address and get compromising information, um, if it's me. It's horrible to hear some of these stories and I've shared them and these stories happened in my hometown in North Carolina, but if we just come here and make a point today and don't start focusing on making a difference that requires people to stop shout and start listening and start passing language here the bad actors are just going to be off our shores.
I have another question for all of you: approximately, if you don't know the exact number, how many people do you have looking at these horrible images 24 hours a day and filtering them? Just answer quickly. It's most of the 40,000 people who work on safety and security. And again, we have 2,300 people around the world. Okay. We have 40,000 trust and safety professionals around the world. We have approximately 2,000 people dedicated to trust, safety, and content moderation. Our platform is much smaller than these; we have hundreds of people, and it is a significant share of our work. I'll just mention that these people have a horrible job; many of them have to seek counseling for all the things they see. There are evil people out there, and we are not going to fix this by yelling at each other or talking past each other; we're going to fix it if each of you is at the table and, hopefully, getting closer to what I heard one of you say: supporting a lot of the good bills, like one I hope Senator Blackburn mentions when she gets a chance to speak. But folks, if you're not at the table helping to secure these platforms, you're going to be brought to the table, and the reason I don't agree with that approach is that if we ultimately destroy your ability to create value and put you out of business, evil people will find another way to get to these children. And I have to admit, I don't think my mom is watching this, but there are good things that we can't ignore.
Well that's happening, my mom who lives in Nashville, Tennessee, and I spoke yesterday and we talked about a Facebook post she made a couple of days ago, we won't let her talk to anyone else about it that connects my mom from 92 years with uh. with her grandchildren and great-grandchildren that allows a child who may feel uncomfortable at school to join a group of people and interact with people, let's not discard the good because we have not all concentrated together on eradicating the bad now. I guarantee you that I could go through some of your governing documents and find a reason to confuse each of you because you didn't put the emphasis on it that I think you should, but at the end of the day, I DO find it difficult I think any of you started This business, some of you in your college dormitories with the purpose of creating the evil that is being perpetrated on your platforms, but I hope that every waking hour you are doing everything you can to reduce it.
We're not going to be able to eliminate it and I hope there are some entrepreneurial young tech people today who will go to parents and tell them, ladies and gentlemen, that your children have a deadly weapon, they have a potentially deadly weapon, whether it's a phone or a tablet, you have to make sure, you can't assume that they are going to be honest and say they are 16 when they are 12, we all have to recognize that we have a responsibility to play and you are at the tip of the spear, so I hope we can get there to a point where we are moving these bills if they have a problem with them.
Tell us your problem; let's work to solve it. "No" is not an answer. And know that I want America to be the beacon of innovation and the beacon of safety, and to stop people from using other options that have existed as long as the Internet has to exploit people; count me in as someone who will try to help. Thank you, Mr. President. Thank you, Senator Tillis. Senator Ossoff is next. Thank you, Mr. President, and thank you to our witnesses today. Mr. Zuckerberg, I want to start by just asking a simple question: do you want kids to use your platform more or less?
Well, I do not want people under 13 using our platform. Do you want teenagers ages 13 and up to use your platform more or less? Well, we would like to build a product that is useful and that people want to use. My time is limited, so: do you want them to use it more or less, teenagers 13 to 17? Do you want them to use Meta products more or less? I would like them to be useful enough that people want to use them more. I think we have one of the fundamental challenges here: you actually have a fiduciary obligation to try to get children to use your platform more. It depends on how you define that; obviously we are a business... I'm sorry, Mr.
Zuckerberg. Isn't it self-evident that you have a fiduciary obligation to get your users, including those under 18, to use and engage with your platform more, rather than less? Over the long term, but in the short term we often take steps in the other direction; for example, we made a change to show fewer videos on the platform, which reduced the amount of time spent by more than 50 million hours. Okay, but if your shareholders asked you, "Mark", I wouldn't call you that here, but your shareholders may be on a first-name basis with you: "Mark, are you trying to get kids to use Meta products more or less?"
Well, I would say that over the long term we are trying to build products that are as useful as possible. Let's look at the 10-K you file with the SEC; a few things I want to point out. Here are some quotes, and this is a filing you sign, correct? Yes. "Our financial performance has been and will continue to be significantly determined by our success in adding, retaining, and engaging active users." Here is another quote: "If our users decrease their level of engagement with our products, our revenue and financial results may be significantly harmed." Here is another quote: "We believe that some users, particularly younger users, are aware of and actively engaging with other products and services similar to, or as a substitute for, ours. In the event that users increasingly engage with other products and services, we may experience a decline in usage and engagement in key demographics or more broadly, in which case our business would likely be harmed." You have an obligation as chief executive to encourage your team to get kids to use your platform more. Senator, this... It's self-evident: you have a fiduciary obligation to your shareholders to get kids to use your platform more.
I think what's not intuitive is the direction. We give our teams the goal of making the products more useful so that people want to use them more; we don't give the teams running the Instagram feed or the Facebook feed a goal of increasing the amount of time that people spend. Yes, but that's not what your 10-K says; your 10-K makes clear that you want your users to engage more and use the platform more, and I think this gets to the root of the challenge, because it is the overwhelming view of the public, certainly in my home state of Georgia, and we've had some discussions about the underlying science, that this platform is harmful to children.
I mean, you're familiar with it, and this is not just about your platform, by the way: the Surgeon General's 2023 advisory on social media and youth mental health, which cites evidence that children who spend more than three hours a day on social media have twice the risk of poor mental health outcomes, including depression and anxiety. Are you familiar with the Surgeon General's report and the underlying study? I have read the report, yes. Do you dispute it? No, but I think it's important to characterize it correctly. I think what it was pointing to is that there seems to be a correlation, and obviously the mental health issue is very important, so it's something that needs to be looked at.
The thing is, everyone knows there is a correlation; everyone knows that kids who spend a lot of time on your platforms are at risk, and it's not just mental health issues. I mean, let me ask you another question: is your platform safe for kids? I believe it is, but there's a difference between correlation and causation. We are not going to be able to work in a productive, open, honest, and collaborative way with the private sector to pass legislation that protects Americans, that protects American children above all, and that allows businesses to thrive in this country, if we don't start with an open, honest, candid, and realistic assessment of the problems.
We can not do that. The first point is that you want children to use the platform more; In fact, you have an obligation to do so, but if you are not willing to recognize that it is a dangerous place for children, the Internet is a dangerous place for children, not only. your platform, isn't the Internet a dangerous place for children? I think it can be. Yes, there are wonderful things people can do and there are harms we need to work on. Yes, it is a dangerous place for children. There are families. here who have lost their children there are families across the country whose children have self-harmed who have experienced low self-esteem who have been sold deadly pills on the Internet The Internet is a dangerous place for children and its platforms are dangerous places for children.
Do you agree? I think there are damages that we must work to mitigate. Well, I'm not going to do it. I think in general, why not? Why not just acknowledge it? Why do we have to be very careful? I don't agree with the characterization. that the Internet is a dangerous place for children. I think you're trying to characterize our products as inherently dangerous and I think, inherently or not, your products are places where children can experience harm, can experience harm to their mental health. drugs can be sold to them, predators can pray for them, you know they are dangerous places and yet you have an obligation to promote children's use of these platforms and look at it all, that's all I'm trying to suggest .
For you, Mr. Zuckerberg, and my time is running out, is that for you to be successful, you and your colleagues here have to recognize these basic truths, we have to be able to present ourselves to the American people, the American public, the people of my country. . state of Georgia and recognize that the Internet is dangerous, including its platforms, there are predators lurking, drugs are being sold, there are mental health harms that are taking a huge toll on children's quality of life, and yet you have this incentive, not just you, Mr. Zuckerberg, everyone. Of you have an incentive to maximize use and participation and that is where public policies must intervene to ensure that these platforms are safe for children, so that children do not die, so that children do not overdose, so that children don't cut themselves or commit suicide because they spend all day surfing instead of playing outside and I thank you all for your testimony.
We will continue to engage as we develop this legislation. Thank you. The Senator from Tennessee. Thank you, Mr. President, and thank you to each of you for coming; I know some of you had to be subpoenaed to get here, but we appreciate that you are all here. Mr. Chew, I want to come to you first. We've heard that you're considering setting up headquarters in Nashville, as well as in Silicon Valley and Seattle, and what you'll probably find is that the welcome mat will not be rolled out for you in Nashville the way it would be in California. There are a lot of people in Tennessee who are very concerned about the way TikTok is basically creating files on our children, the same way it builds them on their virtual selves, and that that information is being kept in China, in Beijing, as you responded to Senator Blumenthal and me last year in reference to that question. We also know that a major record label said yesterday that it is removing all of its content from your site over issues with payment, artificial intelligence, and the negative impact on the mental health of our children, so we will see how that progresses.
Mr. Zuckerberg, I want to come to you. Senator Blumenthal and I, of course, have received some internal documents and emails, and one of the things that really concerned me is that you referred to your young users in terms of their lifetime value, at about $270 per teen. Every one of you should be looking at these kids. There are T-shirts being worn here today that say "I'm worth more than $270"; we have some people standing in those shirts now. And some of the kids in our state, some of the kids of the parents we've worked with: just think, if it were Becca Schmidt, David Molak, Sarah Flatt, or others, would you say their life is only worth $270? What would that get you? I mean, I know you're a dad; I'm a mom, I'm a grandmom, and how can you even have that thought? It is astounding to me, and I think this is one of the reasons 42 states are now suing you over the features they consider addictive that you are pushing. In the emails we received from 2021, running from August to November, there is a staffing plan being discussed, and Antigone Davis, Nick Clegg, Sheryl Sandberg, Chris Cox, Alex Schultz, and Adam Mosseri are all on this email chain about the wellness plan. Then we get to one where Nick Clegg emailed Mark to emphasize his support for the package, but it appears it lost out to other pressures and priorities. Look, this is what troubles us: children are not your priority; children are your product; you see children as a way to make money. Protecting children in this virtual space: you made a conscious decision, even though Nick Clegg and others were going through the process of saying, this is what we should do. These documents are really enlightening, and they just show me that growing this business, expanding your revenue, what you were going to put in those quarterly filings, that was the priority; the children were not. It's very clear. I want to talk with you about the pedophile ring, because that came up earlier and the Wall Street Journal reported on it. One of the things we discovered was that after it became apparent, you didn't remove that content, content that showed teenagers for sale, being offered to older men, and you didn't remove it because it didn't violate your community standards. Do you know how often a child is bought or sold for sex in this country? Every two minutes. Every two minutes a child is bought or sold for sex; that's not my statistic, that's a TBI statistic. Now, this content was finally removed after a congressional staffer went to Meta's global head of safety. So could you please explain to me, and to all these parents, why explicit predatory content does not violate your platform's terms of service or your community standards? Sure, Senator, let me try to address the things you just said. It does violate our standards; we work very hard to take it down; we have reported, I think, more than 26 million examples of this kind of content. But you didn't take it down until a congressional staffer brought it up. It may be that in this case we made a mistake and missed something; we make mistakes, and we lead teams that... I want to talk with you about your Instagram creators program and about the push we discovered through these documents. You are pushing this because you want to bring kids in early; you see these younger tweens as a valuable but untapped audience, to quote the emails, and you suggest that teenagers are really household influencers who bring their younger siblings onto your platform, onto Instagram. Now, how can you make sure your Instagram creators program does not facilitate illegal activity when you do not remove content related to the sale of minors, which again happens once every two minutes in this country? Senator, our tools for identifying that kind of content are industry-leading. That doesn't mean we're perfect; there are definitely issues that we have, but we continue... Yes, there are a lot of things slipping through. It appears that you are trying to be the top sex trafficking site. Senator, that's ridiculous. No, it's not ridiculous. We don't want this content on our platforms. Then why don't you take it down? We are here to discuss... You want to work with us? No, you're not working with us, and the problem is, we've been working on this, Senator Welch is here, we've been working on this stuff for a decade. You have an army of lawyers and lobbyists who have fought us every step of the way, and you work with NetChoice, the Cato Institute, the Taxpayer Protection Alliance, and the Chamber of Progress to fight our bipartisan legislation to keep kids safe online. So will you stop funding these groups? Will you stop lobbying against this and come to the table and work with us, yes or no? Senator, we have... A yes or a no. Of course we will work with you on the legislation. Okay, the door is open. We have all these bills, and each and every one of you needs to come to the table and work with us. Children are dying. Senator Welch. I want to thank my colleague, Senator Blackburn, for her decade of work on this, and in fact I have some optimism. Today there is a consensus that did not exist, say, ten years ago, that there is a profound threat to children's mental health and safety. There is no dispute about that now; it was in debate before. That is a starting point. Second, we are identifying concrete things that can be done in four different areas: one is industry standards; two is legislation; three is the courts; and the fourth is a proposal that Senator Bennet, Senator Graham, myself, and Senator Warren have to establish an agency, a government agency whose responsibility would be to engage on this in a systematic and regular way, with the right resources. I just want to go through them. I appreciate the industry-standard steps you've taken at your companies, but it's not enough, and that's what I think you're hearing from my colleagues. For example, where there are layoffs, and they fall on the trust and safety programs, that's alarming, because it suggests a reduced emphasis on protection. Ms. Yaccarino, you just added 100 employees in Texas in this category; how many did you have before? The company went through a significant restructuring, and we have increased the number of trust and safety employees and agents worldwide by at least 10% in the last 14 months, and we will continue to do so, specifically in Austin, Texas. Okay, Mr.
Zuckerberg, as I understand it, there have been layoffs in that area and alsothere are jobs added there on Twitter, but, uh, meta, have there been reductions across the board, not really focused on that area? I think our Our investment is relatively constant over the last few years. We invested almost $5 billion in this work last year and I think this year it will be of the same order of magnitude. And another question that has arisen is when the horror of a user of any of your platforms, someone has an image there that is very compromising, often of a sexual nature, is there any reason in the world why a person who wants to delete That can't have a very simple answer on the same day? to get it removed I'll start with Twitter, sorry senator, I was taking notes, could you repeat the question?
Well, there are many examples of a young person finding out about an image of themselves that is really compromising, something that can actually create suicidal thoughts, and they want to call or send an email and say, take it down. I mean, why isn't it possible to respond to that right away? Well, we all strive to remove any kind of infringing or disturbing content immediately, once we can see what the image is. Yes, a system-wide, ecosystem standard would, and does, improve the user experience across all of our platforms. Okay, there's actually an organization that I think several of the companies here are part of, called Take It Down; it's a technology that we and some others, all of you, are in favor of, and it gives people peace of mind.
Okay, it really matters. I don't have much time, so: we've talked about the legislation, and Senator Whitehouse had asked you to come back with your position on Section 230, which I'll get to in a minute, but I would appreciate each of you responding with your company's position on the bills being considered in this hearing. I'm just asking you to do that. Third, the courts, and this big Section 230 question. I'm quite inspired today by the presence of parents who have turned their extraordinary pain into action, and I hope other parents don't have to suffer what for them is a devastating loss; for anyone, a devastating loss.
Senator Whitehouse asked everyone to respond very specifically about Section 230 and where you stand on it. It's an amazing benefit that your industry has that no other industry has: you just don't have to worry about being held accountable in court if you are negligent. So you have some explaining to do, and I'm just reinforcing Senator Whitehouse's request that you respond specifically on that. Then, finally, I want to ask about this notion, this idea, of a federal agency whose resources and whose job it is to deal with the issues of public interest that are really affected by Big Tech.
It is extraordinary what has happened in our economy with technology and its companies represent innovation and success, but only like when the railroads rose and were in charge and ripping off farmers because of practices they could get away with, like when Wall Street was flying high but there was no one to regulate Blue Sky laws. We now have a whole new world in economics and Mr. Zuckerberg. I remember you testifying on the Energy and Commerce Committee and I asked you your position on the concept of a federal regulatory agency. What I remember is that you were sure that that is still the case.
I think it could be a reasonable solution. Obviously there are pros and cons compared to the current structure of having different regulatory agencies focused on specific issues, because many of these things trade off against one another; one of the issues we talked about today is encryption, and that's obviously very important for privacy and security. Can we move down the line? I'm at the end of my time, but thank you. Ms. Yaccarino? I think an industry initiative to have those conversations would be something that X would be very, very proactive about. If you think about our support for the REPORT Act, the SHIELD Act, the STOP CSAM Act, and our support for Project Safe Childhood,
I think our intention to engage is clear, and so far, yes. Senator, we support national privacy legislation, for example; it sounds like a good idea, we just need to understand what it means. Okay, Mr. Spiegel? Senator, we will continue working with your team, and we would certainly be open to exploring the right regulatory body for big tech. But the idea of a regulatory body is something that you can see has merit? Yes, Senator. And Mr. Citron? Yes, we are very open to working with you and our peers and anyone else to help make the Internet a safer place.
I think, as was mentioned, this is not a single-platform problem, so we look to collaborate with other companies and with nonprofits and government. Thank you all. Mr. President, I yield back. Thank you, Senator Welch. Well, let's conclude this hearing, and thank you all for coming today. You probably have your scorecard; you have met at least 20 members of this committee and have your own impressions about their approach to questioning and the like, but the one thing I want to make clear, as the chairman of this committee for the last three years: there was an extraordinary vote on an extraordinary issue.
A year ago we passed five bills unanimously in this committee, you heard that every senator, every point on the political spectrum was covered, every senator voted unanimously for the five laws that we discussed today I should tell everyone that Capitol Hill Washington follows a pretty harsh message, we understand it and we live it as parents and grandparents, we know what our daughters, sons and other people are going through, they cannot face it, they cannot handle this problem on their own. They are counting on us as much as they are counting on the industry to do something responsible, and some will leave with the impression from our Witnesses and the companies they represent that you are right as an American citizen, but you should also leave. with the determination to keep the focus on us to do something, not just to hold a hearing, gather a good crowd of supporters of change, but to get something done, no excuses, no excuses, we have to bring this to a vote.
What I found. in my time in the House and Senate that is the day that is the moment of reckoning speeches despite press releases and the like the moment of reckoning is when we call a vote on these measures it is time to do that I don't think there has ever been a time in the wonderful history of the United States when a company or industry has stepped up and said: regulate us, put some legal limits on us. Companies generally exist to be profitable and I think we have to support that and say profitability, at what cost?
Senator Kennedy's Republican colleague said: is our technology greater than our humanity? I think it's a fundamental question and he asked what would I add or is politics greater than technology, let's find out. I want to thank a few people before closing. I have several staff members up here who worked very hard on this Alexander Gelber. Thank you very much Alexander Jeff Hansen Scott Jord. The last point I'll make, Mr. Zuckerberg, is just a little advice for you. I think I think your initial statement about mental health. It needs to be explained because I don't think it makes any sense that there is anyone in this room who has had a child who has gone through an emotional experience like this who wouldn't tell you and me that they changed right before my eyes.
They changed: they stayed in their room, they didn't reach out to their friends anymore, they lost all interest in school. These are mental health consequences that I think can come with the abuse of access to this kind of technology. I don't see colleagues seeking to say a word, so I'll close: I think it was a good hearing, and I hope something positive comes from it. Thank you all for attending. The hearing record will remain open for one week for statements and questions.
