
Snapchat, TikTok, YouTube Executives Testify on Kids Online Safety - CSPAN - November 17, 2021, 10:13pm-1:41am EST

10:13 pm
up next, the hearing on the impact social media platforms have on children. executives from snapchat, tiktok and youtube testify before a senate subcommittee addressing
10:14 pm
issues from deadly viral challenges to eating disorders. [inaudible] welcome to this hearing on
10:15 pm
protecting kids on social media. i want to thank the ranking member, senator blackburn, who has been a close partner, and chairman cantwell and ranking member wicker for their support. i'm joined by senator thune, senator amy klobuchar and senator ed markey; they've all been extraordinary leaders in this effort. and today, i should note, is the first time that tiktok and snap have appeared before congress. i appreciate your testimony this morning, it means a lot. our hearing with facebook whistleblower frances haugen was -- a powerful -- profits above people,
10:16 pm
especially -- and there is a definite and deafening drumbeat of continuing doubt about facebook. keeping -- and outrage. and they have led to increasing calls for accountability. this time it is -- accountability to parents and the public, accountability to congress, accountability to investors and shareholders, accountability to the securities and exchange commission and other federal agencies. because there is ample credible evidence as we start an investigation here. today we are concerned about continuing to educate the american public and ourselves about how we can face this
10:17 pm
crisis. what we learned from disclosures and reporting since then is abhorrent. about instagram's algorithms creating a perfect storm, in the words of one of facebook's own researchers. as that person said, it exacerbates downward spirals, harmful to teenagers, fueling hate and violence, prioritizing profits over the people that it hurts. in fact, the algorithms push emotional and provocative content, toxic content that amplifies depression, anger, hate and anxiety. because those emotions attract and hook kids and others to their platforms. in effect, the more content and more extreme versions of it are
10:18 pm
pushed to children who expressed an interest in online bullying, eating disorders, self harm, even suicide. and that's why we now have a drumbeat of demands for accountability, along with the drumbeat of disclosure. but we are hearing the same stories and reports of the same harms about the tech platforms represented here today. i have heard from countless parents and medical professionals in connecticut and elsewhere around the country about the same phenomenon on snapchat, youtube and tiktok. in effect, the business model is the same. more eyeballs means more dollars.
10:19 pm
everything that you do is to add users, especially kids, and keep them on your apps for longer. i understand, from your testimony, that your defense is, we are not facebook. we are different. and we are different from each other. being different from facebook is not a defense. that bar is in the gutter. it is not a defense to say that you are different. what we want is not a race to the bottom but really, a race to the top. so we will want to know from you what steps you are taking to protect children, even if it means forgoing profits. we want to go over what research you've done, like some of the studies and data that have been disclosed about facebook. we want to know whether you
10:20 pm
will support real reforms, not just the mild changes that you suggested and recounted in your testimony. the picture that we've seen from an endless stream of videos that is automatically selected by sophisticated algorithms shows that you strive to find something that teenagers will like, and then drive more of it to them. if you learn that a teen feels insecure about his or her body, it's a recipe for disaster. because you start out on dieting tips, but the algorithm will raise the temperature for that individual with more and more extreme messages. and after a while, all of the videos are about eating disorders. it's the same rabbit hole, driving kids down those rabbit
10:21 pm
holes created by algorithms that lead to dark places and encourage more destructive behavior and violence. we have done some listening. i heard from norah in westport, who allowed her 11 year old daughter avery on tiktok, because she thought it was just girls dancing. avery wanted to exercise more with the shutdown of school and sports. so like most people, she went online. norah told me about the rabbit hole that tiktok and youtube's algorithms pulled her daughter into. she began to see ever more extreme videos about weight loss. she started exercising compulsively and ate only one meal a day. her body weight dropped dangerously low and she was diagnosed with anorexia. avery is now in treatment but the financial cost of care is an extreme burden and her
10:22 pm
education has suffered. we heard from parents who had the same experience. so we not only listened but we checked ourselves. on youtube, my office created an account as a teenager. like avery, we watched a few videos about extreme dieting and eating disorders. they were easy to find. youtube's recommendation algorithm began to promote extreme dieting and eating disorder videos each time we opened the app. these were often videos about teenagers starving themselves, as you can see from this poster. the red is eating disorder related content, the green is all the other videos. one is before and the other is after the algorithm kicked in. you can see the difference. we also received these recommendations each time we
10:23 pm
watched other videos. it is mostly eating disorder content. there was no way out of the rabbit hole. another parent in connecticut wrote to me about how their son was fed a constant stream of videos on tiktok related to disordered eating and calorie counting after looking up athletic training. as scientific research has shown, eating disorders and body comparison also significantly affect young men on social media -- young men often feel compelled to bulk up, to look a certain way. again, i heard about this all too often. we created an account on tiktok and, troublingly, it was easy searching tiktok to go from men's fitness to steroid use. it took us only one minute. one minute to find tiktok
10:24 pm
accounts openly promoting and selling illegal steroids. we all know the dangers of steroids and they are illegal. all of this research and these facts and disclosures send a message to america's parents. you cannot trust big tech with your kids. parents of america cannot trust these apps with their children. and big tech cannot say to parents, you must be the gatekeepers. you must be social media co-pilots. you must be the app police. because parents should not have to bear that burden alone. we need stronger rules to protect children online. we need transparency, we need
10:25 pm
accountability. i want a market where the competition is to protect children, not to exploit them. not a race to the bottom but a competition for the top. we have said that this moment is, for big tech, a big tobacco moment of truth. i think there is a lot to that contention because it is a moment of reckoning. the fact is that, like big tobacco, despite knowing its products can be harmful -- celebrities, fashion, beauty, everything that appeals to young audiences -- like big tobacco, facebook hid from parents and the public the substantial evidence that instagram could have a negative effect on teen health.
10:26 pm
but the products are different. big tech is not irredeemably bad, like tobacco. big tobacco's products, when used by the customer the way the manufacturer intended, actually kill customers. as ms. haugen said, our goal is not to burn facebook to the ground, it's to bring out the best, to improve and impose accountability. as she said, we can have social media we enjoy, that we can access without tearing apart our democracy, putting our children in danger and sowing ethnic violence around the world. we can do better. and i agree. thank you and we will turn now to the ranking member. >> thank you, mister chairman. and thank you to our witnesses for being here today. we do appreciate it.
10:27 pm
mister chairman, i thank your staff for the good work they have done to facilitate these hearings on holding big tech accountable. we appreciate that. and today, the conversation is something that is much needed. and it is long overdue. for too long, we have allowed platforms to promote and glorify dangerous content to their kids and teenage users. in the weeks leading up to this hearing -- from mental health professionals who are all wondering the same thing, how are we going to let this continue? and what will it take to finally crack down on the viral challenges, the illicit drugs, the eating disorder content and the child sexual abuse material? we find this on your platforms.
10:28 pm
and teachers, parents and mental health physicians cannot figure out why you allow this to happen. it seems like every day i hear stories about kids and teens who are suffering after interacting with tiktok, youtube and snapchat. kids as young as nine have died doing viral challenges on tiktok. and we've seen teen girls lured into inappropriate, sexual relationships with predators on snapchat. you are parents. how can you allow this? how can you allow this? i've learned about kids and teens who committed suicide due to bullying they suffered on these sites. and the platforms' refusal to work with law enforcement and
10:29 pm
families to stop this harassment when it happens. if it was your child, what would you do to protect your child? does it matter to you? my son has viewed abusive content featuring minors and videos of people slitting their wrists on youtube. it's there. all the while, kids and teens are flocking to the sites in increasing numbers, and the platforms love it, as they know that youth are a captive audience, one that will continue consuming the content that is fed to them through these algorithms, even if it is putting them in danger. they're curious. they get pulled down the rabbit hole. they continue to watch. and these platforms are getting more and more data about our
10:30 pm
children, but do we know what they're doing with that data? in the case of facebook, we learned that they're using it to sell their products to younger and younger children, those who cannot legally use their services. but these platforms, you all know, you have children on these platforms that are too young to be on these platforms. and you allow it to continue because it is money. money in the bank. it is money in your paycheck. and obviously, money trumps everything. and with some of our witnesses today, we have real reason to question how they are collecting and using the data they get from america's children and teens. i've made no secret about my concerns that tiktok, which is owned by beijing-based bytedance, is paving the way for the chinese government to
10:31 pm
gain unfettered access to our children and teens. and tiktok, despite its executives insisting that they, and i'm quoting, store data outside -- has not alleviated my concerns in the slightest. in fact, earlier this year, they changed their privacy policy to allow them to collect even more data on americans. now, they want your faceprints and voiceprints in addition to your geolocation data and your keystroke patterns and rhythms. they also collect audio that comes from devices connected to your smartphones, like smart speakers. they collect face and body attributes from videos, as well as the objects and scenery that appear in those videos. this makes sense to some degree for creating videos. but given
10:32 pm
china's propensity to surveil its own citizens, why should we just trust that tiktok, through bytedance, isn't doing the same to us? most americans have no idea this is happening. and while they have done things to reassure us, by creating u.s.-based offices for their high priced lobbyists and marketing personnel, it just is not enough. we see through it all, as we will get into today. tiktok's own privacy policies give them an out to share data with bytedance and the chinese communist party. yet most americans have absolutely no idea that the chinese communist party is getting their information. the time has come for us to focus on what congress can do to secure american consumers'
10:33 pm
personal data. as a mother and a grandmother, i know this is doubly important when it comes to our children, as this hearing will show. thank you to the witnesses, we look forward to your cooperation. thank you, mister chairman. >> thanks, senator blackburn. we have with us this morning miss jennifer stout, vice president of global public policy at snapchat. she spent most of her career in government, working for then senator joe biden and at the united states department of state. mr. michael beckerman is vice president of public policy, americas, at tiktok. he joined tiktok in december of 2020. he was previously the founding president and ceo of the
10:34 pm
internet association. we are joined by miss leslie miller, vice president of government affairs and public policy at youtube. miss miller previously served as acting head of global policy for google. why don't we begin with your testimony. >> thank you, mr. chairman. chairman blumenthal, ranking member blackburn, and members of the subcommittee, thank you so much for the opportunity to be here today. my name is jennifer stout and i am the vice president of global public policy at snap. i've been in this role for nearly five years after spending almost two decades in public service, more than half in congress. i have tremendous respect for this institution and the work that you are doing to ensure that young people are having safe and healthy online experiences. to understand snap's approach to
10:35 pm
protecting young people on our platform, it is helpful to start at the beginning. snapchat's founders were part of the first generation to grow up with social media. like many of their peers, they saw that while social media was capable of making a positive impact, it also had features that troubled them. these platforms encourage people to broadcast their thoughts permanently. young people were constantly measuring themselves by likes, by comments, trying to present a perfect version of themselves because of social pressures and judgment. social media also evolved to feature an endless feed of unvetted content, exposing individuals to a flood of viral, misleading and harmful information. snapchat is different. snapchat was built as an antidote to social media. from the start, there were three key ways we prioritized privacy and safety.
10:36 pm
first, we decided to have snapchat open to a camera instead of a feed of content. we created a blank canvas for friends to visually communicate with each other in a way that was more immersive than text. second, we embraced strong principles and the idea of ephemerality, making images delete by default. social media may have normalized having a permanent record of conversations online. but in real life, friends don't break out a tape recorder to document every conversation. third, we focused on connecting people who were already friends in real life by requiring that both snapchatters opt in to be friends in order to communicate. because in real life, friendships are mutual. we have worked hard to keep evolving responsibly. understanding the potential negative effects of social media, we made proactive choices to ensure that all of our future products reflected those early values.
10:37 pm
we were influenced by existing regulatory frameworks that govern broadcast and communication in developing the parts of our app that had the potential to reach a large audience. discover, which is our closed content platform and features professional media publishers and verified users, and spotlight, where creators can submit creative and entertaining videos to share, are both structured in a way that does not allow unvetted content to get reach. our design protects our audience. when it comes to young people, we've made intentional choices to apply additional protections to keep them safe. we've adopted responsible design principles and rigorous processes that consider the privacy and safety of new features from the beginning. we take into account the sensitivities of young people. we intentionally make it harder for strangers to find minors by not allowing public profiles
10:38 pm
for users under 18. we have long deployed age gating tools to prevent minors from viewing age regulated content and ads. we make no effort and have no plans to market to young children. individuals under the age of 13 are not permitted to create snapchat accounts, and if we find them, we remove them. additionally, we are developing new tools that will give parents more oversight over how their teens are using snapchat. protecting the well-being of our community is something we approach with both humility and determination. over 500 million people around the world use snapchat, and 95% of our community says snapchat makes them feel happy, because it connects them to their friends. we have a moral responsibility to take into account the best interests of our users in everything we do. and we understand there is more work to be done. as we look to the future, we
10:39 pm
believe that regulation is necessary. but given the different speeds at which technology develops and the rate at which regulation can be implemented, regulation alone cannot get the job done. technology companies must take responsibility to protect the communities they serve. if they don't, government must act to hold them accountable. we fully support the subcommittee's approach and efforts to investigate these issues and we welcome a collaborative approach to problem solving that keeps our young people safe online. thank you again for the opportunity to appear today and i look forward to answering your questions. >> thanks, ms. stout. >> chairman blumenthal, ranking member blackburn and members of the subcommittee, my name is mike beckerman, and i am the vice president of public policy for tiktok. i'm also the father of two young daughters, and passionate about ensuring that our children stay safe online. i joined tiktok after a
10:40 pm
decade of representing -- because i saw an opportunity to help tiktok responsibly grow from a young start-up to a trusted entertainment platform. tiktok is not a social network based on followers or social graphs. it's not an app people check to see what their friends are doing. you watch tiktok. you create on tiktok. the passion and creativity and diversity of our community has fueled new cultural trends, chart topping artists, and businesses across the country. it's been a bright spot for american families to create videos together, and i've heard from countless friends, family and even members of the senate and your staff about how joyful, fun, entertaining and authentic tiktok content truly is. and i'm proud of the work that our team does every day to safeguard our community, and that our leadership makes safety and wellness a priority, particularly to protect teens on the platform. being open and humble is
10:41 pm
important to the way we operate at tiktok. in the context of the meeting today, [inaudible] and stakeholders to constantly improve, ask where we can do better, hold ourselves accountable and find solutions. turning a blind eye in areas where we can improve is not part of our company's dna. most importantly, we strive to do the right thing, protecting people on the platform. when it comes to protecting minors, we've worked to create age appropriate experiences for teens throughout their development. we proactively built privacy and safety protections with this in mind. for example, people under 16 have their accounts set to private automatically. they can't host live streams and they can't send direct messages on our platform. and we don't allow anyone to send off-platform videos, images or links by direct message. these are perhaps underappreciated product choices that go a long way to protect teens.
10:42 pm
we made these decisions, which are counter to industry norms or our own short term growth interest, because we're committed to doing what's right, building for the long term. we support parents in their important role to protect teens. that's why we've built parental controls called family -- that empower parents to link their tiktok account in a simple way from their own device to their teenager's account, to enable a range of privacy and safety controls. i encourage all the parents listening to the hearing today to take an active role in your teen's own app use. if they're on tiktok, please check out the youth portal, read through our guardian's guide and our safety center. our tools for parents are industry leading and innovative, and we're always looking to add and improve. it's important to know that our work is not done in a vacuum. it's critical for platforms, experts and governments to collaborate on solutions and protect the safety and well-being of teens.
10:43 pm
that's why we partnered with common sense networks to help ensure and provide age-appropriate content in our under-13 experience. we also work with the national parent teacher association, the digital wellness lab at boston children's hospital, and our u.s. content advisory council. tiktok has made tremendous strides to promote [inaudible] we also acknowledge and are transparent about the room that we have to grow and improve. for example, we're investing in new ways for our community to enjoy content based on age appropriateness or family comfort. and we're developing more features that empower people to shape and customize their experience in the app. but there is no finish line when it comes to protecting children and teens. the challenges are complex. but we are determined to work hard to keep the platform safe and to create age-appropriate experiences. we do know trust must be earned. and we're seeking to earn trust
10:44 pm
through a higher level of action, transparency and accountability, as well as the humility to learn and improve. thank you for your leadership on these important issues, and i look forward to answering your questions. >> thank you, mr. beckerman. >> miss miller, i hope you're with us. >> sorry, i think i'm having a bit of technical difficulty. can you hear me and see me? >> we can hear you and see you. >> thank you, senator blumenthal, senator blackburn and distinguished members of the subcommittee. thank you for the opportunity to appear before you. i am leslie miller and i am vice president of public policy at youtube. as kids grow up, it is crucial to put in place protections that allow them access to information in
10:45 pm
an age-appropriate way. we do this by creating safe environments that allow children to express their imagination and curiosity and allow families to create the right experience for their children. our internal teams include experts who come from child development and child psychology backgrounds. they work closely to ensure that product design reflects an understanding of children's unique needs and abilities and how they evolve over time. the advice from trusted experts informs the youtube child safety policies. our child safety specific policies, which i describe in greater detail in my testimony, prohibit content that exploits or endangers minors on youtube. we use machine learning and human reviewers across the
10:46 pm
globe in order to remove harmful content as quickly as possible. between april and june of this year, we removed roughly 1.8 million videos for violations of child safety policies, of which about 85% were removed before they had ten views. we are constantly working to improve our safeguards. we have also invested significant resources to empower parents with greater control over how their children view content on youtube. in 2015, we created youtube kids as a way for kids to more safely pursue their curiosity and explore their interests while providing families with tools to control and customize the experience for their families. videos on youtube kids include popular children's videos, diverse new content and content from trusted partners. after we launched youtube kids, we heard from parents that tweens had different needs
10:47 pm
-- and that is why we worked with experts in child safety and child literacy. we call this the supervised experience. we launched it earlier this year on the main youtube platform. parents now have the option of choosing among three different settings: content generally suitable for ages nine plus, or 13 plus, and then most of youtube. the most of youtube option excludes all age restricted content on the main platform. we want to give parents the controls that allow them to make the right choices for their children. on youtube kids, and even on youtube for all under-18 users, autoplay is off by default. in the coming months, we'll be launching additional parental controls in the youtube kids app, including the ability for
10:48 pm
parents to choose a locked default autoplay setting. these settings are also on by default. youtube treats personal information from anyone watching children's content on the platform as coming from a child, regardless of the age of the user. this means that on videos classified as made for kids, we limit data collection and use. and as a result, we restrict or disable some product features. for example, we do not serve personalized ads on this content on our main youtube platform and we do not support features such as comments or live chat. and to be clear, we have never allowed personalized advertising on youtube kids or youtube supervised experiences. there is no issue more important than the safety and well-being of our kids online and we are committed to working closely with you to address these challenges. i would like to thank you for the opportunity to appear here today and i look forward to
10:49 pm
answering your questions. >> thanks, ms. miller. we will do five minute rounds. we have votes at 11:00. we have three votes. but i think we will be able to go through the questioning and testimony, and if necessary we will take a brief recess. let me begin. as you know by now, in august, senator blackburn and i wrote to facebook asking whether they had done any research, whether they had any facts that showed harm to children. and in fact, they denied it, they dodged the question. they claimed that instagram is not harmful to teens. let me ask you, i think it's pretty much a yes or no question, the same question we asked facebook. have any of your companies conducted research on whether your apps can have a
10:50 pm
negative effect on children or teens' mental health and well-being, and whether your apps promote addiction? have you done that research? ms. stout? >> senator, we have conducted research, much of it is focused on our products and how we can improve our products and services and meet the needs of users. as i mentioned in my testimony, some of the research we did shows that 95% of users say that snapchat makes them happy. >> will you make that research available to the subcommittee? >> yes, senator, we would. >> mr. beckerman? >> we believe that research should be done in a transparent way. we partner with external stakeholders to get their feedback. we think it's something that everyone can work together on. we have supported passage of
10:51 pm
the camra act and funding from the nih -- we'd love to see this done in an external and transparent way. >> ms. miller? >> yes, we work with experts to make sure our products and policy decisions are up to date, based on their insights. >> i've asked whether research has been done that could show negative effects. you've all indicated that you've done that research and ms. stout has indicated that the company will make this available. i'm assuming that your company will? ms. miller? >> yes, senator, we have published some research and we will make additional research available. >> let me ask now about the black box algorithms. as you know, these algorithms exist and they function to
10:52 pm
drive content that is sometimes toxic to kids, more and more extreme versions of it. and the consequences are potentially catastrophic. but companies are doing their own homework. they are evaluating their own effects on kids when it comes to addiction and harms. let me ask you a similar question. do you provide external, independent researchers with access to your algorithms, data sets and data privacy practices? in other words, if an academic researcher in child psychology comes to you and wants to determine whether one of your products causes teen mental health issues or addiction, do they get access to the raw data from you without interference? ms. stout? >> senator, i think it's important to remember that on snapchat, algorithms work very differently.
10:53 pm
very little of our content works algorithmically -- >> i want to apologize. i'm going to interrupt to say that the question is about access for independent research on those algorithms that you do use. and there is no question that you do have algorithms, correct? >> correct, we do have algorithms, but they operate very differently. to your question on whether we have had requests from outside researchers or mental health specialists to come and access them, to my knowledge, we have not. >> but you would provide access? >> yes. it's important to understand that algorithms, for us, operate differently. so to compare them against different platforms, it's very -- >> it's one of the facts that independent, external researchers would verify? >> indeed. >> -- >> yes, senator, we believe that transparency in algorithms is important.
10:54 pm
we were one of the first companies to publish publicly a deep dive on how the algorithm works. we invite outside experts and their staff to see how the algorithm works. additionally, it's important to give additional choice to people. in your feed, for example, on tiktok, you -- give transparency. so external access, okay. >> ms. miller? >> senator, we are very transparent as it relates to the way our machine learning works. for example, our quarterly reports summarize the videos and channels we removed based on violating our community guidelines. earlier this year, we rolled out an additional statistic called the violative view rate -- >> i apologize for interrupting, really. the question is whether you provide external independent
10:55 pm
researchers with access to your algorithms and data sets. do you? >> we regularly -- i'm sorry, senator? >> do you provide that access? >> we regularly partner with experts in child development, mental health. >> those are experts chosen by you. if somebody independent came and wanted that access, yes or no, would you provide it? >> senator, it would depend on the details. but we are always looking to partner with experts in these important fields. >> well, i'm going to cite the difference between your response and those of mr. beckerman and ms. stout, which indicates certainly a strong hesitancy, if not resistance, to providing access. >> let me ask you, ms. miller, i think one of the issues here really relates to the claim
10:56 pm
that these sites are transparent, which is belied by our actual experience, and the claim that they favor regulation. facebook has mounted armies of lawyers and paid millions of dollars to fight regulation, whether it is responsible reforms to section 230 or privacy legislation or requirements to be more transparent on algorithms. according to the details made public last week in a multi-state antitrust case, google has, quote, sought a coordinated effort to forestall and diminish child privacy protections in the proposed regulations by the ftc and legislation. that filing describes attempts to encourage facebook and microsoft to fight privacy
10:57 pm
rules and back down on advocacy -- this particular meeting where that exchange occurs. this disclosure made news. but everybody in d.c. really knew it was true. what was known was that it was google's hypocrisy that was finally called out. the fact is that google and youtube have been fighting against privacy behind the scenes for years. it is hidden in plain sight and it is an open secret. we've been watching the ftc we can the existing privacy rules. you have spent vast sums of money fighting california's privacy rules. i want to ask you, ms. miller, what work has youtube done to lobby against congress strengthening online protections for children? and is that report and that claim by the multi state plaintiffs accurate? >> senator, i understand that
10:58 pm
the material you are referencing was regarding our point of view on privacy legislation in europe. our ceo has regularly called for comprehensive privacy legislation in the u.s. and on behalf of youtube, i am not aware of any other efforts, other than to be involved in conversations in a multi-state stakeholder way as it relates to and he legislation or bills that are being introduced regarding the oversight or regulation of companies such as youtube. >> so you are aware of these reports of lobbying against children's privacy and safeguards? they are just totally false? >> i think that we work with lawmaker such as yourself regularly to have conversations to share what we are doing on
10:59 pm
the platform and the updated protections we are putting into place. but also hearing concerns. and to work with you as you contemplate new regulations. >> will you commit that you will support privacy legislation, as has been proposed? >> senator, i'm not steeply involved in the details of any specific privacy legislation. but i commit that we will work with you and partner with you on federal privacy legislation. >> would you support a ban on targeted advertising to children and young teens? >> senator, at youtube, we prohibit personalized advertising on youtube kids, as well as unsupervised experiences. >> would you support a ban? >> we have not waited for laws in order to put those types of protections in place. >> i hope that you will and that is more forthcoming in the next round of questions i give to the ranking member.
11:00 pm
>> thank you mister chairman. mr. beckerman, i want to come to you first. in the path, tiktok has said that it has never nor would it ever sheer and provide user data to the chinese governor government,. yet your privacy policy sales in the data collected to respond to government inquiries. it also says you share data you collect with your parent company and affiliates and that you transmit user information to servers and data centers overseas. and earlier this year, in a chinese communist party acquired an ownership stake and a seat on the board of bytedance. so, does tiktok share user data with its parent company, bytedance? >> thank you, senator. that's an important question and i'm glad you're asking. tiktok does not -- >> excuse me.
11:01 pm
>> quickly please. >> we do not share information with the chinese government and i would like to put you to a citizen lab report which is one of the most well respected global national security f spurts. a research or there's no transmission to the chinese government. tiktok did not -- then the report goes on to say -- >> let me ask you -- >> tiktok does not pose a threat to national security. >> please submit the report for the record. do any bytedance employees have access to tiktok user data or any role in creating their algorithm? >> senator, u.s. user data is stored in the united states or singapore. we have a world renowned usb security team that handles access. >> i understand that you say you store it in singapore. tell me about programmers, product developers and the data teams. are they housed in china? >> senator, like many technology companies, we have
11:02 pm
engineers in the united states and around the world. >> so, they have access to algorithms and data? >> senator, we have engineers in the united states and we also have engineers -- >> that answer is yes? >> what about bytedance says they are fully separate but do -- employees have any access to tiktok user data or input rhythm? >> senator, that's completely different from tiktok. >> no, it's a related company. you might want to check that. if the chinese communist party ask you for u.s. user data, what is to stop you from providing it since they have a seat on the board of bytedance? and they have a financial stake in the company? >> senator, that's not accurate. one, they do not have a stake in tiktok at all. >> yes, they do. >> senator, that's not accurate. >> that is bytedance. we can clarify that for the record. but the record is that the chinese communist party acquired a stake in bytedance
11:03 pm
and august and they now have a seat on the board. so, let's talk about tiktok's privacy policy. it says you collect and keep a wide variety of information, including biometric data, such as face prince, voice prince, geolocation information, browsing and search history. not just on tiktok but on other apps. and keystroke patterns and rhythms. why do you need all of this personal data, especially on our children? which seems to be more than any of their platform collects. >> senator, many outside researchers and experts look at this point out that tiktok actually collects less data than many of our peers. on that keystrokes issue -- >> outside researchers that you're paying for? >> no, senator. >> you would submit that to independent outside researchers. because what we are seeing with all of this biometric data and
11:04 pm
the keystroke patterns that you are exceeding that. so, what do you do with this? are you creating a virtual view of the children that are on your site? >> senator, i don't know what you mean by a virtual. you >> well, a virtual view is you and your presence on life. it's like a virtual dossier. i'm of sure you understand that part. and what you need with all of this information. do you track children viewing patterns. are you building a replication of where they go? their search history? their voice? they're biometrics? and why does tiktok and bite dance and dorian, need that information on our children? >> senator, tiktok is an entertainment platform where people watch and enjoy, create short form videos, it's about
11:05 pm
uplifting, entertaining content. people love it. >> i disagree -- >> that is it from the positive but there's also a negative. and the negative is that you are building a profile, a virtual you. of our children. . you mention the family, parent provision that you have. so, so are they opening their data to tiktok? and is tiktok following them. or following and capturing their searches. tracey mr. beckerman, when you capture all of this data, did you hold all of this data? then you are seeing the private -- invading the privacy of individuals that are on your site. that applies to you, to ms.
11:06 pm
stout, to ms. miller, because you say because you are using the platform, we can do this. but in essence, what you're doing is making our children and their data the product. because you turn around and you sell it and then basically it becomes weaponized against their users. mister chairman, i'm overtime. i have several questions for me stout and miss ms. miller. >> we'll have the second. senator klobuchar. reports indicate that nearly half of kids nine to 12, and a third of kids aged at seven to nine, use social media. platforms like facebook, instead, snap, tiktok, youtube. i don't think parents are going to stand by why while our kids
11:07 pm
and our democracy become collateral damage to a profit game. i heard last night marks a cupboards words to his earnings report. and while he may be out there playing the victim, at his 29 billion dollars quarter earnings report meetings, their true victims are the mom in duluth who can't get her kid off facebook made to do their homework. the dad losing a child to a snap feed filter. that measure their kid going 123 miles per hour, trying to beat the filter. or a child exposed to content glorifying and eastern disorder on tiktok. so, i have had a case right in my state -- two cases actually -- of the young people who got
11:08 pm
drugs through snap. i wanted to first start out with that, with you ms. stout. this is a story, ryan back gerson. he was suffering from dental pain at the beginning of the pandemic. he'd been given a percocet before and a classmate said, -- well what this young man did not know is that this percocet was leased with fentanyl and he died just like that. as his mom said in a letter to me, all of the hopes and dreams we had as birds were erased in the blink of an eye. appearance including devin's mother demanded answers and accountability from staff on this issue in a letter to you in september. missed out i want to know what
11:09 pm
the answers are. will you commit to providing more information about the automated to snap uses to proactively search for illegal drug relating content as the appearance asked? >> senator, i very much appreciate your reading of the issue because it has been a devastating crisis that has been afflicting our young people. i want to make clear we're absolutely determined to remove all cocktail or from snapchat. we've been very public about our efforts in this piece. first of all, we have stepped up our operational efforts. and my heart goes out to the families. i have met with bridgette a. met with her back in april. i've heard from her and other families who understand what is understand. we have stepped up, we have deployed proactive detection measures to get ahead of what the drug dealers are doing. they are constantly evading our tactics. not just on snapchat but every platform. we follow so stepped up work
11:10 pm
with law enforcement. just last, week we had a law enforcement summit where we gathered over 2000 members of law enforcement across the country, so that we can understand what they're dealing with. and find out best practices on how we can get them in the information they need to help their investigation. finally, senators, this is so important. we have deployed an education and awareness campaign, because what is happening on our platforms all across social media and technology platforms, is that young people who are suffering from mental health and stress induced by the pandemic and other issues, they're reaching for substances oftentimes pills and opioids. but the substances are laced with fentanyl. and a fentanyl to kill them. >> but here's my problem. if a kid had just walked into to say, a pharmacy, he wouldn't be able to buy that or get that. but in this case, they can get on your platform and just find a way to buy. and that is the problem. i want to know, are you going
11:11 pm
to get -- i appreciate everything you said. i appreciate you meeting with a month. are you going to get drugs off snapchat when you have all these kids in america looking at these platforms? >> i assure you, this is such a top priority for a tire company. and senator, it's not just happening on our platforms, it's happening on others. so, therefore we need to work collectively with other platforms. the other companies that are here today. have to work together. >> thank you. i think there's other ways to do this, to. as of creating liability when this happens. that will make you work even faster so we don't lose another kid. mr. beckerman, recent investigations by the wall street journal found that tiktok algorithm can push young users into context glorify eating disorders, drugs, violence. have you stop that? >> senator, i do agree with the wall street journal went about saying -- we have a number of
11:12 pm
requirements to have age-appropriate conte on tiktok. >> okay, and what are those changes? are you still are they completely protected now from this content? >> the content related to drugs, as you're pointing out, violates our community guidelines. over 97% of violent content is removed proactively. of course, we want to get to 100 percent. >> are you aware of any research studies conducted about how your app the push content promoting eating disorders to teens? >> no, senator. >> okay. did you ask for any proof, studies on eating disorders before testifying? >> not that i'm aware of. we do work with outside experts to understand all of these issues. i think he should be down a transparent way. as i mentioned earlier, i'd love to see -- perhaps we could have additional research in the public domain that all of us could learn from. >> all right, i'll save my questions for ms. miss miller
11:13 pm
in the next round. thank you. >> thanks senator klobuchar. again, i'd remind everybody that you've committed to provide research that you refer to when you responded to senator klobuchar. we look forward to receiving it in days or weeks, not months. particularly appreciate senator klobuchar's move statement as related to liability. i understand -- >> mr. chair if i could put this letter in from the pair into the record. >> yes. >> thank you. we've been joined by senator cantwell remotely. >> mister chairman, i would defer to my colleague senator mark -- and senator baldwin. >> senator martin? tor markey.>> thank you, mister. very much. the problem is clear.
11:14 pm
you concentrate on children and teens to make more money. now is the time for the legislative solutions to these problems. and that starts with -- i've introduced bipartisan legislation to give children and teens the piracy bill of rights for the 21st century. today, a 13 year old girl on these apps has no privacy rights. she is nobility to say no. no, you can't gobble up data about. we know, you can't use that data to make algorithms that push popular content towards me. no, you can't profile me to manipulate me and keep me glued to your apps. no, you have no rights. 13 year old girl in the united states of america, in the year 2021.
11:15 pm
my bipartisan children privacy protection act gives 13, 14 and 15 year olds that right to say no. to each witness, do you support my legislation to update the privacy protection act to give that 13 year old, 14 year old and 15 year old control of their data? ms. stout? >> senator markey, i want to say that i absolutely support a federal privacy proposal. we have worked hard with members of the body -- >> do you support my child's protection rights protections? >> we believe there should be additional protections to protect -- >> so you have had a chance to look at this children's online privacy protection updates, they have been out there for a year, do you support it, yes or no -- >> senator --
11:16 pm
>> this is what drives us crazy. we want to talk, we want to talk, we want to talk. this has been out there for year, this bill. you still do not have a view on it. do you support it or not? >> we would like to work with you, senator. this is -- >> this is just a game. mr. do you support the act being updated the way my legislation does? >> yes, we agree that it needs to be updated, particularly as it relates to -- it's an area that i -- have not given as much attention but we do -- agree that it should be updated. >> the support my legislation to update it? you've had plenty of time to look at it. >> i like your approach however i believe that there should be included a better way to verify age across the internet and i think with that improvement it is something that we would be happy to support. >> ms. miller? >> senator, we also support the
11:17 pm
goals of updated comprehensive privacy legislation on your specific bill, i would leave your conversation with the staff. and i would welcome continuing to do that. >> it is going to happen soon. thank you senator blumenthal, thank you senator blackburn. [inaudible] your platforms are full of young teens. among young teens, 47% say they are -- -- say they are on snapshot. 81% say they are on youtube. they deserve the right to say no, you cannot track me. do you agree with that? do you agree with that? you can track me? >> yes, i agree with that. >> do you agree with that mr. pecker man? >> yes, senator. >> do you agree with that, ms. miller? >> yes, we have tools for all
11:18 pm
of our users to handle and control and make choices as it relates to the information that is gathered. >> the bill also would ban targeted ads to children that should never be allowed to track a ten year old browsing history and bombard him with ads based on that data. ms. miller, you said that you two kids platform prohibits targeted ads to children. do you agree that congress must ban targeted ads to children? >> senator, i defer to you and your peers in terms of what you would want to move forward. but again, we have not waited for laws like this -- >> no, i understand. would you support a uniform banning of this practice. you've already adopted it as a country. would you support that being the standard that we have for all platforms across the
11:19 pm
country? >> as you describe it, it's consistent with the approach we have already taken. >> and you would support? that is what you are saying? >> senator, again, we are already doing this. >> no, i'm trying to -- >> i've already said -- >> we are trying to draft a law. would you support the -- being in the law, the -- prohibited? >> senator, yes. we already prohibit -- >> it's a yes so that we can legislate it. so we can legislate. ms. beckerman and ms. stout, same question, should we been targeted ads to kids? >> we offer that already. >> would you support that as a national law? >> senator, an example of that has been in the age-appropriate design code which we add here to. we are looking at exactly that -- >> do you support it as the law to be passed. if you say it's wrong, should we prohibit it?
11:20 pm
>> we offer those and we agree with the approach -- >> so yes, do you support it? yes or no? >> we agree with your approach so you are applying it to -- >> i appreciate what you are doing. do you support it? will you support a law that -- >> thank you senator -- i think we are very close, senator. >> mr. beckerman? >> yes, senator, and i would also say we should go beyond that. certain categories should not be shown to kids and young adults at all. >> okay, so we also need to go beyond privacy and design features that harm young people. changing like buttons. senator blumenthal and i have a bill, the kids act, which would ban these and other features that quantify popularity. these features turn apps into virtual popularity contests. and they are linked to feelings
11:21 pm
of rejection, low self-worth and depression. even youtube kids has acknowledged this problem and does not have like buttons. each whiton witness, should congress prohibit these features that amplify popularity -- among content for kids? >> we don't think it should be a popularity contest so we don't -- >> -- >> i think this one is a little more complex. but we have implemented much of the age-appropriate design code here in the united states and would encourage that elsewhere. >> so you would -- i don't know that there was any answer in that. it's complicated. do you support ending it or not? >> bending? likes >> spending likes, yes. >> again, if you want to set it by age -- >> ms. miller? >> senator, as you've noted, we
11:22 pm
already prohibit things on youtube kids such as being able to comment. we would support working with you in regulation in this area. >> so you would support working with us, but would you support banning likes? >> yes, senator, we are reddit or two -- on the youtube platform. >> the american academy of pediatrics says to create a national state of emergency around children and teen mental health. we need to outlaw the online features that exacerbate this crisis. the question that we have to answer, ultimately, is whether or not, for example, we are going to ban auto play for kids. the feature of when one video ends another one begins, it could stay glued to their phone so the apps collect more data and make more money.
11:23 pm
82% of parents are worried about their kid screen time. to each of you today, do you agree that congress should ban auto play for kids? yes or no? ms. miller. we will start with you this time. >> yes, senator, each of the items you are outlining, we already prohibit. we have auto default set to auto playoff on youtube kids as well as for supervise experiences. >> so you would support that being legislated? >> yes, sir. >> mr. beckerman? >> we have take a break videos and time management tools, senator. but as it relates to time management on tiktok -- >> so would you support that legislation passing? >> we would be happy to talk to you about it. >> you don't do it. >> again, i think it's important as we look at
11:24 pm
age-appropriate features 14, it's something we built into tiktok proactively. but as we look at legislation, i think a first app is something around age verification process. >> again, this is a historic problem. ms. stout, would you support it? >> senator markey, i don't believe we have auto play on snapchat so i would defer and say that that is something we need to look at more closely. i'm not familiar with that piece, the proposal in your legislation. >> we have a lot of work to do and we have -- i in my time, mister chairman. >> [inaudible] finally start acting. >> thank you, senator markey, thank you for all your good work. i see senator cantwell. oh sorry, senator -- [laughs] >> thank you, mister chairman. we all know social media offers a lot of benefits and
11:25 pm
opportunities. but i have concerns about the lack of transparency online and the limited accountability of big tech. one of the major problems in social media that has been increasingly concerning is the social media platforms used, the algorithms to shape user experience, resulting in individuals being trapped in a filter bubbles. solar bubbles can be particularly troubling for younger users. for instance, a recent wall street journal article described in detail how tiktok algorithms serve up sex and violence and drug videos to minors. i have a filter bubble transparency act in another bill that would make significant strides in addressing the lack of transparency online and importantly the filter bubble transparency act would give the option to not be manipulated by opaque algorithms. latedlet me ask, do you believe consumer should be able to use
11:26 pm
platforms without being manipulated by algorithms designed to keep them engaged on the platform? i will start with mr. beckerman. >> i agree that there needs to be additional transparency and how the algorithms work. >> ms. miller? >> we do provide transparency, senator, in a way around our algorithms working. >> senator thune, it's important that -- very small set of content. when users get to select interest categories, they then determine the kind of content that they are served up. -- unlimited list of user generated content. >> i don't know if you or ms. miller answered the question, and that is, should consumers who use the social media platform to be able to use them
11:27 pm
without being manipulated by algorithms? >> senator, yes, i agree with you. >> ms. miller? >> yes, senator. >> mr. beckerman, -- the wall street journal article -- served up sex and drug videos to minors. >> senator, thank you for the question. serving up sex and drugs are violations of our guidelines. as for the wall street journal article, we disagree with that being an authentic -- >> your platform is perhaps more driven than any other social media platform available today, more so even then facebook. unlike baseball, tiktok algorithm is not constrained by user social network. on july 19th, 2020, tiktok's former ceo wrote and i quote, that we believe all companies should disclose their algorithms, moderation policies and data flows to regulators, and quote. tiktok also states on its
11:28 pm
website that it makes tiktok source code available for testing and evaluation to get at its transparency and accountability center. as tiktok disclose their algorithms, moderation policies and data flows to any state or federal regulators? >> senator, as we pointed out, we have transparency centers and we have done, i think, over 100 reviews with members of senate staff and others in the u.s. government and would be happy to be transparent with how that works. >> i think that maybe senator blumenthal touched on this but in keeping with tiktok's disclosure of practices announced in july, 2020, would you commit to providing tiktok algorithms, moderation policy and data flows to this committee so that we may have independent experts review them? >> yes, sir. >> thank you. >> ms. miller, does youtube engage in efforts to change its users attitudes, behaviors or
11:29 pm
influence its users in any way? >> senator, when users come to you to, they come to search and discover all types of content, for example, how to bake bread. so watch a church service or to do exercise. and as a result, they are introduced to a diversity of content that is not based on a particular network they are a part of. in so doing, there may be additional videos that are recommended to them, based on sub signals because signals will be over written to make sure that we are not going to get harmful content. >> back to mr. beckerman. all chinese internet companies are compelled by china's national intelligence law to turn over any and all data that the government of man's. that power is now limited by china's border. has tiktok provided data to chinese government on chinese persons living in the united states or elsewhere outside of
11:30 pm
china? >> no, senator. tiktok is also not available in china. i'd like to point out that our servers and u.s. data are stored in the united states. >> does tiktok sensor videos of take men, the famous video of the young man who stood his ground in front of a profession of chinese armies tanks during the tiananmen square crackdown in beijing. >> no, senator. you can find that content on tiktok a few search for. it >> mr. chairman, i would suggest that has already been pointed out, i think there are number of things we need to address. congress need to be heard from in the space. particularly with respect to the use of algorithms in the way that users are manipulated and up as you've already pointed out, particularly, young people. i hope we can move quickly and directly and any meaningful way to address the situation. >> thank you.
11:31 pm
i think we have strong bipartisan consensus on that issue. thank you, senators. >> thank you, chairman blumenthal. i'd like to just know that the series of hearings really began with a revelation that internal research at facebook revealed the negative impacts on teenagers bonnie images, from using the company's instagram platform. we learned based on research by the chairman staff, how quickly somebody on instagram can go from viewing content unhealthy eating to be directed to postings that focused on unhealthy practices, including glorifying eating disorders. i know we don't have facebook on instagram before us today, but i'm particularly concerned
11:32 pm
about the impact that type of content can have on young users. i recently joined senators klobuchar and capito, with whom we sponsored the adele weston act. education on eating disorders on, the letter to facebook and instagram. giving more details about how they handle this issues. but i want to ask, can you basically outline the steps your companies are taking to remove content that promotes unhealthy body image and eating disorders and direct users to supportive resources in stead? in particular, how are you focusing on this issue with regards to your younger users? when we start with mr. beckerman and tiktok. >> think you, senator.
11:33 pm
i myself have to your young daughters. is something i care a lot about. one, i want to show you assure you we do aggressively removed content like you described, that would be problematic for eating disorders. second, we work with our day outside groups and direct people that are seeking help. one thing we encourage people that are struggling with eating disorders or other weight loss issues, come to tiktok to express themselves in a positive way. and lastly, we don't allow ads that target people based on weight loss and that kind of content. >> ms. stout? >> thank you senator baldwin. i want to be clear the content that you described, content that glorifies eating disorders or self harm is a complete violation of our community guidelines. also, as i described earlier, week to allow unvetted, and moderated content from being surfaced up to our users discover, which is argue
11:34 pm
publisher platform. which we partner on with people and publishing companies like the wall street journal or nbc news. all of that content is vetted and moderated ahead of time. >> can i interrupt and ask -- is that done through ei or -- >> no, these are handpicked partners that snapchat has elected to stay in this closed garden of content, which is discover, we will allow certain publishers and media companies to provide news, entertainment content. see empty, see be -- the washington post in fact. so, users can come, look at that content and it's all pre moderated and curated. so it's not unlimited user generated content where you could go down rabbit hole putt taps and access that kind of hurtful damaging content. but i think you raise a very interesting question. what are the -- how are we helping users find
11:35 pm
positive resources? we did conduct research about the mental health effects of body image and self harm. we created a product called here for you. this was created in 2020 at the height of the pandemic. when users are in snapchat, they search anorexia or eating disorders, instead of perhaps being there to content that could be harmful. content which is against our guidelines, we now surface expert resources that show content that can help that user. maybe help them maybe help their friends. but this is a redirection of that kind of search for potentially hurtful and harmful content that then stares the users to resources that help them or member of their circle of friends. >> ms. miller? >> senator, we take a comprehensive a really holistic approach on topics like these. we prohibit content that
11:36 pm
promotes or glorifies figures such as eating disorders. it has no place on our platforms. but we also realize that users come to share their stories about these experiences, find a community. let alone, to find authoritative sources which is what we race up on searches. in addition, we also rule out programs and initiatives such as the with the campaign, whereby we are encouraging users to spend their time, particularly during covid in pursuing healthy habits. so, we look at this in a very holistic way to make sure that youtube is a platform where people come and they have a healthy experience, and we again, prohibit that type of content that glorifies for promotes these issues such as eating disorders. >> if i can follow up as i did
11:37 pm
with ms. stout. when you remove content, how are you filtering back out? are you using artificial intelligence? are you using a team of humans who are looking at that content and deciding whether to remove it or not? ? ms. miller:>> it's a mix, senat. it's a mix of when we develop content policies, we rely on experts to inform the development of these policies. and we have machine learning to help us capture this type of content at scale. you'll see in our quarterly transparency report that more than 90% of content that violates our community guidelines are flagged by machines. then there is a mix of human reviewers. >> thank you. i want to thank the senator blumenthal and senator blackburn for holding the subcommittee hearing. and as the witnesses can see, our colleagues are well informed and very anxious to
11:38 pm
get legislative fixes to things they think are crucial, to protect individuals and protect people's privacy. so, i want to thank them for that. yesterday, vice's motherboard had an article whose headline was basically that location data from apps is given out even when people have opted out. so, i'm going to enter this for the record unless there's objection -- a quote: the news highlights a stark problem for smartphone users, that they can't actually be sure some apps are respecting their explicit preferences around data sharing. the data transfer also presents an issue for the location data companies themselves. basically, these companies are reporting information about location even when people have explicitly opted out.
11:39 pm
so, they're continuing to collect this information. that is what the reporters and researchers at motherboard found. i have a question. do you believe that location data is sensitive data and should be collected only with consumers' consent? all the witnesses, please. >> yes, senator. we agree. >> yes, senator, we agree. >> yes, senator. and users have access to their account under my activity and my account and can modify their settings, delete their history and things of that nature. it's all just one click away. >> so any federal privacy law should make sure that is -- i see a nod. >> yes, senator. >> yes, senator. >> yes, senator. >> thank you. do any of you share location data with the companies named in this article?
11:40 pm
they're a major data broker. >> senator, i've never heard of that company. i'm not aware. >> okay. >> i'm not aware of the company, but we also don't collect gps data. >> well, it would be affiliated with them in some way. i mean, they're getting this information somewhere. so, i'm sorry, the last witness, if you know -- >> senator, i'm also not aware of the company. >> maybe you can help us for the record on this, so that we know. this is exactly what the public is frustrated about and concerned about, particularly when harm can be done. they go to a website, they say to the website, i don't want my sensitive information to be shared. then there is a conglomerate of data gathering on top of that that is not honoring those wishes as it relates to the interface of those apps. so i think this is exactly why we need a strong privacy law
11:41 pm
and why we should protect consumers on this. in that facebook hearing we had, there was this discussion about advertising and the issue of whether advertisers knew exactly what content their ads were being placed next to. now, i get that we're all seeing a migration of major companies like procter & gamble and others who are moving off of the internet, because they're like, i'm done with it. i don't want my ads, because it now runs by a system, and i don't want my ads appearing next to certain kinds of content. what was more startling is that there may be actual deceptive practices here, where people are seeing this content is one thing, when in reality it is something else. in some of these cases we are discussing facebook, of hate speech and other content we don't think should be online. on your websites, do
11:42 pm
advertisers know what content they're being placed with? >> senator, i can respond to your question. yes, our advertisers do know where their advertising shows up. in discover, which is that closed, curated garden, those advertisements appear next to publishers and verified users that we have hand selected and allowed to appear. so, on a platform like snapchat, there is no broadcast disinformation or hate speech. that's why i think snapchat is in fact a very appealing place for advertisers, because they know their advertisements will be placed next to safe content. >> yes, mr. beckerman? >> yes, senator. advertisers come to tiktok particularly because our content is known for being so uplifting and fun. we see ads that are very much like tiktok videos, which are the same thing.
11:43 pm
>> ms. miller? >> senator, we have worked with our advertising partners over the years to make sure that they understand that advertising on youtube is safe, in the same way we have worked significantly to make sure the users themselves have a safe experience on the platform. the advertising association has recognized the work that we have done in this space; their brands are safe on the platform. >> thank you. i'll probably have a follow-up on this. but senator lee? >> thank you, madam chair. ms. miller, i'd like to start with you if that's all right. i want to ask you a particular question regarding youtube's app age rating. now, google play has the app rating set at teen, meaning 13 and up, while the apple app store has it rated at 17 and up. can you tell me why this
11:44 pm
disparity exists? if apple determines that the age rating for youtube ought to be 17 and up, why does google determine that its own app should be rated teen, meaning 13 and up? >> senator, i'm familiar with the differences that you've just outlined, but i would be happy to follow up with you and your staff once i get more details on this. >> i'd love to know -- it is a simple question, and i understand it may not be easy to answer right now. i understand if you don't have the information. but i would just like to know why that difference exists, and whether you agree or disagree with the fact that google has rated its own app 13 and up while apple has rated it 17 and up. but i'm happy to follow up on that in writing or otherwise. ms. stout, i want to address a similar issue regarding snapchat.
11:45 pm
snapchat is rated 12 and up by apple, and it's rated teen on the google play store. any reason why there is a disparity there? >> a very good question. i'm not sure why apple lists it as 12 and up -- snapchat is intended for an audience 13 and above. and the -- the content that appears on snapchat is appropriate for -- >> let's talk about that for a minute. i beg to differ. in anticipation of this hearing, i had my staff create a snapchat account for a 15 year old child.
11:46 pm
they didn't select any content preferences for the account. they basically had a name and a birth year and an email address. when they opened the discover page on snapchat, with its default settings, they were immediately bombarded with content that i can most politely describe as wildly inappropriate for a child, including recommendations for, among other things, an invite to play an online sexualized video game that marketed itself to people who were 18 and up, tips on, quote, why you shouldn't go to bars alone, notices for video games rated for ages 17 and up, and articles about porn stars. let me remind you that this
11:47 pm
inappropriate content that has by default been recommended for a 15 year old child is something sent to them by an app that is using the default settings. so i respectfully but strongly beg to differ with your characterization of the content, that it is in fact suitable for children 13 and up, as you say. now, according to your website, discover is a list of recommended stories. how and why does snapchat choose these inappropriate stories to recommend to children? how does that happen and why would that happen? >> so, senator, allow me to explain a little bit about discover. discover is a closed content platform and yes, indeed, we do
11:48 pm
select and hand select the partners that we work with, and that kind of content is designed to appear on discover and resonate with an audience 13 and above. i am unfamiliar with, and have taken notes about, what you have said about what your account surfaced. i want to make clear what our content and community guidelines suggest, that any online sexualized video games should be -- i'm not sure why that was in an account for a 15 year old. these community guidelines and publisher guidelines are meant to provide an age-appropriate experience -- >> you have these community guidelines, and advertisers and media partners agree to additional guidelines. one of these additional guidelines and -- i can only guess that they
11:49 pm
permit these age-inappropriate articles to be shared with children. how would that not be the case? >> senator, these additional guidelines provide that publishers may not glorify violence, that any news articles must be accurately fact checked, that there is no -- >> i'm sure the articles about the porn stars were accurate and fact-checked, and the tips on why you shouldn't go to bars alone were fact checked. but that's not my question. it is about whether it is appropriate for children aged 13 and up. >> absolutely, and i think this is an area where we are constantly evolving, and if there are instances where these publishers are surfacing content to an age cohort that is inappropriate, then they would be removed from our platform. >> so do you review them? what oversight do you conduct? >> we use a variety of human review and automated review. so i would very much be
11:50 pm
interested in talking to you and your staff about what kind of content this was, because if it violates our guidelines, this content would come down. i would agree with you -- the kind of content that's promoted on discover, there is no content there that is illegal, none that is hurtful. it really is intended to be a closed system where we have better control over the content that surfaces. >> i will follow up briefly. >> thank you so much, madam chair, then senator lujan will speak. >> snapchat -- how does snapchat decide what content is pushed to the top of the discover page? >> senator, if you go into your snapchat account, there is the ability to select preferences,
11:51 pm
there are several categories that a user can select or unselect, if they wish. that can be that they like to watch movies, or they enjoy sports, or they're fans of country music. at any point, it is completely transparent, and a user has the ability to go in and select what they like, and that determines the kind of content. if there is any content that they don't like, they can indicate that, and discover would -- >> thank you. we've got to get to the bottom of these ratings. they are inappropriate. we all know there is content on snapchat and on youtube and other places that is not appropriate for children ages 12 or 15 and up. >> thank you. it's not appropriate to tell -- over -- it is next to content that is appropriate.
11:52 pm
senator lujan? >> thank you. you mentioned in your testimony that all content on spotlight is human reviewed before it can be viewed by 25 people. does that human review check help snapchat reduce potentially harmful content on spotlight? >> we believe it does, yes. >> and i believe that the snapchat approach to this problem prevents content from going viral. however, companies often thank congress and then, once attention is diverted and the public is distracted, they go around and do the very same things we warned against. can i hold you to that? will snapchat continue to keep a human in the loop on what content is algorithmically promoted to large audiences? >> senator, this is the first
11:53 pm
time i've testified here before congress, so please hold me to it. at snapchat, we have taken a very human-moderation-first approach, particularly on our spotlight platform. human moderation will continue to play a large part in how we moderate content on our platform and how we keep users safe. >> i'm glad to see the importance of platforms taking responsibility before they amplify content, and especially before they publish it to a massive audience. it's something many of us share. that's why i introduced the protecting americans from dangerous algorithms act as well. online platforms must be held responsible when they're actively promoting dangerous, hateful content. ms. miller, i am grateful youtube is making an effort to be transparent regarding the number of users who view content in violation of the community guidelines. however, i'm concerned about a trend. earlier this year, i wrote a letter to youtube with 25 of my colleagues on the crisis of non-english misinformation on that platform. we need to make sure all
11:54 pm
communities, no matter the language they use at home, have the same access to good, reliable information. ms. miller, will youtube publish its violative view rate broken down by language? >> senator, thank you for your question. what you're referring to is the latest data point we shared earlier here, in which for every 10,000 views on youtube, 19 to 21 of those views are of content that is violative. we apply our content policies at a global scale, across languages. we do not prioritize any one language over any other. >> ms. miller, i believe that if we don't break algorithms down by their performance
11:55 pm
around different groups, we end up making existing biases worse. we've seen it with facial recognition technology that unfairly targeted communities of color. and according to reports, we're seeing this happen right now on youtube. so, i'll ask again. will youtube publish its violative view rate broken down by language? >> senator, i would be happy to follow up with you to talk through these details. as i said, for all of our content policies, the enforcement therein and the transparency we provide is global in scope and is across languages. >> i definitely look forward to following up with you on that. mr. beckerman, before launching tiktok for younger users, did tiktok do any internal research to understand the impact it would have on children? >> thank you, senator. i'm not aware of -- the content is curated
11:56 pm
and it's an age appropriate experience. i'm not aware of any specific research. >> i'd like to follow up on that as well, because products like tiktok can lead to addictive behavior and body image issues in young children, and it is critical that platforms work to understand these problems before they take place. this is a very serious issue, and it's one that's finally getting the attention that it deserves with the revelations and whistleblowers that have come forward. i urge you to take this opportunity to do a public evaluation of the impact your content is having on young children. in the end, i just want to follow up on something that many of us have commented on leading up to these important hearings, and i appreciate the chair's attention to this, and the ranking member's. both of them have authored legislation. our chair and ranking member of the subcommittee have also partnered on legislative initiatives. it's critically important that we continue moving forward and
11:57 pm
that we mark up legislation and get something adopted. i sincerely hope we here in the united states are paying attention to what's happening in other parts of the world. again, europe is outpacing the united states in being responsible with legislative initiatives surrounding protecting their consumers. there's no reason we can't do that here as well. and i just want to thank senator cantwell for the work she's been doing in this space, and i look forward to working with everyone so that we're able to get this done here. thanks so much. >> senator, you reintroduced that bill in the senate, is that right? >> yes. >> thank you, very much appreciate that, and appreciate your leadership. so, the witnesses were waiting for senator blumenthal. if he does not come in the next minute or so, there will be a short recess, because we're way past time to get over there. but i want to thank you all, and members who participated thus far, because we've had a very robust
11:58 pm
discussion today. you can see that these are topics the members of this committee feel very passionately about. they obviously believe that there is much more that we need to be doing in this particular area. so, i appreciate everybody's attendance and focus. again, i want to thank senator blumenthal and senator blackburn for their leadership. and for the larger committee, we had planned to move forward on many of these agenda items. we are very appreciative of the subcommittee doing some of this work, and members have had a chance for a very detailed interaction on these policies that we need to take action on. so, very much appreciate that. i see senator blumenthal has returned. thank you so much, and i will turn it over to you.
11:59 pm
>> thank you, chairman cantwell, and thanks for your excellent work on this issue. i'd like to ask some additional questions on legislative -- one of the suggestions that senator klobuchar raised was legal responsibility and liability, which, as you know, is in section 230. let me ask each of you, would you support responsible measures like the -- that i have proposed, to impose some legal responsibility and liability by cutting back on the immunity that section 230
12:00 am
imposes? >> senator, we agree at snapchat that there should be [inaudible] the intermediary platform liability law. and in fact, the last time congress addressed this reform, snapchat was a company that actively participated and helped draft legislation, so we would welcome another opportunity to work with you on that. sen. blumenthal: would you support the earn it act, which senator graham and i proposed, which proposes liabilities and affords victims the opportunity to take action against platforms that engage in child pornography related abuses? ms. stout: of course. we completely prohibit that kind of activity and we actively look for it. we remove it when we find it. if you would allow me to get
12:01 am
back to you -- it has been a while since i worked on the earn it act. i recall you said senator graham introduced it, but we support your legislation. sen. blumenthal: you had the opportunity to say whether you supported it and you have not. will you commit to supporting it? ms. stout: senator, again, my memory is failing me a little bit, but i do believe many of the provisions of the earn it act we supported. i would be happy to come back to you with a more specific answer. mr. beckerman: we agree there needs to be a higher degree of accountability and responsibility, particularly as it relates to content moderation, and that needs to be done in a way that allows platforms to moderate in an appropriate and aggressive way to make sure the kind of content none of us want to see on the internet or any of our platforms is able to be removed. sen. blumenthal: do you support changes in section 230?
12:02 am
mr. beckerman: there can and should be changes but, again, in a way that would allow companies like ours that are good actors, aggressively moderating our platform in a responsible way, to continue to do so. sen. blumenthal: do you support the earn it act? mr. beckerman: we agree with the spirit and would be happy to work with you on the bill. sen. blumenthal: it was reported unanimously in the last session. it has not changed significantly. did you support it then? mr. beckerman: the concern would be unintended consequences that lead to hampering a company's ability to remove and police content on its platforms. sen. blumenthal: is that a yes or no? mr. beckerman: maybe. sen. blumenthal: we have two maybes so far. ms. miller: i am aware of a number of proposals regarding potential updates to 230, and me
12:03 am
and my team, as well as other teams across google, have been involved in the conversations regarding these proposals. we see 230 as the backbone of the internet; it is what allows us to moderate content and make sure we are taking down content that potentially leads to eating disorders, for example, or what we talked about earlier, self-harm. we want to make sure we continue to have protections so we can moderate our platforms so that they are safe and healthy for users. i am aware of the earn it act, and i know our staffs have been speaking, but i understand there are still ongoing discussions regarding some portions of the proposal. we also very much appreciate and understand the rationale as to
12:04 am
why this was introduced, particularly around child safety. sen. blumenthal: is that a yes or no? ms. miller: we support the goals of the earn it act, but there are some details that are being discussed. sen. blumenthal: as senator markey has said, this is the talk we have seen again and again and again and again. we support the goal, but that is meaningless unless you support the legislation. it took a fight, literally a bare-knuckle fight, to get through legislation that made an exception for liability on human trafficking, just one small piece of reform. and i join in the frustration of many of my colleagues that good
12:05 am
intentions and endorsement of purposes are no substitute for actual endorsement. i would ask that each and every one of you support the earn it act, but also other specific measures that will provide for legal responsibility. and i know the claim that section 230 provides a backbone, but it is a backbone without real spine, because all it does is provide virtually limitless immunity to the internet and to the companies that are here. i am going to interrupt my second round and call on senator cruz. sen. cruz: mr. beckerman, thank
12:06 am
you for being here today. i understand this is the first time tiktok has testified before congress, and i appreciate you making the company available to answer questions. in your testimony, you talked about the things tiktok is doing to protect kids online, and that is correct. i wanted to discuss the broader issue: the control the chinese communist party has on tiktok, its parent company, and its sister companies like beijing bytedance technology. tiktok has stated repeatedly that it does not share the data it collects from americans with the chinese communist party and that it would not do so. it has also stated, in regards to the data collected, that the data is stored in virginia. these denials may in fact be misleading. a quick look at tiktok's privacy policy just last night shows
12:07 am
there is a lot more than meets the eye. for example, in the, quote, how we share your information section, one blurb reads, we may share all of the information we collect with a parent, subsidiary, or other corporate affiliate of our group. the privacy policy was updated in june to state that tiktok may collect biometric identifiers and information as defined under u.s. laws, such as face prints and voice prints. does tiktok consider bytedance, the parent company of tiktok headquartered in beijing, to be a part of tiktok's corporate group as that term is used in your privacy policy? mr. beckerman: this is an important question and i would like to take an opportunity
12:08 am
first to clear up the misconceptions around some of the accusations that have been leveled against the company. i would like to point you to independent research. sen. cruz: my question is simple and straightforward. does tiktok consider its parent company headquartered in beijing to be part of the corporate group? mr. beckerman: access control is done by the u.s. teams and, as independent researchers and experts have pointed out, the data tiktok has on the app is not of national security importance and is of low sensitivity, but we hold it to a high standard. sen. cruz: the words that came out of your mouth have no relation to the question you were asked. your privacy policy says you will share information with your corporate group. i'm asking a simple question.
12:09 am
is your parent company, headquartered in beijing, part of your corporate group, yes or no? you used the term in your privacy policy. mr. beckerman: i think it is important i address the broader point in your statement. sen. cruz: are you willing to answer the question? it is a yes or no question. mr. beckerman: yes, it is. sen. cruz: under your privacy policy, you state explicitly you may share data including biometric identifiers and face prints and voice prints. mr. beckerman: in the privacy policy, it says that if we are to collect biometric information, which we do not, as biometric data is defined for americans, we would provide an opportunity for consent. sen. cruz: you said we may share the information we collect. mr. beckerman: under u.s. access control. sen. cruz: what about beijing
12:10 am
bytedance technology, in which media reports show beijing took a minority stake through a state-backed chinese internet investment entity, and on the board of which sits a ccp official who spent most of his career in chinese propaganda, including a stint under the cyberspace administration of china, its internet regulator. would you consider beijing bytedance technology to be a part of tiktok's corporate group, with whom tiktok may share the information it collects? mr. beckerman: i want to be clear that entity has no affiliation with tiktok. it exists for domestic licenses in china and is not affiliated or connected to tiktok. sen. cruz: are you saying no or
12:11 am
yes as to whether beijing bytedance technology is part of your corporate group as defined by the privacy policy? presumably, "other affiliate" is where it would fall. is it another affiliate of your corporate group? mr. beckerman: that entity deals with domestic businesses within china. sen. cruz: you are answering questions i'm not asking. it is a yes or no. is beijing bytedance technology another affiliate of your corporate group? mr. beckerman: i'm trying to be clear and answer your question. that entity is based in china for the chinese business and is not affiliated or connected with tiktok. sen. cruz: that's twice you have not answered. last time, you did on the third time. mr. beckerman: the answer is the
12:12 am
same, senator, what i just said. sen. cruz: what you just said did not answer the question. let me repeat the question again. is beijing bytedance technology another affiliate of your corporate group as your privacy policy defines it? mr. beckerman: senator, as i stated, that entity does not have any relation to the tiktok entity. sen. cruz: it took three questions to answer about your parent, and you finally answered that you can share all of your information with the parent company based in beijing. i have asked you three times about the sister company that is another affiliate. you have refused three times. that may be revealing. often, as sherlock holmes observed about the dogs who do not bark, it may be revealing that the propaganda minister serving on your sister company has been in the business of online
12:13 am
propaganda, and you're refusing to answer whether they fall under your privacy policy. that reveals a great deal. mr. beckerman: i am just trying to be accurate here. there are a lot of accusations that are just not true and i want to make sure -- sen. cruz: i will give you one more chance; my time is almost over. in baseball, it's three strikes, you're out. let us see if on a fourth strike you can actually answer the question. it is a simple yes or no. is beijing bytedance technology another affiliate of your corporate group as your privacy policy defines the term? mr. beckerman: as i pointed out before, the answer is the same. sen. cruz: you did not answer. mr. beckerman: i appreciate your time with gotcha questions. sen. cruz: i'm asking about your policy. mr. beckerman: if you could be truthful and accurate about -- sen. cruz: are you willing to answer the question? mr. beckerman: senator, i answered the question.
12:14 am
i stated a number of times that that entity is a domestic entity within china. sen. cruz: apples are red. is it another affiliate as defined in your privacy policy? you are here under oath. are you going to answer the question? mr. beckerman: -- sen. cruz: you're refusing to answer because you do not want to? mr. beckerman: i gave you the answer. sen. cruz: i want to be clear, because you are under oath. your answer is that it is not an "other affiliate of our corporate group". this is a legal question with consequence. mr. beckerman: i understand the question. as i pointed out, tiktok is not available in china and that is an entity for purposes of the licenses of business in china. sen. cruz: you are refusing to
12:15 am
answer the question, for the record. mr. beckerman: i believe i answered your question. sen. cruz: you are not willing to say yes or no. mr. beckerman: it is not a yes or no question. i want to be precise. sen. cruz: is this company another affiliate as defined in your policy? mr. beckerman: the way i answered is the answer to the question. sen. cruz: so you refuse to answer the question. that does not give any confidence that tiktok is doing anything other than participating in chinese propaganda and espionage. mr. beckerman: that is not accurate and i -- sen. cruz: if it were not accurate, you would have answered the questions, and you have dodged the questions more than anyone else i have seen in my nine years in the senate.
12:16 am
witnesses often try to dodge questions. you refuse to answer simple questions and that, in my experience, when a witness does that, it is because they are hiding something. >> senator moran? sen. moran: mr. chairman, thank you very much. let me turn to ms. stout and ask a question about data privacy. senator blumenthal and others have been working on a consumer data privacy bill for several years. i have introduced a bill that includes an appropriately scaled right for consumers to correct and erase data that is collected or processed by entities, including social media companies.
12:17 am
ms. stout, i understand that snap allows users to correct their user data. would you please explain the decision to proactively provide this service? >> thank you, senator, for the question. we applaud the committee and your leadership on this issue. we fully support a federal comprehensive privacy bill and we look forward to working with you and your staff on that. yes, senator, snap is designed with a privacy-centric focus from the outset. we do not collect a lot of data, and we believe in data minimization and short data retention periods. as you pointed out, we do not store content forever, and we believe in giving users transparency and control. that includes the ability to delete data, if they wish, or the ability to download data. we have a tool in the app that
12:18 am
allows users to download their data. it gives them any information they may have agreed to share, post, or put onto their snapchat account. >> compared to other platforms who may not take the same position that snap has, what do you give up in your ability to earn revenue? what is it that you lose by doing that, if anything? >> we make tradeoffs every day that sometimes disadvantage our bottom line. there are no rules and regulations that require companies like snap to have short retention, which is why we choose to voluntarily have a data minimization practice. it means advertisers perhaps find other platforms more enticing, because those platforms keep a history of anything that has been shared or searched, or location data that's ever been provided, and that's not the case on snapchat. that is a trade-off, senator, because we believe in being more private and more safe.
12:19 am
>> if you did otherwise, what would you gain by doing so? >> we just believe that we have a moral responsibility to limit the data we collect on people. >> your answer is fine, but what do you generate by keeping the data and in some other way using it? >> we limit ourselves in our ability to optimize those advertisements to make more money. we are a company that has not yet turned a profit. we have reinvested every dollar into the company, and we are here for -- our desire is to make a platform that is safe for our community. >> mr. beckerman, what responsibility do platforms have to prevent harmful social media trends from spreading? and how can tiktok improve its algorithm to better comply with their own terms of service to prevent criminal
12:20 am
activity? mr. beckerman: thank you, senator. we do have a responsibility to moderate our platform in line with our community guidelines and to do that in a transparent way. >> where does that come from? mr. beckerman: it comes from doing the right thing. we want to be a transparent platform where people enjoy their experience. that starts with community guidelines. there is certain content that we would not allow on the platform, and we work really hard at that, i think. the safety teams are often the unsung heroes of the company. they work every day, 24/7, to make sure community guidelines are met and that the platform stays positive and joyful. >> the nature of my second question, the add-on question, was perhaps leading, because it suggests that you are not complying with your own terms of service in a way that enables criminal activity. my question is, how can you improve your algorithm to better accomplish that? maybe you would discount the premise of the question.
12:21 am
mr. beckerman: no, senator, it's an important area where we want to get to 100%. we release transparency reports, and 94% of our removals of violative content are done proactively. much of it is done within 24 hours, before there are views. we strive for 100%, and that is something we fight for and work on every day. sen. moran: thank you. thank you, chairman and ranking member senator blackburn. this subcommittee is important to the nature of our country and to its future. thank you. >> thank you, senator moran. >> thank you, mr. chairman. ms. miller, i would like to come to you. you talk about moderation as a combination of machine learning and human review. it seems that youtube has no problem pulling down videos that question abortion or global warming or vaccine mandates. but child abuse videos remain on
12:22 am
your site. so i am interested to know more about the specific inputs that you use in these reviews. who establishes these inputs? who oversees them to make sure that you get them right? >> senator, thank you for your question. we heavily invest in making sure that all of our users, and particularly kids on the platform, have a safe experience. the way we do this is with a number of levers. for example, we have content policies as it relates to child safety on the platform, so as not to put children into risky situations and videos on the platform. sen. blackburn: let me jump in, then, and ask
12:23 am
you: if you don't want to put children into risky videos -- there is a world of self-harm content on your site. a few searches come up with videos, and i am quoting from searches that we have done: songs to slit your wrists by. vertical slit wrists. how to slit your wrists. and painless ways to commit suicide. now, that last video, painless ways, was age gated. but do the self-harm and suicide videos violate youtube's content guidelines, if you're saying you have these guidelines? >> senator, i would certainly like to follow up with you on the videos you may be referencing. we absolutely prohibit content regarding suicide or self-harm. sen. blackburn: i have to tell you, we have pulled these down in our office.
12:24 am
our team has worked on this because i think it is imperative that we take the steps that are necessary to prevent children and teens from seeing this content. and i just can't imagine that you all are continuing to allow children to figure out how to do this on your site -- how to carry out self-harm. yes, why don't you follow up with me with more detail, and i would like the response in writing. i also talked to a film producer friend this morning about the film trailer for i'm not ashamed, which was based on the story of rachel scott, who was the first victim of the columbine attack. the film focused on her faith and how it helped her in her life. so why would you remove this film trailer and block its distributor from being on your site? and you did this for 11
12:25 am
months. you did not put the trailer back up until the hollywood reporter called and said, why have you done this? do you have an answer on that one? >> senator, i'm sorry, but i'm not familiar with that specific -- sen. blackburn: ok, then let's review this, and we will submit the documentation that was sent over to me. ms. stout, over to you. we had an issue in memphis with a 48-year-old man who was on your site. he connected with a 16-year-old memphis teenager.
12:26 am
he claimed to be a music producer and he lured her into this relationship. one of the news articles recently called snapchat the app of choice for sexual predators. this is something that is of tremendous concern. much of it, from what i understand from talking to moms and grandmoms, is that they use the snap map location service. i know you are probably going to say that only your friends can follow you. somehow people are getting around that, and these sexual predators who are following young people are using this map to get to their location. so, we had this in memphis with the rape. we have another child in the middle part of the state whom the predator followed, and she tried to commit suicide because she knew her family was going to find out. so are you taking steps? do you want to give me a written answer as to the steps you all are taking? how are you going to get a handle on this? this is endangering young women. >> senator, i'm more than happy to give you a detailed written answer. i appreciate your leadership and following up on this
12:27 am
issue. i want to make crystal clear that the exploitation of minors is deplorable, and we work to prevent this from happening on our platform. with respect to the map, yes, indeed, appearing on the map is off by default for everyone -- not only for minors, but for everyone. so in order to appear to someone and share your location, you must be bidirectional friends with that person. i will say, with respect to grooming, this is an area where we have spent time and resources on prevention. snapchat makes it intentionally difficult for strangers to find people they do not know. we don't have open profiles or browsable pictures. we don't have the ability to understand who people's friends are and where they go to school. i'd be more than happy to give you more detail.
12:28 am
sen. blackburn: let's get some more detail. and one question for all three of you, and you can answer in writing, if you choose. you have all talked about the research work that you do. do you get parental consent when doing research on children? and can you provide us a copy of the parental consent form? we asked facebook for this and they punted the question repeatedly. and ms. miller, i think you need to provide the committee clarity. you said you had never spoken out against online privacy. i think the chairman and i may question that a little bit. my question to you, and you can come back to us with more depth on this: did you fight it as part of the
12:29 am
internet association as they were fighting privacy on your behalf? and just one thing to wrap up, mr. chairman, going back to mr. beckerman. there is confusion around the ownership of tiktok, with the parent company being bytedance. they have a financial stake in bytedance. douyin has an interest. i checked and know that tiktok data is stored in the u.s. and singapore. until recently, the singapore data centers were run by alibaba, which is another chinese
12:30 am
company. what we need from you, mr. beckerman, is clarity on ownership and on the sharing processes around u.s. user data, especially the information of our children. with that i will yield back. thank you, mr. chairman. >> thank you, sen. blackburn. we are going to finish the first round with senator sullivan. i am going to go vote and should be back before the senator finishes. in the meantime, senator blackburn will preside. thank you. >> thank you, mr. chairman. mr. beckerman, i know there are concerns about sharing of data with the chinese communist party, given the ownership or at least board influence. senator blackburn was just talking about that. i know senator cruz raised the issues. i want to raise a related subject. it's what i refer to as kowtow
12:31 am
capitalism, and i think that you guys are exhibit a of kowtow capitalism. what is that? it's american executives of american companies censoring americans' first amendment rights in america so as not to offend the chinese government and/or to gain access to the chinese market. we see it on wall street and in the nba. we see it in the movie industry. let me ask a couple of questions regarding that. a tiktok user could put up a video that criticizes the chairman of this committee and the ranking member or any senator, president biden or former president trump, couldn't they? not like some horrible violent suggestion, but just a criticism of an elected official. is that correct?
12:32 am
mr. beckerman: actually, tiktok doesn't allow political ads, but political content -- it wouldn't be a violation of our community guidelines as long as it's not disinformation or something hateful. >> good, that's free speech. that is free speech. could a tiktok user put up a video criticizing xi jinping? he has been compared to winnie the pooh. could a tiktok user reference him with winnie the pooh? i don't know why he doesn't like winnie the pooh. could a tiktok user do that? mr. beckerman: yes, senator.
12:33 am
community guidelines are done by our team in california and the moderation is done here. >> ok. in 2019 you admitted to censoring videos regarding tibetan independence. can a tiktok user mention tibetan independence? mr. beckerman: yes, senator. >> what happened in 2019? mr. beckerman: i can assure you it's not a violation of the community guidelines and it would be allowed on the platform. >> i think there was a tiktok public policy director in 2020 who admitted that tiktok had previously censored information about the ccp's use of forced labor. is that true? mr. beckerman: that would not be in violation of community guidelines and it
12:34 am
would be permitted. >> so you're saying that your videos have never been censored by the chinese communist party on any matter? i'm just reading these and maybe they're all wrong. mr. beckerman: i can assure you that our content moderation teams are led by americans. our moderation guidelines are public and transparent, and content that is critical of any government, frankly, as long as it is not disinformation or hateful speech, would be allowed. and i would encourage you to search for a number of the examples you mentioned today on tiktok, and i'm sure you could find them. >> so the tibetan independence and the muslim forced labor issues are not censored? >> not currently. >> were they previously? i'm reading here that in 2019
12:35 am
and 2020 somebody with tiktok admitted doing that. >> i'm not aware of that, and that's not how we moderate content. >> listen, i do think this issue of kowtow capitalism, where american companies are censoring americans, which we see all the time, is an issue we should be looking at. because the chinese communist party, of course, can crush freedom of speech in their own country, but they shouldn't be able to crush it in this country. and mr. beckerman, i'm glad that you're denying any of this. i look forward to seeing videos that are negative against the chinese communist party. not really holding my breath, but maybe it's true. maybe they're fine with videos criticizing xi jinping and other members of the communist party. so you're saying that's fine and acceptable policy for tiktok
12:36 am
users? >> yes, and just to be clear, there's no involvement from the chinese communist party in tiktok moderation, and it's all done in the united states. >> thank you. >> thank you, senator sullivan, and we are going to continue looking at these issues on the chinese communist party's influence into u.s. technology and into the silencing and censoring of free speech of u.s. citizens online. >> great, it's very important, i appreciate that. senator, you are recognized. >> thank you. i will start directly with questions, and if i have time left, i would like to read a statement into the record. i will start with a question for ms. miller. youtube has implemented several
12:37 am
features, such as autoplay, that make the platform difficult to stop using. what mechanisms has youtube employed to make sure that children, specifically, can counteract these design decisions? and do you believe those controls are sufficient? >> senator, thank you for your question. autoplay is default off on supervised experiences on youtube; we have set those to default off. we do allow the default to be changed to allow for autoplay -- for example, if a family is in the car and the parents have decided they want autoplay to continue. but we have set it to a default of off. >> do you believe that's sufficient? >> i think it's one of a number of tools that are important to make sure that kids have a healthy and safe experience on the platform. another is that we do not deliver targeted advertising on
12:38 am
youtube kids. another is that we age gate content to make sure that minors do not see age-inappropriate material. it's only with a number of tools and protocols in place that we think we are meeting the bar we have set for ourselves and that parents expect of us. experts in the fields of child development advise us to make sure that kids have a safe experience on the platform. >> thank you. mr. beckerman, after my staff reviewed your privacy policy, i want to list some of the items that tiktok automatically collects from one of its users: that person's location, the device model of their phone, their browsing history outside and inside of tiktok, the content of all messages sent on tiktok, their i.p. address,
12:39 am
their biometric identifying information, and information from their phone such as keystroke patterns and other apps. do you believe this sort of mass data collection is necessary to deliver a high quality experience to your users? mr. beckerman: senator, thank you for the question. some of the items you listed are items we are not currently collecting. we state in the privacy policy that if we were to, we would notify users and get consent. sen. lummis: and which of those that i read have that condition? mr. beckerman: as it relates to biometric data, everything, and i would be happy to go through that with you and your team.
12:40 am
sen. lummis: perfect, we will do that. regardless of which ones require consent, of those i mentioned, why should any member of this committee feel comfortable with the vast amounts of data your company is collecting on our children, especially since tiktok has a relationship to the chinese communist party? mr. beckerman: senator, as it relates to data, many of our peers do more. the keystroke measure isn't collecting what people are typing; it is an anti-fraud and anti-spam measure that measures the cadence of typing, because a bot behaves differently than a human. it's not collecting what people are typing. sen. lummis: which companies that you are aware of collect this kind of information? mr. beckerman: i would point to facebook and instagram, for
12:41 am
example. >> well, i'll ask the same questions of them, thank you. next question: are your platforms designed to keep users engaged as long as possible? do you want to start, mr. beckerman? mr. beckerman: senator, we want to make sure people are having an entertaining experience, like tv or movies. tiktok is meant to be entertaining. we have take-a-break videos and time management tools, and family pairing is another tool that parents can use to limit the time spent on the app. >> but is the length of engagement a metric that your company uses to define success? mr. beckerman: there are multiple definitions of success, senator, and it's not just how much time someone is spending on the app. >> is length of engagement one
12:42 am
of the metrics? >> overall engagement is more important. >> is it one of the metrics? >> it's a metric that many platforms check on. >> thank you. ms. stout, same question. are your platforms designed to keep users engaged as long as possible? ms. stout: when you open snapchat, you don't open to a feed designed to pull you into more and more content. you start with a blank camera or canvas where users come to talk to their friends in videos and pictures. >> but is it a metric that the company incorporates into your definition of success? >> i believe we see success when the platform is facilitating conversations. snapchat is a place where people come. >> but is it a metric? my question is, do you measure success in any way, shape, or form by how long people stay on your site? is that one of multiple driving measures of success? >> i think the way i can answer
12:43 am
that question is, it is one of many metrics. >> ok, thanks. ms. miller, same question. are your platforms designed to keep users engaged as long as possible? ms. miller: senator, our platforms are designed to allow users to search and discover all types of content. it is intended for them to have an enjoyable experience. >> i get it, but i'm asking: is this one of the metrics by which you define success? >> senator, we have a number of digital well-being tools designed -- >> is this one of the metrics? there could be numerous metrics, but is this one of them? >> yes. to the specific question you're asking, we do look, for example, at whether a video is watched through its entirety.
12:44 am
that helps us determine if it's a quality video relative to the search the user made. we look at those data points to inform us as it relates to the experience the user has had on the platform. >> thank you. madame chairman, do i have time to enter an opening statement? thank you. this era of children will grow up under a level of surveillance well beyond any previous generation. the wall street journal reports focused on the problematic platforms of facebook. we know that the problem is that children are impressionable. children are manipulated by advertising targeted to them and readily influenced by highly sophisticated algorithms. these invisible algorithms continually nudge our children in different directions, which can impact their development without their knowledge and without their
12:45 am
parents' knowledge. these algorithms on these platforms were designed with adults in mind, not children. only a tiny fraction of children understand the harm that can come from sharing sensitive information, pictures, or opinions that become a part of their permanent digital record. but what is most alarming is that none of them can fully understand how the content fed to them by algorithms will shape their worldview during these formative years. so more must be done to promote responsible social media. we must educate parents on how to teach their children to avoid the pitfalls of using these platforms, and more importantly, we must hold these platforms accountable for the effects their design decisions have on our children. mr. chairman, thank you for the opportunity to add that opening statement to the record and thank you for your
12:46 am
indulgence. i yield back. >> thank you, and thank you for your leadership on these issues and your insightful questions. we do have to protect kids in our country. you just put your finger on it. let me ask this of everyone on the panel here today. kids are constantly posting content and their data is being tracked, stored and monetized, but we know young users lack the cognitive ability to grasp that their posts are going to live online forever. to each of our witnesses: do you agree that congress should give children and teenagers -- more importantly, their parents -- the ability, the right, to erase their online data?
12:47 am
mr. beckerman? >> yes, senator. >> ms. stout? ms. stout: yes, we do, senator, but i must say that content on snapchat does not remain permanently. >> i appreciate that. to you, ms. miller. >> yes, senator, and children have the ability to delete their information as well as having auto-delete tools. >> so you agree that they should have the right to delete it? >> yes, senator. >> today apps collect troves of information about kids that have nothing to do with the app's service. one gaming app that allows children to race cartoon cars with animal drivers has amassed huge amounts of kids' data related to the game, including location and browsing history. why do apps gobble up as much information as they can about kids? it is to make money.
12:48 am
congress, in my opinion, has to step in and correct this harmful misuse of data. ms. miller, do you agree that platforms should stop data collection that has nothing to do with fulfilling the app's service? ms. miller: senator, we do limit the data that we collect, and this is particularly true for the youtube kids app. it only relies on what is necessary to make sure that the platform functions. >> so do you agree that that should become a law, that all platforms have to do the same thing? >> i don't want to speak to whether or not it should become a law and/or the details of any proposed legislation, but at youtube we have not waited for a law to make sure we have these protections. >> i appreciate that. but it's
12:49 am
time to make up your minds. yes or no on legislation. mr. beckerman. mr. beckerman: yes, senator, we do need legislation. i think we are overdue for strong national privacy laws. >> great, ms. stout? ms. stout: yes, senator. we absolutely limit the data we collect, and collection of data that is irrelevant to the performance of the app does not appear to be within scope. >> today popular influencers peddle products online while they flaunt their lavish lifestyles to young users. online child celebrities unboxing new toys get millions of views, and they are inherently manipulative to young kids, who often cannot tell that these are paid advertisements their heroes are pushing, that their hero is getting a monetary kickback from. my bill with senator blumenthal,
12:50 am
the kids act, would ban this kind of promotion, influencer marketing to kids. to each of the witnesses: do you agree that congress should pass legislation to stop influencer marketing to children in our country? yes or no, ms. miller. ms. miller: senator, again, we've actually moved in this direction, whereby we have a set of quality principles regarding the type of content that is made available on the youtube kids app. in so doing we make sure that we are not providing content that would have a significant prevalence of that type of material, and we limit the types of ads that can be delivered. i don't have the details. >> should we make it illegal, so that people out there who might be trying to influence children know there is an enforceable penalty?
12:51 am
ms. miller: i absolutely think it's worth a discussion. we need to see the details of such a bill. >> it's been around for a long time. mr. beckerman? >> yes, senator, we already limit the kinds of advertisements, but we agree there should be additional transparency and privacy laws. >> ms. stout? >> senator, i would agree that for young people, there could be additional protections put in place. >> by the way, we ban it on television because we know we can't have a hero holding a product. thank you. and finally, push alerts. studies show that 70% of teenagers report checking social media multiple times a day. excessive use of social media has been linked to depression, anxiety, and feelings of isolation. the last thing apps should be
12:52 am
doing is using methods like push notifications -- automated messages that nudge users to open an app -- to make young users spend even more time online. to each of the witnesses: do you agree congress should pass, again, the law that the senator and i are trying to move which would ban push alerts for children? ms. miller? >> i agree additional protections should be in place. >> thank you. mr. beckerman? >> we already limit push notifications. >> should we ban them? >> i think that would be appropriate. we've already done that. >> ms. stout?
12:53 am
>> snapchat does not use them, consistent with the uk age-appropriate design code. we would be in agreement with that legislation. thank you. >> thank you, mr. chairman. >> thank you, senator markey. i understand senator klobuchar has a few more questions, and while we're waiting for her, i have a few too. so i appreciate your patience. let me begin by acknowledging -- and i think you would acknowledge as well -- that the reason you've made many of the changes you have is the u.k.'s child safety law, the age-appropriate design code. i think we need an american version of the british child safety law, and we have talked about some of its provisions.
12:54 am
will you commit to supporting a child safety law that obligates companies to act in the best interests of children and establishes, in effect, a duty of care that could be legally enforceable? ms. stout? ms. stout: senator, we were very privileged to be able to work with the information commissioner's office in the uk on their design of the age-appropriate design code. we, of course, complied with the code as it came into force this year, and as i mentioned, we are looking actively at that code to see how we can apply it outside the u.k. market, to many of our other markets. so with respect to a child safety law that obligates companies to think about the protection and safety of children, that is something the app has done
12:55 am
without regulation, and we would be happy to work with the committee. >> but the point is -- and we would love to be in a world where we could rely on voluntary actions -- in effect, you have sacrificed that claim to voluntary action, or our reliance on your voluntary action. whether it's facebook or your companies in various ways, i think you have shown that we can't trust big tech to police itself. so when you say we already do it, well, you may think so, but there's no legal obligation to do it and no way to hold you accountable under current law. that's what we need and
12:56 am
that's why i'm asking you about a law. i'm hoping that you would support it, that your answer is yes, that you would support it as a duty of care as a matter of law. >> yes, senator, and that was very much a part of my opening statement. because of the time it takes for regulation to be implemented, we don't believe we should have to wait for that regulation. we will take the voluntary steps to best protect users. mr. beckerman: we have voluntarily implemented much of the age-appropriate design code here in the united states. i agree that companies can do more, and i was struck by your comments in your opening statement about a race to the top. that is the approach we are trying to take: to do more and go above and beyond, and to be a place where we are putting the wellness and safety of teenagers ahead of other platforms. >> let me see if i can answer the question
12:57 am
as i would if i were in your position: yes, we strongly and enthusiastically support that kind of child safety law. we are already doing more than we would need to do under the law. mr. beckerman: yes, senator, and i do think age-appropriate design is something that should be included in measures like that. >> ms. miller? >> senator, i respectfully disagree with the suggestion that we only wait until we have legal obligations to put systems and practices in place. for example, we rolled out youtube kids in 2015 to make sure that as kids were trying to be on the main platform, we created a space that was particularly for them and their safety. we've rolled out a number of --
12:58 am
>> i'm going to interrupt you, ms. miller, because i think you misinterpret me. i'm not suggesting you wait; i'm suggesting you do it now. i think mr. beckerman and ms. stout perfectly understood my question, and i think both of them would support that law. would you, yes or no? >> i would support looking at any details as it relates to additional legal protections for kids in the united states. as you may note, the age-appropriate design code went into effect in the uk over a month ago, so it's still early days, but a number of the protections required in the u.k., we have rolled them out globally. >> is that a yes or a no? ms. miller: yes, i would be happy to support a --
12:59 am
>> would you support a version of the u.k. child safety law? ms. miller: i would need to look at the details of any specific bill, but i certainly support expansions of child safety protections. >> i will yield to the senator. sen. klobuchar: thank you. thank you for taking me remotely for the second round. one of the things i have tried to do with all of these hearings, including in the judiciary antitrust subcommittee that i chair, is taking the veil off this idea that this is just the web, everyone has fun and it's just a cool thing. some of that's true, but it's also a huge profit-making venture, and when you look at it that way, these are some of the most
1:00 am
successful and biggest companies the world has ever known, the big tech platforms, in terms of money. i will just start with this question, which i have asked many of the larger platforms. ms. stout, snap's reported revenue per user in north america for the second quarter of 2021 was $7.37. how much of your revenue came from users under the age of 18? ms. stout: i don't have that information for you but i'm
1:01 am
happy to take that back. sen. klobuchar: i appreciate that. i have been trying to get that. just to give you a sense, facebook's revenue per user, from their own documents, is $51 per user in the u.s. per quarter, just to put it in some perspective. mr. beckerman, tiktok is a privately held company, so we don't have public documents on your advertising revenue per user. what is your best estimate for the last quarter? >> i don't have those numbers, but i'd be happy to go back and check on those. >> do you think you can provide us with that? >> again, we're not a public company, but i'll go back and see what we can find for you. >> ok. again, i'm trying to figure out the users under age 18 -- how much of the business and future growth of the business is
1:02 am
in kids. ms. miller, youtube reported that its advertising revenue overall in the second quarter of 2021 was $7 billion. how much of youtube's revenue came from users under the age of 18? ms. miller: senator, i don't know the answer to that, and i'm not sure we look at data internally that way. i would be happy to follow up with you. i would like to note that as a company, we have long shared our revenue with our creators, so over the last three years we have paid out more than 30 billion dollars to our creators. sen. klobuchar: ok. my work with the antitrust subcommittee takes me down related paths, and i recently introduced the bipartisan american innovation and choice online act with senator grassley and others
1:03 am
-- including senator blumenthal, who is a co-sponsor of that legislation -- and it's focused on a gnarly problem, which is that you have platforms that are self-preferencing their own stuff at the top and taking, in some cases, data that they uniquely have on other products and making knockoff products and underpricing the competitors. ms. miller, they say youtube has made unfair demands in negotiations for carrying the app on roku, including demands to preference youtube in search results, which is exactly what this legislation addresses, and demands that roku give youtube access to
1:04 am
nonpublic data from roku users. did youtube make these demands? >> senator, i am not involved in the negotiations with roku. i know we've been having discussions with them for several months and we are trying to come to a resolution, but i'm not involved in the negotiations. sen. klobuchar: ok, i'll put this on the record for others, because i would also like to know more generally if youtube has ever demanded nonpublic data or preferences in search results in negotiations with other providers. it gives you a sense of being the dominant platform by far in the area of search and being able to use that power over people who are simply trying to be on that platform. i also had a follow-up on youtube banning all vaccine misinformation, which i commended you on at
1:05 am
the time. how much content have you removed since you banned all vaccine miss information? have you seen a change in the viewership rate? >> senator, i feel bad, this is a question i would absolutely love to answer, but i know that we have removed -- reviewed so much video as it relates to covid-19. sen. klobuchar: we'll put it in writing. my last question, ms. stout, mr. beckerman, i just mentioned this bill that we have introduced with 10 other cosponsors aimed at ensuring dominant digital platforms don't use their market power, but--. do you support some of the con competition reforms in the bill and have you faced any challenges when it comes to competing with the larges digital platforms? that's be my last question. >> senator, we are aware of the bill you introduced and as a non-dominant platform, very much
appreciate your legislation and the work that you are doing. as a smaller platform, it is an incredibly competitive arena. we compete every day with companies that collect more information on users and store that information to monetize. any effort that this body undertakes -- and especially the legislation that you have introduced -- to create an equal playing field so it is a competitive atmosphere, we would very much welcome. sen. klobuchar: thanks. mr. beckerman? mr. beckerman: likewise, we appreciate your efforts and the work you have done to promote and to spur competition. that is something that we all benefit from. as it relates to specific challenges we face, i would be happy to discuss those issues in detail. sen. klobuchar: thank you very much. thank you, everybody.
>> senator, i found the answer, if you don't mind. we've reviewed over a million videos since we started rolling out our covid misinformation policies, and over 130,000 videos as it relates to vaccine misinformation. it is an area we put a lot of effort behind. thank you. sen. klobuchar: ok, thank you. thanks, everybody. >> thank you, senator klobuchar. i seem to be the last person standing, or sitting, and i do have some questions. ms. stout and mr. beckerman, over 10 million teens use your apps. these are extremely impressionable young people, and they are a highly lucrative market, and you've made tens of millions, probably hundreds of millions, from them. there is a term, "snapchat dysmorphia." kids use the filters that are offered and create destructive, harmful expectations. have you studied the impact of these filters on teens' mental health? i assume you studied the impact of these filters before you put them in front of kids. ms. stout? ms. stout: senator, the technology that you're referring to is what we call lenses, and these lenses are augmented reality filters that we allow users who choose to use them to apply over top of selfies. for those who use them, they have the opportunity to put on a dog face or -- >> i've seen them, but they also potentially change one's appearance to make them thinner, different colors, different skin tones. ms. stout: so these filters are created both by snapchat and our creator community. there are over 5 million of these augmented reality filters, and a very small percentage of those filters are what you would call beautification filters. most of them are silly, fun, entertaining filters that people use to lower the barrier of conversation. because, again, when you're using snapchat, you're not posting anything permanently. you are using those filters in a private way to exchange a text or a video message with a friend.
so it really kind of creates this fun, authentic ability to communicate with your friends, and those filters are one of the many ways in which friends love to communicate with each other. >> do you study the impact on kids before you offer them? have you studied them? ms. stout: we do a considerable amount of research on our products, and i think one of the competitive pieces of research that was revealed at your earlier hearing shows that filters or lenses are intended to be fun and silly. it's not about -- >> we all know as parents, something intended to be fun and silly can easily become something that is dangerous and depressing. that is the simple fact about these filters. not all of them, maybe not the majority of them, but some of them. and i'd like you to provide the research that you've done and ask for other research, but i'm gathering from what you said that you don't do the research before you provide them? ms. stout: no, senator, that's not what i said, and particularly with respect to this question, i'm not able to speak to that kind of research because i'm not aware of it. i will go back and look and try to get you an answer to your question. >> you know, for eight years, snapchat had a speed filter. it allowed users to add their speed to videos, as you know. the result was that it encouraged teens to race their cars at reckless speeds. there were several fatal crashes associated with teens using the speed filter. it took years and multiple deaths for snapchat to finally remove that dangerous speed filter. it was silly, it was maybe fun for some people. it was catastrophic for others. and i want to raise what happened to carson bride. carson was relentlessly bullied through anonymous apps. after desperately trying to stop the abuse and pleading for help, he took his own life. silly, fun? how can parents protect kids from relentless bullying that, as parents have said so movingly, no longer stops at the schoolhouse door? 24/7, it comes into their homes just before they go to sleep. what are you doing to stop
bullying on snapchat? ms. stout: senator, this is an incredibly moving issue for me as a parent as well. bullying is unfortunately something we see happening more and more to our kids, and they don't just face it at school. they face it at home. we have zero tolerance for bullying or harassment of any kind, and we see it as our responsibility to get in front of this issue and do everything we can to stop it. so again, because snapchat is designed differently, you don't have the same opportunities for abuse on snapchat. you don't have public permanent posts where people can like or comment, thereby introducing additional opportunities for public bullying or public shaming. that's not to say that bullying doesn't happen, both online and offline. so we have reporting tools where users can easily and quickly report bullying or any other harmful activity. and i would say the trust and safety teams that work around the clock remove the content on average in less than two hours. usually it is far more quickly than that. i want to assure you -- and thank you for raising the bullying issue -- that in addition to combating this practice we do a lot to prevent it, and we think there should be more opportunities to raise awareness of the effects of bullying and the effects of words on other people, and we will continue to make that commitment. >> we've heard from tiktok that it is a safe environment. at the same time we see challenges. the blackout challenge, just to take one example, where, in effect, teens and children have actually died emulating and recording themselves following blackout and choking challenges, including a 9-year-old in tennessee. despite apparent efforts to discourage certain dangerous trends, the content is still being posted and being viewed widely by kids online, and followed and emulated, whether it was destruction of school property or other kinds of challenges. the mother who lost her child from choking shared questions with me, and they are questions that deserve answers from you and others who are here today: how can parents be confident that tiktok, snapchat and youtube will not continue to host and push dangerous and deadly challenges to our kids?
>> thank you, senator. as it relates to this specific challenge, it is particularly sad and tragic. i remember when i was in grade school, i had a classmate that we lost to something very similar. i know it is something that touches many of our lives. it is awful. as it relates to tiktok, this is not content we have been able to find on our platform. it is not content we would allow on our platform. it is important that we all remain vigilant to ensure that things that are dangerous or even deadly for teenagers don't find their way onto platforms. it is important that we all have conversations with our teenagers to make sure they stay safe offline and online. it is a responsibility we take very seriously. we want to be clear: we have not been able to find any evidence of a blackout challenge on tiktok at all. it would violate our guidelines. we have found absolutely no evidence of it. it is something we have had conversations with parents about. >> other challenges, you're saying none of them? >> anything that is illegal or dangerous violates our guidelines, and as things pop up on the platform, over 94% of that content is removed. >> dangerous challenges have no place on tiktok, you say. but you react by taking them down, so they exist for the time that they are there. what can you do to prevent those challenges from being there in the first place? >> we are often able to be proactive -- we block content and we remove content. something we have seen recently is press reports about alleged challenges where, when fact checkers like snopes and other independent organizations look into them, they find that these never existed on tiktok in the first place. they were hoaxes that originated on other platforms. we need to look at the facts and see what content actually exists rather than spreading rumors about alleged challenges. >> we found pass-out videos. we found them. let me ask you about another instance. my staff made a tiktok account and spent hours scrolling through the endless videos, and they were inundated with tiktok videos about eating disorders. tiktok began by showing us videos.
we can't show them in these rooms because they were so explicit. another tiktok account was flooded with nothing but sexually obscene videos. how do you explain to parents why tiktok is inundating kids with videos of self-injury and eating disorders? >> i can't speak to what the examples were from your staff, but i can assure you that is not the normal experience that teens or people that use tiktok would get. that kind of content is removed. >> would you support limits on features that lead to addictive use? >> we have already done a number of things proactively, in the form of take-a-break videos. it is important that we look at these issues and have conversations and foster these conversations with our teenagers, and that's one of the reasons we built our family tools. i know it can be daunting for parents of teenagers to have these different tools, so we built this in a way where parents can have additional control over the safety and privacy settings for teenagers. >> i have to go vote and i don't have anyone to preside here for me. i suggest taking a five-minute recess. i will have some final questions if you would be willing to wait. then we can close here. thank you.
>> welcome back, everyone. i was told that some of my colleagues wanted to come back and have an opportunity to question. if they are not here within the next five minutes, they will lose that opportunity. thank you for your patience. i wanted to give you an opportunity to answer what i think is probably one of the paramount questions on the minds of most parents today. that is: how are you different from facebook? i mentioned at the very start of the hearing that you would try to differentiate yourselves from facebook. it is not enough to say that you are different. i want to give you the opportunity to tell parents why they should be less fearful and scared of snapchat, tiktok and youtube. >> thank you for the opportunity. i think it is very important to understand snapchat is a very different platform. it was created by our founders to be an antidote to social media. our founders saw very early on what traditional social media can do to your self-esteem, to your feeling that you have to perform or be perfect for the world at all times. snapchat was a decidedly different platform that was private and safe, where people could come and actually talk to the friends they have in real life. you are not being judged on your perfect posts. posts are intended to be ephemeral, the way real-life conversation is ephemeral. it really strengthens relationships. for parents who may not be sure of snapchat, i would tell them: get on snapchat, understand what your kids are doing. we are going to make it easier for parents through parental controls to understand how their kids can be safe and private on snapchat. these parental controls, which we will be rolling out very soon, will help parents understand who their kids are talking to the most and what their children's privacy and location settings might be, so that children and parents can come together. >> let me interrupt by saying that everything you said is fine, but it is aspirational. just like facebook says we bring communities together, nothing to see here that is bad. i understand your goal, and for a lot of people it may be so, but i am giving you the opportunity to tell us how you protect kids in a way that facebook does not. >> it is not only aspirational but it is something we live and practice every day. in order to be friends with people, you have to be bidirectional friends.
there is no following or being pulled into conversations you were not invited into. these are two-way mutual friendships, and the average 14-year-old on snapchat has only about 30 friends. we don't try to push them to make connections with strangers. we want them to be connected to real life. >> thank you. >> let me answer that in three ways. first, we put people first, and as it relates to teens, we put their well-being first. there are some examples of things we have done. we make difficult policy and product choices that put the well-being of teens first. we don't allow direct messages for users under 16. their accounts are private. it is the same with the way we built our family controls. these are actions we have already put in place. we are not talking about aspirational things that we want to do; we're talking about things that we have already done. second, the approach of the company: being open and humble is part of our company culture. that means we are open to getting feedback from outside experts and parents about where we can improve. we are not always going to get it right. we want to take that feedback. third, i encourage parents to have conversations with their teenagers. ask them how they use tiktok and how they feel afterwards. what we're hearing from our audience is that tiktok is a joyful experience, inclusive, and makes teenagers feel better in difficult times during the pandemic. so have those conversations, and also take the opportunity to film a video together to facilitate that conversation, and recognize that tiktok is not a passive experience. it is about creativity and making fun videos, and that is something we have seen bring parents and teens together. i think those three things would answer your question. >> ms. miller. >> when users come to youtube, they look for all kinds of content, but as a company, we do not prioritize profits over safety. we created a safe space for kids on youtube kids. in addition, we are a transparent company. we rolled out our first transparency report in 2018 and we continue to update that on a quarterly basis, where we share additional metrics, and we rolled out the violative view rate statistic earlier this year.
we believe that being a company where responsibility is our number one priority, where we are constantly trying to balance freedom of expression with being responsible, is something that is ingrained in our dna, and we are very proud of the work we have done, but we know there is always more for us to do. >> i am going to leave it to the parents of america and the world whether or not they find your answers sufficient to distinguish you from facebook. i know with certainty that we need more than just, with all due respect, the answers that you have given to persuade me that we can rely on voluntary action by any of the companies.
i think you will hear a continuing drumbeat. i think there will be leaks and disclosures. i want to read a text. this came to my staff. i want to share this: i watched the entire hearing. i heard the senator refer to this experience. every time she opened the app, there was more. she could have been feeling better, but these videos brought her down again and led her to spend hours making her own similar video. i say this with all due respect, but i want people to know we are not taking at face value what you told us. these messages are the lived experiences that people are relating to us as we go to town meetings and talk to them on the phone and by email. that is why you are here. i think there are still a lot of answers to be given here. i would like to give every one of you an opportunity to make a closing statement if you would. >> thank you for the opportunity to appear today. i would absolutely agree with you that
voluntary measures are not adequate in and of themselves. we want to partner with the committee and with congress on what those regulations should look like. we believe that congress's role in pushing forward regulation in this space is necessary. thank you for the opportunity to talk to you today about how snapchat has been prioritizing and will continue to prioritize the safety and well-being of young people on our platform. we see this as the highest order priority. we will continue to press forward. >> this is one of the most important topics for all of us. we all have a responsibility to protect our teenagers. none of us can do it alone. it can't just be on us or congress. we have to come together and solve issues that exist both on and offline for teenagers. as we focus on legislation, we support stronger privacy rules. we have done that proactively on our own. we support congress acting on age-appropriate design and on inappropriate content. we would like to see congress act on that as well. >> thank you for inviting me to participate in today's hearing. i will just close by saying what i said at the top: there is no more important issue than the safety of kids online. our executives are committed to learning with you to make sure that the experience kids have online is one that is healthy and enjoyable. we will continue to work to earn the trust of parents and kids. thank you very much. >> i thank each of you for being here today. we have seen millions of dollars arrayed against us.
i hope we will look back on this hearing, along with others that we will continue to hold. thank you for being here today. this hearing is adjourned. we will hold the hearing record open for two weeks. any senators who would like to submit questions for the record should do so by november 9. i am also going to include a letter from the national teachers association. thank you all.