Snapchat, TikTok, YouTube Executives Testify on Kids Online Safety - Part... CSPAN, October 26, 2021, 9:05pm-11:08pm EDT
certain categories that we do not show to young adults or teenagers. sen. markey: so we need to go beyond privacy to tackle the design features that harm young people. take the like buttons. senator blumenthal and i have a bill, the kids act, which would ban these and other features that quantify popularity. the research is clear: this turns apps into virtual popularity contests. they are linked to feelings of rejection, low self-worth, and depression. even youtube kids has acknowledged the problem and does not have like buttons. should congress ban features that quantify popularity for kids, yes or no? >> as i mentioned in my opening statement, we have never had a like button, and so we would support that. we do not think it should be a popularity contest. sen. markey: you would
support that? >> this is more complex, and i am happy to have a conversation. we have implemented much of the design code in the united states and would encourage similar measures. sen. markey: i don't know that there was an answer in that. you said it is complicated. do you support banning it? >> if you want to set it by age, that is something to look at. sen. markey: ms. miller? ms. miller: we already prohibit being able to comment on youtube kids, and we would support working with you on regulation. sen. markey: you would support working with us, but would you support banning likes? ms. miller: senator, again, we do not allow for this on the youtube kids platform. sen. markey: again, the american academy of pediatrics just
declared a national state of emergency for children and teen mental health. we need to outlaw the online features that exacerbate this crisis. the question that we have to answer, ultimately, is whether or not, for example, we are going to ban autoplay for kids. that is the feature where, when one video ends, another begins. kids stay glued to their phones so the apps gather data and make more money. 82% of parents are worried about their kids' screen time. to each of you, today, do you agree congress should ban autoplay for kids? yes or no? ms. miller, we will start with you this time. ms. miller: senator, each of the items you are outlining, we already prohibit. we have the default set to autoplay off for kids as well as
for supervised experiences. sen. markey: would you support that being legislated? ms. miller: yes, sir. >> we have "take a break" videos and time management tools, but for autoplay, you have to switch to the next video. sen. markey: would you ban autoplay? >> we would be happy to talk to you about it. sen. markey: you don't do it. >> i think it is important as we look at the features for teens. it is something we built into tiktok proactively, but as we look at legislation, i think a first step is around an age verification process. sen. markey: this is a historic problem. would you support it? ms. stout: we do not have autoplay on snapchat, so that is something to look at more closely. i'm not familiar with that piece, the proposal in
your legislation. sen. markey: we have work to do, and we have to telescope the timeframe. chair blumenthal: thank you for your good work. senator thune. sen. thune: thank you, mr. chairman. we all know social media offers a lot of benefits and opportunities, but as has been expressed, i have concerns about the lack of transparency online and the limited accountability of big companies. a major problem in social media that is increasingly concerning is social media platforms' use of algorithms to manipulate the user experience. users get trapped in a filter bubble, which can be troubling
for younger users. a recent wall street journal article described in detail how tiktok serves up sex and drug videos to minors. i have a bill on the filter bubble and another that would make strides in addressing the lack of transparency online. importantly, the filter bubble transparency act would give users the option to not be manipulated by opaque algorithms. do you believe consumers should be able to use platforms without being manipulated by algorithms designed to keep them engaged on the platform? mr. beckerman: we agree there needs to be transparency in the way algorithms work, and individual choices for those who use them. ms. miller: we provide transparency in a way that works. ms. stout: it is important to
understand that what we apply algorithms to is a small subset of content. we provide transparency to users, where they get to select interest categories that determine the content they are served up. it is not an unlimited list or set of user generated content. it is narrow. sen. thune: i don't know if you answered the question. should consumers who use these social media platforms be able to use them without being manipulated by algorithms? ms. stout: senator, yes, i agree with you. ms. miller: yes, senator. mr. beckerman: sex and drugs are violations of our community guidelines and have no place on tiktok.
as it relates to the article, we disagree with that being an authentic experience. sen. thune: your platform is more driven by algorithms than any other social media platform available today, more so even than facebook. unlike facebook, tiktok's algorithm is not constrained by a social network. on july 19, 2020, the former ceo wrote that we believe all companies should disclose their algorithms, moderation policies, and data flows to regulators. tiktok also states on its website that it makes tiktok source code available for testing and evaluation at its transparency and accountability center. has tiktok disclosed its algorithms, moderation policies, and data flows to federal or state regulators? mr. beckerman: as we pointed out, we have transparency centers, and we have done maybe over 100 tours with members of
the senate and staff and others in the government, and we would be happy to be transparent about how that works. sen. thune: i think maybe senator blumenthal touched on this, but in keeping with tiktok's disclosure practices announced in july 2020, would you commit to providing your algorithms and moderation policies to the committee so we may have independent experts review them? mr. beckerman: yes. sen. thune: thank you. does youtube engage in efforts to change the attitudes and behaviors of its users, or influence them in any way? ms. miller: when users come to youtube, they come to search and discover content, like how to bake bread, watch a church service, or do exercise. as a result, they are introduced to a diversity of content that is not based on a network they are a part of. in so doing, there may be
additional videos recommended to them, but those signals will be overridden to make sure we are not recommending harmful content. sen. thune: back to mr. beckerman. all chinese internet companies are beholden to china's national intelligence law, which compels them to turn over data the government demands. that power is not limited by china's borders. has tiktok provided data to the chinese government on chinese persons living in the united states or elsewhere outside china? mr. beckerman: tiktok is not available in china, and i would like to point out our servers are stored in the united states. sen. thune: does tiktok censor videos of tank man, the famous video of the man who stood his ground in front of chinese army tanks during the 1989 tiananmen square crackdown in beijing? mr. beckerman: you can find that
content on tiktok if you search for it. sen. thune: mr. chairman, i would suggest that, as has been pointed out, there are a number of things we need to address. congress needs to be heard from in this space, particularly with respect to the use of algorithms and the way users are manipulated, particularly young people. i hope we can move quickly and directly and in a meaningful way to address this issue. thank you. chair blumenthal: i think we have strong consensus on that issue. senator baldwin. sen. baldwin: i would like to note that this series of hearings began with a revelation that internal research at facebook revealed the negative impact on
teenagers' body images from using the company's instagram platform, and we learned, based on research by the chairman's staff, how quickly someone on instagram can go from viewing content on healthy eating to being directed toward postings that focus on unhealthy practices, including glorifying eating disorders. i know we do not have facebook and instagram before us today. i am particularly concerned about the impact that type of content can have on young users. i recently joined senators klobuchar and capito in sponsoring the anna westin act to support training and education on eating disorders. we also sent a letter to facebook and instagram seeking more details
about how they handled this issue. i wanted to ask each of you: can you outline the steps your companies are taking to remove content that promotes unhealthy body image and eating disorders and to direct users to supportive resources instead, and how are you focusing on this issue with regard to your younger users? why don't we start with mr. beckerman and tiktok? mr. beckerman: thank you, senator. i have two young daughters, and this is something i care about. we aggressively remove content promoting eating disorders. we work with outside groups and direct people that are seeking help, and one thing we have heard is that people struggling with eating disorders or other weight loss issues come to tiktok to
express that in a positive way, so it has been a positive source, and we do not allow ads that target people based on weight loss and that kind of content. ms. stout: thank you, senator baldwin. i want to make clear the content you described, content that glorifies eating disorders or self-harm, is a complete violation of our community guidelines. as i described earlier, we do not allow unvetted, unmoderated content to be served up to our users. our media publisher platform, discover, which we partner on with publishers and media companies like wsj and nbc news, is vetted and moderated ahead of time. sen. baldwin: can i interrupt you and ask, is that through ai or humans? ms. stout: these are hand-picked partners that snapchat has selected to say, in this closed garden of content, discover, we
will allow publishers and media companies to provide news, entertainment, and content: espn, the washington post. users can come look at the content. it is pre-moderated and curated, so it is not an unlimited source of user generated content where you can go down a rabbit hole and access that kind of hurtful, damaging content on body image. you raise an interesting question: what are the products surfacing? how are we helping users find positive resources? we did conduct research on the effects of body image and self harm, and as a result we created a product called here for you. it was created in 2020 at the height of the pandemic. when users search anorexia or eating disorders, instead of being led to content that is harmful and against our guidelines, we now surface expert resources to show
content to help them or maybe their friends. it's a redirection of that search for potentially hurtful content that then steers the user to resources that may help them or a member of their circle of friends. ms. miller: senator, we take a comprehensive and holistic approach on topics like these. we prohibit content that promotes or glorifies things such as eating disorders. it has no place on our platform. we also realize users come to share stories about these experiences or to find a community, as well as to find authoritative resources, which is what we raise up on searches like this. in addition, we roll out programs and initiatives such as the "with me" campaign.
we encourage users to spend their time during covid pursuing healthy habits. we look at this holistically to make sure youtube is a platform where people come and have a healthy experience. we prohibit the type of content that glorifies or promotes these issues, such as eating disorders. sen. baldwin: if i could follow up the same way as with ms. stout: when you remove content, how? do you filter that out? do you use artificial intelligence or a team of people who are looking at the content and deciding whether to remove it? ms. miller: it is a mix. when we develop content policies, we rely on experts to inform the
development of the policies, and then we have machine learning to help us capture this type of content at scale. you'll see in our transparency report that more than 90% of content that violates community guidelines is flagged by a machine, and there is a mix of human reviewers. sen. baldwin: thank you. sen. cantwell: i want to thank senators blumenthal and blackburn for holding this subcommittee hearing. as witnesses can see, our colleagues are well-informed and very anxious to get legislative fixes to things they think are crucial to protecting individuals and to protecting people's privacy. i want to thank them for that. yesterday, vice had an article on location data gathered from gps data and apps, data that is given over even when people have opted out.
basically, we are going to enter this for the record unless there is objection. the news highlights a stark problem: smartphone users cannot be sure if some apps are respecting their preferences around data sharing. data transfer presents an issue for the location data companies themselves. these companies are reporting information about location even when people have explicitly opted out. they are continuing to collect this information. that is what the reporters and researchers found. i have a question: do you believe location data is sensitive and should be collected only with consumers' consent? >> we agree. >> agree. >> yes, senator.
for users, they have access to their account under my activity and my account and can modify their settings, delete their history, and things of that nature. sen. cantwell: any federal privacy law, we should make sure that is adhered to. i see a nod. >> yes, senator. >> yes, senator. sen. cantwell: thank you. do any of you share location data with the company that is in this article, huq? they are a major data company. >> i have never heard of that company and i'm not aware. mr. beckerman: i'm not aware of the company, but we do not collect gps data. sen. cantwell: would you be affiliated with them in some way? are they getting this information in any way? >> i am also not aware of the
company. sen. cantwell: maybe you can help us for the record on this so that we know, but this is exactly what the public is frustrated and concerned about, particularly when more can be done. they go to a website and say, i don't want my sensitive information to be shared, and then there is a conglomerate of data gathered on top of that that is not honoring those wishes as it relates to the interface. this is exactly why we need a strong privacy law and why we should protect consumers on this. in the facebook hearing, we had a discussion about advertising and the issue of whether advertisers knew exactly what the content was their ads appeared next to. we are also seeing a migration of major companies like procter & gamble and others moving off the internet because they are like, i'm done with it.
i do not want my ad appearing next to certain kinds of content. what was more startling is that there may be actual deceptive practices here, where people are saying this content is one thing when in reality it is something else, in some of these cases objectionable hate speech on facebook and content we do not think should be online, without the advertisers knowing. on your websites, do advertisers know what content they are being placed next to? >> i can respond to your question. our advertisers know where their advertising shows up, and as i had mentioned, discover, the closed, curated garden, those advertisements appear next to publishers and verified users we
have chosen to allow to appear. on a platform like snapchat, there is no broadcast of disinformation and hate speech, and that is why i think it is a very appealing place for advertisers, because they know where their advertisements will be placed. mr. beckerman: advertisers come to tiktok because our content is known for being so authentic and uplifting and fun. we see ads that are very much like tiktok videos. ms. miller: senator, we have worked with advertising partners over the years to make sure that they have confidence that advertising on youtube is safe for their clients, in the same way we have worked significantly to make sure users have a safe experience, and the advertising associations have recognized our
work in the space, so their brands are safe on the platform. sen. cantwell: senator lee. sen. lee: ms. miller, i would like to start with you if that is all right. i want to ask you a question regarding youtube's app age rating. google play has the rating set at teen, meaning 13 and up, while the apple app store has it rated at 17 and up. tell me why this disparity exists. if apple determines the age rating for youtube ought to be 17 and up, why should google rate its own app as teen? ms. miller: i am unfamiliar with the differences you outlined, but i would be happy to follow up with you and your staff once i get more details.
sen. lee: i would love to know about that. it is a simple question, and i understand you may not be able to answer it right now. you don't have the information, but i would like to know why that difference exists and whether you agree or disagree with the fact that google has rated its own app 13 and up while apple rated it 17 and up. i'm happy to follow up on that in writing or otherwise. i want to address a similar issue with regard to snapchat. snapchat is rated 12 and up on apple and teen on the google play store. any idea why there is a disparity? ms. stout: that is a good question. my understanding is the reason apple has it at 12 and up is that it is intended for a teen audience. sen. lee: why is there a
disparity between the age rating and the content available on the platform? ms. stout: the content that appears on snapchat is appropriate for an age group of 13 and above. sen. lee: let's talk about that for a minute, because i beg to differ. in the investigation for the discussion in this hearing, i had my staff create a snapchat account for a 13-year-old. they did not select any content preferences for the account. they entered a name, a birth year, and an email address. when they opened the discover page on snapchat, with the default settings, they were bombarded with content that i can most politely describe as
wildly inappropriate for a child, including recommendations for, among other things, an invite to play an online sexualized video game that markets itself to people who are 18 and up, tips on why you should not go to bars alone, notices for video games rated for ages 17 and up, and articles about porn stars. let me remind you that this inappropriate content that has by default been recommended for a child is something that was sent to them by an app, using the default settings. i respectfully but strongly beg to differ on your characterization that this content is in fact
suitable for children 13 and up, as you say. according to your website, discover is a list of recommended stories. how and why does snapchat choose these inappropriate stories to recommend to children? how does that happen, and why would that happen? ms. stout: senator, allow me to explain a little bit about discover. it is a closed content platform, and yes, we hand select the partners we work with. the content is designed to appear on discover and resonate with an audience 13 and above. i have taken notes about what you have said your account surfaced. i want to make clear that our content and community guidelines provide that any online sexualized video games should be for 18 and above, so i'm unclear why
that would have shown up in an account that was for a 14-year-old. these community guidelines, and the publisher guidelines on top of those guidelines, are intended to provide an age-appropriate experience for a 13-year-old. sen. lee: you have community guidelines, and advertisers and media partners agree to them. what are these additional guidelines? i mean, i can only guess that they permit these age-inappropriate articles to be shared with children. how would that not be the case? ms. stout: these additional guidelines provide, for example, that content may not glorify violence, that any news articles must be fact checked, that there is no -- sen. lee: i am sure the articles about the porn stars were
accurate, and i'm sure the tips on why not to go to bars alone were fact-checked, but that is not my question. it is whether this is appropriate for children 13 and up. ms. stout: i think this is an area where we are constantly evolving, and if there are instances where these publishers are surfacing content to an age cohort that is inappropriate, they will be removed from our platform. sen. lee: what kind of oversight do you conduct? ms. stout: we use human review and automated review. i would be interested in talking to you and your staff about what kind of content this was, because if it violates our guidelines, it would come down. one last thing. i would agree with you when it comes to the kind of content that is promoted on discover. there is no content that is illegal. none is hurtful. it really is intended to be --
where we have control over the content that surfaces. sen. lee: i have a follow-up question. snapchat has assured us it does not collect identifying data for advertising. how does snapchat decide what content is pushed to the top of the discover page? ms. stout: you have the ability to select preferences, and there are several interest categories a user can select or unselect: they like to watch movies, or they enjoy sports, or they are fans of country music. at any point, it is completely transparent, and a user has the ability to select what they like, and that determines the content that is surfaced. if there is content they do not like, they can uncheck or check a category, and that generates the kind of content a user in discover would
see. sen. lee: thank you so much. i think we should get to the bottom of this. these apps' ratings are inappropriate, and we know there is content on snapchat and youtube, among other places, that is not appropriate for children ages 12 or 13 and up. chair blumenthal: i would say, returning to my line of questioning, it is not accurate to tell advertisers that their content is not located next to content that is inappropriate. sen. lujan: in your testimony, you mentioned that all content on snapchat's spotlight page is human reviewed before it can be viewed by more than 25 people. yes or no: does human review help snapchat reduce the spread of potentially harmful content? ms. stout: we believe it does. sen. lujan: i appreciate
snapchat's approach to the problem. more platforms should work to stop harmful content from going viral. far too often, we find companies wanting: once attention is diverted and the public is distracted, they do the very thing they were warning us against. can i hold you to that? will snapchat continue to keep a human in the loop before content is algorithmically promoted to large audiences? ms. stout: this is the first time i have testified before congress, so please hold me to it, but we have taken a human moderation first approach, not just on spotlight but across the platform. human moderation will continue to play a huge part in how we moderate content on our platform and keep users safe. sen. lujan: the importance of platforms taking responsibility before they amplify content, and especially publish it to a mass audience,
is something many of us share. to protect americans from dangerous algorithms, online platforms must be responsible when they are actively promoting hateful content. ms. miller, i am grateful youtube is making an effort to be transparent regarding the number of users that view content in violation of community guidelines, but i'm concerned with a trend. i wrote a letter to youtube with 25 colleagues on the crisis of non-english misinformation on the platform. we need to make sure all communities, no matter their language, have the same access to good, reliable information. will youtube publish its violative view rates broken down by language? ms. miller: senator, thank you for your question. what you are referring to is the latest data point from this year, in
which for every 10,000 views on youtube, 19 to 21 of those views are of content that is violative, and we apply our content policies at a global scale, across languages. we do not preference any one language over another. this includes the violative view rate. sen. lujan: i do not know that that is good enough. when we do not break algorithms down across groups of people, we make existing gaps and biases worse. you see that with facial recognition technology that unfairly targeted communities of color, according to reports. we see this happening right now on youtube. will youtube publish its violative view rate broken down by language? ms. miller: senator, i would be happy to follow up with you to talk through these details.
for all of our content policies and the enforcement within, and the transparency we provide, it is global in scope and across languages. sen. lujan: i look forward to working with you in that space. before launching tiktok for younger users, did tiktok do any internal research to understand the impact it would have on young children? mr. beckerman: i am not aware of research on that. the content is curated so that it is age-appropriate, but i'm not aware of the research. sen. lujan: i would like to follow up. platforms like tiktok can lead to addictive behavior and body image issues. it is critical to understand these impacts before they take place. this is a serious issue finally getting the attention it deserves with the revelations of
whistleblowers that have come forward. i urge you to take this opportunity to get an evaluation of the impact your product is having on young children. in the end, i want to follow up on something many of us have commented on leading up to these important hearings. i appreciate the chair's attention to this. both he and the ranking member of the subcommittee have authored legislation and partnered on legislative initiatives. it is critically important that we continue moving forward, that we mark up legislation and get something adopted. i'm hopeful that here in the u.s., we pay attention to what is happening elsewhere. europe is outpacing the united states in being responsible with legislative initiatives surrounding protecting consumers. there is no reason we cannot do that here, and i want to thank the chair for the work she has done in this space.
i look forward to working with everyone to make sure we are able to get this done in the u.s. sen. cantwell: very much appreciate that and your leadership. we are awaiting the return of senator blumenthal, so i can go slow, or we can take a short recess, because we are way past the time to get over. i want to thank all of the members who have participated thus far, because we have had a robust discussion today. you can see this is a topic the members of the committee feel passionately about. they obviously believe there is more we need to be doing in this particular area. i appreciate everybody's attendance and focus. again, i want to thank senator blumenthal and senator blackburn for their leadership
in both of these hearings. for the larger committee, we had planned to move forward on many of these agenda items anyway, but we appreciate the subcommittee doing some of the work and having members have a chance to have very detailed interactions on these policies that we need to take action on. i very much appreciate that. i see senator blumenthal has returned. thank you so much, and i will turn it over to you. sen. blumenthal: thank you, chairman cantwell, and thank you for your work on these issues. i would like to ask some additional questions on legislative proposals. one of the suggestions that
senator klobuchar raised was legal responsibilities and liabilities, which are now precluded by section 230. let me ask each of you: would you support responsible measures like the earn it act i proposed, to impose some legal responsibility and liability by cutting back on the immunity that section 230 affords? ms. stout? ms. stout: we agree there should be an update to the intermediary platform liability law, and in fact, the last time this body addressed a reform, snap was a company that participated and helped draft the legislation, so we would welcome
another opportunity to work with you on that. sen. blumenthal: would you support the earn it act, which senator graham and i proposed, which imposes liability and affords victims the opportunity to take action against platforms that engage in child pornography related abuses? ms. stout: of course. we completely prohibit that kind of activity, and we actively look for it. we remove it when we find it. if you would allow me to get back to you; it has been a while since i worked on the earn it act. i recall you said senator graham introduced it, but we support your legislation. sen. blumenthal: you had the opportunity to say whether you supported it and you have not. will you commit to supporting it? ms. stout: senator, again, my memory is failing me a little bit, but i do believe the provisions of the earn it act
are in many respects provisions we supported. i would be happy to come back to you with a more specific answer. mr. beckerman: we agree there needs to be a higher degree of accountability and responsibility, particularly as it relates to content moderation, and that needs to be done in a way that allows platforms to moderate in an appropriate and aggressive way, to make sure the kind of content none of us want to see on the internet or any of our platforms is able to be removed. sen. blumenthal: do you support changes in section 230? mr. beckerman: there can and should be changes, but again, in a way that would allow companies like ours that are good actors, aggressively moderating our platforms in a responsible way, to continue to do so. sen. blumenthal: do you support the earn it act? mr. beckerman: we agree with the spirit and would be happy to work with you on the bill. sen. blumenthal: it was reported unanimously in
the last session. it has not changed significantly. did you support it then? mr. beckerman: the concern would be unintended consequences that lead to hampering a company's ability to remove and police content on its platforms. sen. blumenthal: is that a yes or no? mr. beckerman: maybe. sen. blumenthal: we have two maybes so far. ms. miller: i am aware of a number of proposals regarding potential updates to 230, and my team as well as other teams across google have been involved in the conversations regarding these proposals. we see 230 as the backbone of the internet, and it is what allows us to moderate content and make sure we are taking down content that leads to
potentially, eating disorders, for example, or what we talked about earlier regarding self-harm. we want to make sure we continue to have protections so we can moderate our platforms so that they are safe and healthy for users. i am aware of the earn it act and i know our staffs have been speaking, but i understand there are still ongoing discussions regarding some portions of the proposal. we also very much appreciate and understand the rationale as to why this was introduced, particularly around child safety. sen. blumenthal: is that a yes or no? ms. miller: we support the goals of the earn it act, but there are some details that are still being discussed. sen. blumenthal: as senator markey has said, this is the talk we have seen again and again and again and again. we support the goal, but that is
meaningless unless you support the legislation. it took a fight, literally a bareknuckle fight, to get through legislation that made an exception for liability on human trafficking. just one small piece of reform. and i join in the frustration of many of my colleagues that good intentions and endorsement of purposes are no substitute for actual endorsement. i would ask that each and every one of you support the earn it act, but also other specific measures that will provide for legal responsibility. and i know the claim that section 230
provides a backbone, but it is a backbone without real spine, because all it does is provide virtually limitless immunity to the internet and to the companies that are here. i am going to interrupt my second round and call on senator cruz. sen. cruz: mr. beckerman, thank you for being here today, and i understand this is the first time tiktok has testified before congress. i appreciate you making the company available to answer questions. in your testimony, you talked about the things tiktok is doing to protect kids online, and that is correct. i wanted to discuss the broader issue, the control the chinese communist party has on tiktok, its parent company and its
answer because you do not want to? mr. beckerman: i gave you the answer. sen. cruz: i want to be clear, because you are under oath. your answer is that it is not an "other affiliate of our corporate group"? this is a legal question with consequence. mr. beckerman: i understand the question. as i pointed out, tiktok is not available in china, and that is an entity for purposes of licenses to do business in china. sen. cruz: you are refusing to answer the question, for the record. mr. beckerman: i believe i answered your question. sen. cruz: you are not willing to say yes or no. mr. beckerman: it is not a yes or no question. i want to be precise. sen. cruz: is this company an "other affiliate" as defined in your policy? mr. beckerman: the way i answered, i am not aware that is
the answer to the question. sen. cruz: so you refuse to answer the question. that does not give any confidence that tiktok is doing anything other than participating in chinese propaganda and espionage. mr. beckerman: that is not accurate, and i -- sen. cruz: if it were not accurate, you would have answered the questions, and you have dodged the questions more than anyone else i have seen in my nine years in the senate. witnesses often try to dodge questions. you refuse to answer simple questions, and in my experience, when a witness does that, it is because they are hiding something. >> senator moran? sen. moran: mr. chairman, thank you very much. let me turn to ms. stout and ask a question about data privacy.
so, senator blumenthal and others have been working on a consumer data privacy bill for several years. i have introduced a bill that includes an appropriately scaled right for consumers to correct and erase data that is collected or processed by entities, including social media companies. ms. stout, i understand that snap allows users to correct their user data. would you please explain the decision to proactively provide this service? >> thank you, senator, for the question. we applaud the committee and your leadership on this issue. we fully support a federal comprehensive privacy bill and
we look forward to working with you and your staff on that. yes, senator, snap is designed with a privacy-centric focus from the outset. we do not collect a lot of data, and we believe in data minimization and short data retention periods. as you pointed out, we do not store content forever, and we believe in giving users transparency and control. that includes the ability to delete data, if they wish. we also have a tool in the app that allows users to download their data. it gives them any information they may have agreed to share, post, or put onto their snapchat account.
and other platforms who may not take the same position that snap has, what do you give up in your ability to earn revenue? what is it that you lose by doing that, if anything? >> we make tradeoffs every day that sometimes disadvantage our bottom line. there are no rules or regulations that require companies like snap to have short retention periods, or to choose, as we voluntarily do, a data minimization practice. it means advertisers may find other platforms more enticing, because those platforms keep a history of anything that has ever been shared or searched, or location data that has ever been provided, and that is not the case on snapchat. that is a trade-off, senator, because we believe in being more private and more safe. >> if you did otherwise, what would you gain by doing so? >> we just believe that we have a moral responsibility to limit the data we collect on people. >> your answer is fine, but what do you generate by keeping the data and in some other way using it? >> we limit ourselves in our ability to optimize those advertisements to make more money. we are a company that has not
yet turned a profit. we have reinvested every dollar into the company, and we are not here for gain. our desire is to make a platform that is safe for our community. >> thank you. and mr. beckerman, what responsibility do platforms have to prevent harmful social media trends from spreading? and how can tiktok improve its algorithm to better comply with their own terms of service to prevent criminal activity? mr. beckerman: thank you, senator. we do have a responsibility to moderate our platform along with community guidelines and to do that in a transparent way. >> where does that come from? mr. beckerman: it comes from doing the right thing. we want to be a transparent platform where people enjoy their experience. that starts with community guidelines.
certain content that we would not allow on the platform, we work really hard, i think. the safety teams are often the unsung heroes of the company. they work every day, 24/7, to make sure community guidelines are met and that the platform stays positive and joyful. >> the nature of my second question, the add-on question, was somewhat leading, because it suggests that you are not complying with your own terms of service in a way that enables criminal activity. my question is, how can you improve your algorithm to better accomplish that? maybe you would discount the premise of the question. mr. beckerman: no, senator, it's an important area where we want to get to 100%. we release transparency reports, and 94% of our removals of violative content are done proactively. much of it is done within 24 hours, before there are views. we strive for 100%, and that is something we fight for and work on every day. sen. moran: thank you.
thank you, chairman, and ranking member senator blackburn. this subcommittee is important to the nature of our country and to its future. thank you. >> thank you, senator moran. >> thank you, mr. chairman. ms. miller, i would like to come to you. you talk about moderation as a combination of machine learning and human review. it seems that youtube has no problem pulling down videos that question abortion or global warming or vaccine mandates, but child abuse videos remain on your site. so i am interested to know more about the specific inputs that you use in these reviews. who establishes these inputs? who oversees them to make sure that you get them right? >> senator, thank you for your question. so, we heavily invest in making sure that all of our users, and
particularly kids on the platform, have a safe experience. the way we do this is with a number of levers. for example, we have content policies as it relates to child safety on the platform -- so, not putting children into risky situations and videos on the platform. sen. blackburn: let me jump in, then, and ask you: if you don't want to put children into risky videos -- there is a world of self-harm content on your site. a few searches come up with videos, and i am quoting from searches that we have done: songs to slit your wrists by. vertical slit wrists. how to slit your wrists. and painless ways to commit suicide. now, that last video, painless
ways, was age-gated. but do the self-harm and suicide videos violate youtube's content guidelines, if you're saying you have these guidelines? >> senator, i would certainly like to follow up with you on the videos you may be referencing. we absolutely prohibit content regarding suicide or self-harm. sen. blackburn: i have to tell you, we have pulled these down in our office. our team has worked on this, because i think it is imperative that we take the steps that are necessary to prevent children and teens from seeing this content. and i just can't imagine that you all are continuing to allow children to figure out how to do this on your site -- how to carry out self-harm. so, why don't you follow up with me for more detail, and i would like the response in
writing. i also talked to a film producer friend this morning about the film trailer for i am not ashamed, which was based on the story of rachel scott, who was the first victim of the columbine attack. the film focused on her faith and how it helped her in her life. so why would you remove this film trailer and block its distributor from being on your site? and you did this for 11 months. you did not put the trailer back up until a hollywood reporter called and said, why have you done this? do you have an answer on that one? >> senator, i'm sorry, but i'm not familiar with that specific -- sen. blackburn: ok, then let's review this,
and we will submit the documentation that was sent over to me. ms. stout, over to you. we had an issue in memphis with a 48-year-old man who was on your site. he related to a 16-year-old memphis teenager. he claimed to be a music producer, and he lured her into this relationship. one of the news articles recently called snapchat the app of choice for sexual predators. this is something that is of tremendous concern. much of it, from what i understand from talking to moms and grandmoms, is they use the snap map location service. i know you are probably going to say that only your friends can follow you. somehow people are getting around that, and these sexual predators, who are following young people, are using this map to get to their location. so, we had this in memphis with the rape.
we have another child, in the middle part of the state, whom the predator followed, and she tried to commit suicide because she knew her family was going to find out. so are you taking steps? do you want to give me a written answer as to the steps you all are taking? how are you going to get a handle on this? this is endangering young women. >> senator, i'm more than happy to give you a detailed written answer. i appreciate your leadership in following up on this issue. i want to make crystal clear that the exploitation of minors is deplorable, and we work to prevent this from happening on our platform. with respect to the map, yes, indeed, appearing on the map is off, by default, for everyone -- not only for minors but for everyone. so
in order to appear to someone and share your location, you must be bidirectional friends with that person. i will say, with respect to grooming, this is an area where we spend time and resources on prevention. snapchat makes it intentionally difficult for strangers to find people they do not know. we don't have open profiles or browsable pictures. we don't have the ability to see who people's friends are and where they go to school. i'd be more than happy to give you more detail. sen. blackburn: let's get some more detail. and one question for all three of you, and you can answer in writing, if you choose. you have all talked about the research work that you do. do you get parental consent when doing research on children? and can you provide us a copy of the parental consent form? we asked facebook for this and they punted the question repeatedly. and ms. miller, i think you need
to provide the committee clarity. you said you had never spoken out against online privacy. i think the chairman and i may question that a little bit. my question to you, and you can come back to us with more depth on this: did you fight it as part of the internet association, as they were fighting privacy on your behalf? and just one thing to wrap up, mr. chairman, going back to mr. beckerman. there is confusion around the ownership of tiktok, with the parent company being bytedance. they have a financial stake in
bytedance. douyin has an interest. i checked and know that tiktok data is stored in the u.s. and singapore. until recently, singapore data centers were run by alibaba, which is another chinese company. what we need from you, mr. beckerman, is clarity on ownership and the sharing processes around u.s. data servers, especially the information of our children. with that i will yield back. thank you, mr. chairman. >> thank you, senator blackburn. we are going to finish the first round with senator sullivan. i am going to go vote and should return as senator sullivan finishes.
in the meantime, senator blackburn will preside. thank you. >> thank you, mr. chairman. mr. beckerman, i know there are concerns about sharing of data with the chinese communist party, given the ownership, or at least board influence. senator blackburn was just talking about that, and i know senator cruz raised the issues. i want to raise a related subject. it's what i refer to as kowtow capitalism, and i think that you guys are exhibit a of kowtow capitalism. what is that? it's american executives of american companies censoring americans' first amendment rights in america so as not to offend the chinese government and/or to
gain access to the chinese market. we see it on wall street and in the nba. we see it in the movie industry. let me ask a couple of questions regarding that. a tiktok user could put up a video that criticizes the chairman of this committee, the ranking member, or any senator -- president biden or former president trump -- couldn't they? not some horrible violent suggestion, but just a criticism of an elected official. is that common? mr. beckerman: actually, tiktok doesn't allow political ads, but political content like that wouldn't be a violation of our community guidelines, as long as it's not disinformation or something hateful. >> good, that's free speech.
that is free speech. could a tiktok user put up a video criticizing xi jinping? he has been compared to winnie the pooh. could a tiktok user reference him with winnie the pooh? i don't know why he doesn't like winnie the pooh. could a tiktok user do that? mr. beckerman: yes, senator. community guidelines are done by our team in california, and the moderation is done here. >> ok. in 2019, you admitted to censoring videos regarding tibetan independence. can a tiktok user mention tibetan independence?
mr. beckerman: yes, senator. >> what happened in 2019? mr. beckerman: i can assure you it's not a violation of the community guidelines and would be allowed on the platform. >> i think there was a tiktok public policy director in 2020 who admitted that tiktok had previously censored, for the ccp, information about forced labor. is that true? mr. beckerman: that would not be in violation of community guidelines, and it would be permitted. >> so you're saying that your videos have never been censored by the chinese communist party on any matter? i'm just reading these, and maybe they're all wrong. mr. beckerman: i can assure you that our content moderation teams are led by americans. our community guidelines are public and transparent, and content that is critical of any government, frankly, as long as it is not disinformation or hateful speech, would be allowed.
and i would encourage you to search for a number of the examples you mentioned today on tiktok, and i'm sure you could find them. >> so the tibetan independence and the muslim forced labor issues are not censored? >> not currently. >> were they previously? i'm reading here that in 2019 and 2020 somebody with tiktok admitted doing that. >> i'm not aware of that, and that's not how we moderate content. >> listen, i do think this issue of kowtow capitalism, where american companies are censoring americans, which we see all the time, is an issue we should be looking at. because the chinese communist party, of course, can crush freedom of speech in their own country, but they shouldn't be
able to crush it in this country. and mr. beckerman, i'm glad that you're denying any of this. i look forward to seeing videos that are negative against the chinese communist party. i'm not really holding my breath, but maybe it's true. maybe they're fine with videos criticizing xi jinping and other members of the communist party. so you're saying that's fine and acceptable policy for tiktok users? >> yes, and just to be clear, there's no involvement from the chinese communist party in tiktok moderation, and it's all done in the united states. >> thank you. >> thank you, senator sullivan, and we are going to continue looking at these issues on the chinese
communist party's influence into u.s. technology and into the silencing and censoring of free speech of u.s. citizens online. >> great. it's very important, and i appreciate that. >> senator, you are recognized. >> thank you. i will start directly with questions, and if i have time left, i would like to read a statement into the record. i will start with a question for ms. miller. youtube has implemented several features, such as autoplay, that make the platform difficult to stop using. what mechanisms has youtube employed to make sure that children, specifically, can counteract these design decisions? and do you believe those controls are sufficient? >> senator, thank you for your question.
autoplay defaults to off on youtube kids and on supervised experiences on youtube. we have set those to default off. we do allow the default to be changed to allow for autoplay -- for example, if a family is in a car and the parents have decided they want autoplay to continue -- but we have set it to a default of off. >> do you believe that's sufficient? >> i think it's one of a number of tools that are important to make sure that kids have a healthy and safe experience on the platform. another is that we do not deliver targeted advertising on youtube kids. another is that we age-gate content to make sure that minors do not see age-inappropriate material. it's only with a number of tools and protocols in place that we think we are meeting the bar we have set for ourselves and that parents expect of us. experts in the fields of child development advise us to make
relates to data, many of our peers do more. the keystroke measure isn't collecting what people are typing; it is an anti-fraud and anti-spam measure that measures the cadence of typing, because a bot behaves differently than a human. sen. lummis: which companies, that you are aware of, collect keystroke information? mr. beckerman: i would probably refer to facebook and instagram, for example. >> well, i'll ask the same question of them. thank you. >> are your platforms designed to keep users engaged as long as is
possible? do you want to start, mr. beckerman? mr. beckerman: senator, we want to make sure people are having an entertaining experience, like tv or movies. tiktok is meant to be entertaining. we have take-a-break videos and time management tools, and family pairing is another tool that parents can use to limit the time spent on the app. >> but is the length of engagement a metric that your company uses to define success? mr. beckerman: there are multiple definitions of success, senator, and it's not just how much time someone is spending on the app. >> is length of engagement one of the metrics? mr. beckerman: overall engagement is more important. >> is it one of the metrics? >> it's a metric that many platforms check on. >> thank you. ms. stout, same question. are your platforms designed to keep users engaged as long as possible?
ms. stout: when you open snapchat, you don't open to a feed designed to draw you into more and more content. you start with a blank camera or canvas, where users come to talk to their friends with videos and pictures. >> but is it a metric that the company incorporates into your definition of success? >> i believe we see success when the platform is facilitating conversations. snapchat is a place where people come to talk to their friends. >> but is it a metric? my question is, do you measure success in any way, shape, or form by how long people stay on your site? is that one of multiple driving measures of success? >> i think the way i can answer that question is, it is one of many metrics. >> ok, thanks. ms. miller, same question. are your platforms designed to keep users engaged as long as is possible? ms. miller: senator, our platforms are designed to allow users to search and discover all types of content. it is intended for them to have an enjoyable experience. >> i get it, but i'm asking, is
this one of the metrics by which you define success? >> senator, we have a number of digital well-being tools designed -- >> is this one of the metrics? there could be numerous metrics, but is this one of them? >> yes, to the specific question you're asking: we do look, for example, at whether a video is watched through its entirety. that helps us determine the quality of a video relative to the search the user made. we look at the data points to inform us as it relates to the experience the user has had on the platform. >> thank you. madam chairman, do i have time to enter an opening statement? thank you. this era of children will grow up under a level of surveillance well beyond any previous one. the wall street journal reports focused on the problematic practices of facebook. we know that the problem is that children are impressionable. children are manipulated by advertising targeted to them and are readily influenced by highly sophisticated algorithms. these invisible algorithms continually nudge our children in different directions, which can impact their development without their knowledge and without their parents' knowledge. these algorithms on these platforms were designed with adults in mind, not children. only a tiny fraction of children understand the harm that can come from sharing sensitive information, pictures, or opinions that become a part of their permanent digital record. but what is most alarming is that none of them can fully
understand how the content fed to them by algorithms will shape their worldview during these formative years. so more must be done to promote responsible social media. we must educate parents on how to teach their children to avoid the pitfalls of using these platforms, and more importantly, we must hold these platforms accountable for the effects their design decisions have on our children. mr. chairman, thank you for the opportunity to add that opening statement to the record, and thank you for your indulgence. i yield back. >> thank you, and thank you for your leadership on these issues and your insightful questions. we do have to protect kids in our country. you just put your finger on it. let me ask this of everyone on the
panel here today. kids are constantly posting content, and their data is being tracked, stored, and monetized, but we know young users lack the cognitive ability to grasp that their posts are going to live online forever. to each of our witnesses: do you agree that congress should give children and teenagers -- and more importantly, their parents -- the ability, the right, to erase their online data? mr. beckerman? >> yes, senator. >> ms. stout? ms. stout: yes, we do, senator, but i must say that content on snapchat does not appear permanently. >> i appreciate that. to you, ms. miller. >> yes, senator, and children
have the ability to delete their information, as well as having auto-delete tools. >> so you agree that they should have the right to delete it? >> yes, senator. >> today, apps collect troves of information about kids that have nothing to do with the app's service. one gaming app that allows children to race cartoon cars with animal drivers has amassed huge amounts of kids' data unrelated to the game, including location and browsing history. why do apps gobble up as much information as they can about kids? it is to make money. congress, in my opinion, has to step in and correct this harmful misuse of data. ms. miller, do you agree that platforms should stop data collection that has nothing to do with fulfilling the app's service? ms. miller: senator, we do limit the data that we collect, and this is particularly true for
the youtube kids app. it only relies on what is necessary to make sure that the platform works. >> so do you agree that that should become a law, that all platforms have to do the same thing? >> i don't want to speak to whether or not it should become a law, or to the details of any proposed legislation, but at youtube we have not waited for a law to make sure we have these protections. >> i appreciate that, but it's time to make up your minds. yes or no on legislation. mr. beckerman? mr. beckerman: yes, senator, we do need legislation. i think we are overdue for strong national privacy laws. >> great. ms. stout? ms. stout: yes, senator. we collect very limited data, and collection of data that is irrelevant to the performance of the app does not appear to be within scope for us.
>> today, popular influencers peddle products online while they flaunt their lavish lifestyles to young users. online, child influencers opening new toys get millions of views, and these videos are inherently manipulative to young kids, who often cannot tell that they are paid advertisements that their heroes are pushing -- that their hero is getting a monetary kickback from. my bill with senator blumenthal, the kids act, would ban this kind of influencer marketing to kids. to each of the witnesses: do you agree that congress should pass legislation to stop influencer marketing to children in our country? yes or no, ms. miller. ms. miller: senator, again, we've actually moved in this direction, whereby we have a
set of quality principles regarding the type of content that is made available on the youtube kids app. in so doing, we make sure that we are not providing content that would have a significant prevalence of that type of material, and we limit the types of ads that can be delivered. i don't have the details. >> should we make it illegal, so that people out there who might be trying to influence children know there is an enforceable penalty? ms. miller: i absolutely think it's worth a discussion. we need to see the details of such a bill. >> it's been around for a long time. mr. beckerman? >> yes, senator. we already limit the kinds of advertisements, but we agree there should be additional transparency and privacy laws. >> ms. stout?
>> senator, i would agree that for young people, i think there could be additional protections placed. >> by the way, we ban it on television because we know we can't have a hero holding a product. thank you. and finally, push alerts. studies show that 70% of teenagers report checking social media multiple times a day, and excessive use of social media has been linked to depression, anxiety, and feelings of isolation. the last thing apps should be doing is using methods like push notifications -- automated messages that nudge users to open an app -- to make young users spend even more time online. to each of the witnesses: do you agree congress should pass, again, the law that the senator and i are trying to move, which would ban push alerts for
children? ms. miller? ms. miller: i agree additional protections should be in place. >> thank you. mr. beckerman? >> we already limit push notifications. >> should we ban them? >> i think that would be appropriate. we've already done that. >> ms. stout? >> snapchat does not use those kinds of notifications, consistent with the uk age-appropriate design code. we would be in agreement with that legislation. thank you. thank you, mr. chairman. >> thank you, senator markey. i understand senator klobuchar has a few more questions, and while we're waiting for her, i have a few too. so i appreciate your patience.
let me begin by acknowledging -- and i think you would acknowledge as well -- that the reason you've made many of the changes you have is the u.k.'s child safety law, the age-appropriate design code. i think we need an american version of the british child safety law, and we have talked about some of its provisions. will you commit to supporting a child safety law that obligates companies to act in the best interests of children, and establishes, in effect, a duty of care that could be legally enforceable? ms. stout? ms. stout: senator, we were very
privileged to be able to work with the information commissioner's office in the uk on their design of the age-appropriate design code. we, of course, complied with the code as it came into force this year, and i mentioned we are looking actively at that code to see how we can apply it outside the u.k. market, to many of our other markets. so with respect to a child safety law that obligates companies to think about the protection and safety of children, that is something the company has done without regulation, and we would be happy to work with the committee. >> but the point is -- and we would love to be in a world where we could rely on voluntary actions -- in effect, you have sacrificed that claim to voluntary action, or our reliance on your voluntary action. whether it's facebook or your companies in various ways, i think you have shown that we can't trust big tech to police itself. so when you say, we already do it -- well, you may think so, but there's no legal obligation to do it and no way to hold you accountable under current law. that's what we need, and that's why i'm asking you about a law. i'm hoping that you would support it, that your answer is yes, that you would support a duty of care as a matter of law. >> yes, senator, and that was very much a part of my opening statement: because of the time
it takes regulation to be implemented, we don't believe we should have to wait for that regulation. we will take the voluntary steps to best protect users. mr. beckerman: we have voluntarily implemented much of the age-appropriate design code here in the united states. i agree that companies can do more, and i was struck by your comments in your opening statement about a race to the top. that is the approach we are trying to take: to do more, to go above and beyond, and to be a place where we are putting the wellness and safety of teenagers ahead of other platforms. >> let me see if i can answer the question as i would if i were in your position: yes, we strongly and enthusiastically support that kind of child safety law. we are already doing more than we would need to do under the law. mr. beckerman: yes, senator, and i do think provisions relating to age are something that should be included in measures like that
>> ms. miller? >> senator, i respectfully disagree that we only wait until we have legal obligations to put systems and practices in place. for example, we rolled out youtube kids in 2015 to make sure that, as kids were trying to be on the main platform, we created a space that was particularly for them and their safety. we've rolled out a number of -- >> i'm going to interrupt you, ms. miller, because i think you misinterpret me. i'm not suggesting you wait; i'm suggesting you do it now. i think mr. beckerman and ms. stout perfectly understood my question, and i think both of them would support that law. would you, yes or no?
>> i would support looking at any details as it relates to additional legal protections for kids in the united states. as you may know, the age-appropriate design code went into effect in the uk over a month ago, so it's still early days, but a number of the protections required in the u.k., we have rolled out globally. >> is that a yes or a no? ms. miller: yes, i would be happy to support a -- >> would you support a version of the u.k. child safety law? ms. miller: i would need to look at the details of any specific bill, but i certainly support expansions of child safety protections. >> i will yield to the senator. sen. klobuchar: thank you. thank you for taking me remotely for the second round.
one of the things i'm trying to do with all of these hearings, including in the judiciary antitrust subcommittee that i chair, is taking the veil off this idea that this is just the web, everyone has fun and it's just a cool thing. some of that's true, but it's also a huge profit-making venture, and when you look at it that way, it has produced some of the most successful and biggest companies the world has ever known in the big tech platforms, in terms of money. i just start with this question, which i have asked many platforms, the larger platforms. ms. stout, snap reported revenue per user in north america for the second quarter of 2021 was $7.37. how much of your revenue came from users under the age of 18?
ms. stout: i don't have that information for you, but i'm happy to take that back. sen. klobuchar: i appreciate that. i have been trying to get that from a number of companies. just to give you a sense, facebook's revenue per user, from their own documents, is $51 per user in the u.s. per quarter. just to put it in some perspective. mr. beckerman, tiktok is a privately held company so we
don't have public documents on your advertising revenue per user. what is your best estimate for the last quarter? >> i don't have those numbers, but i'd be happy to go back and check on those. >> do you think you can provide us with that? >> again, we're not a public company, but i'll go back and see what we can find for you. >> ok. again, i'm trying to figure out the users under age 18, how much of the business and future growth of the business is in kids. ms. miller, youtube reported that its advertising revenue overall in the second quarter of 2021 was $7 billion. how much of youtube's revenue came from users under the age of 18?
ms. miller: senator, i don't know the answer to that, and i'm not sure if we look at data internally that way. i would be happy to follow up with you. i would like to note that as a company, we have long shared our revenue with our creators, so over the last three years we have paid out more than $30 billion to our creators. sen. klobuchar: ok. my work with the antitrust subcommittee takes me in related paths, and i recently introduced the bipartisan american innovation and choice online act with senator grassley and others, including senator blumenthal, who are co-sponsors of that legislation. it's focused on a gnarly problem, which is that you have platforms that are self-preferencing their own stuff at the top and taking, in some cases, data that they uniquely have on other products and making knockoff products and underpricing the competitors.
ms. miller, roku says youtube has made unfair demands in negotiations for carrying the app on roku, including preferencing the search results, which is exactly what this legislation addresses, and demanding access for youtube to nonpublic data from roku users. did youtube make these demands? ms. miller: senator, i am not involved in the negotiations with roku. i know we've been having discussions with them for several months and we are trying to come to a resolution, but i'm not involved in the negotiations. sen. klobuchar: ok, i'll put this on the record for others, because also i would like to know more generally if
youtube has ever demanded nonpublic data or preferences in search results in negotiations with other providers. it gives you a sense of being the dominant platform by far in the area of search and being able to use that power over people who are simply trying to be on that platform. i also had a follow-up on youtube banning all vaccine misinformation, which i commended you on at the time. how much content have you removed since you banned all vaccine misinformation? have you seen a change in the viewership rate? ms. miller: senator, i feel bad, this is a question i would absolutely love to answer, but i know that we have removed -- reviewed
so much video as it relates to covid-19. sen. klobuchar: we'll put it in writing. my last question, ms. stout, mr. beckerman. i just mentioned this bill that we have introduced with 10 other cosponsors aimed at ensuring dominant digital platforms don't use their market power, but --. do you support some of the competition reforms in the bill, and have you faced any challenges when it comes to competing with the largest digital platforms? that will be my last question. >> senator, we are aware of the bill you introduced and, as a non-dominant platform, very much appreciate your legislation and the work that you are doing. as a smaller platform, it is an incredibly competitive arena. we compete every day with companies that collect more information on users and store that information to monetize.
any effort that this body, and especially the legislation that you have undertaken, makes to create an equal playing field so it is a competitive atmosphere, we would very much welcome. sen. klobuchar: thanks. mr. beckerman? mr. beckerman: likewise, we appreciate your efforts and the work you have done to promote and to spur competition. that is something that we all benefit from. as it relates to specific challenges we face, i would be happy to discuss those issues in detail. sen. klobuchar: thank you very much. thank you, everybody. >> senator, i found the answer, if you don't mind. we've reviewed over a million videos since we started rolling out our covid misinformation policies, and over 130,000 videos as it relates to vaccine misinformation. it is an area we put a lot of effort behind. thank you.
sen. klobuchar: ok, thank you. thanks, everybody. >> thank you, senator klobuchar. i seem to be the last person standing, or sitting, and i do have some questions. ms. stout and mr. beckerman, over 10 million teens use your apps. these are extremely impressionable young people, and they are a highly lucrative market. you've made tens of, probably hundreds of, millions from them. there is a term, snapchat dysmorphia. kids use the filters that are
offered and create destructive, harmful expectations. have you studied the impact of these filters on teens' mental health? i assume you study the impact of these filters before you put them in front of kids. ms. stout? ms. stout: senator, the technology that you're referring to is what we call lenses, and these lenses are augmented reality filters that we allow users who choose to use them to apply over top of selfies, and for those who use them, there's the opportunity to put on a dog face or -- >> i've seen them, but they also potentially change one's appearance to make them thinner, different colors, different skin
tones. ms. stout: so these filters are created both by snapchat and our creator community. there are over 5 million of these augmented reality filters, and a very small percentage of those filters are what you would call beautification filters. most of them are silly, fun, entertaining filters that people use to lower the barrier of conversation. because, again, when you're using snapchat, you're not posting anything permanently; you are using those filters in a private way to exchange a text or a video message with a friend. so it really creates this fun, authentic ability to communicate with your friends, and those filters are one of the many ways in which friends love to communicate with each other. >> do you study the impact on kids before you offer them? have you studied them? ms. stout: we do a considerable amount of
research on our products, and i think it was one of the pieces of research that was revealed at your earlier hearing that shows that filters or lenses are intended to be fun and silly. it's not about -- >> we all know as parents that something intended to be fun and silly can easily become something that is dangerous and depressing. that is the simple fact about these filters. not all of them, maybe not the majority of them, but some of them. and i'd like you to provide the research that you've done, and i will ask for other research, but i'm gathering from what you said that you don't do the research before you provide them? ms. stout: no, senator, that's not what i said, and particularly with respect to this question, i'm not able to answer as to the kind of research because i'm not aware. i will go back and look and
try to get you an answer to your question. >> you know, for eight years, snapchat had a speed filter. it allowed users to add their speed to videos, as you know. the result was that it encouraged teens to race their cars at reckless speeds. there were several fatal crashes associated with teens using the speed filter. it took years and multiple deaths for snapchat to finally remove that dangerous speed filter. it was silly, it was maybe fun for some people. it was catastrophic for others. and i want to raise what happened to carson rye, from
darien. carson was relentlessly bullied through anonymous apps. after trying desperately to stop the abuse, with pleas for help, he took his own life. silly, fun. how can parents protect kids from relentless bullying that, as they have said so movingly, no longer stops at the schoolhouse door? 24/7, it comes into their homes, even just before they go to sleep. what are you doing to stop bullying on snapchat? ms. stout: senator, this is an incredibly moving issue for me as a parent as well. bullying is unfortunately something we see more and more happening to our kids, and they do not just face it at school. they face it at home. we have zero tolerance for bullying or harassment of any kind, and we see this as a
responsibility to get in front of this issue and do everything we can to stop it. so again, because snapchat is designed differently, you don't have the same opportunities for abuse on snapchat. you don't have public permanent posts where people can like or comment, thereby introducing additional opportunities for public bullying or public shaming. that's not to say that bullying doesn't happen, both online and offline. so we have reporting tools where users can easily and quickly report bullying or any other harmful activity. and i would say the trust and safety teams that work around the clock work to remove the content on average in less than two hours; usually it is far more quickly than that. i want to assure you -- thank you for raising the bullying issue.
in addition to combating this practice, we do a lot to prevent it, and we think there should be more opportunities to raise awareness of the effects of bullying and the effects of words on other people, and we will continue to make that commitment. >> we've heard from tiktok that it is a safe environment. at the same time, we see challenges. the blackout challenge, just to take one example, where, in effect, teens and children have actually died emulating and recording themselves following blackout and choking challenges, including a 9-year-old in tennessee. despite apparent efforts to discourage certain dangerous trends, the content is still being posted and being viewed
widely by kids online, and followed and emulated, whether it was destruction of school property or other kinds of challenges. the mother who lost her child from choking shared questions with me, and they are questions that deserve answers from you and others who are here today: how can parents be confident that tiktok, snapchat and youtube will not continue to host and push dangerous and deadly challenges to our kids? >> thank you, senator. as it relates to this specific incident in particular, it is sad and tragic. i remember when i was in grade school, i had a classmate that we lost to something very similar. i know it is something that
touches many of our lives. it is awful. as it relates to tiktok, this is not content we have been able to find on our platform. it is not content we would allow on our platform. it is important that we all remain vigilant to ensure that things that are dangerous or even deadly for teenagers don't find their way onto platforms. it is important we all have conversations with our teenagers to make sure they stay safe offline and online. it is a responsibility we take very seriously. we want to distinguish what content is on which platforms. we have not been able to find any evidence of a blackout challenge on tiktok at all. it would violate our guidelines.
we have found absolutely no evidence of it. it is something we have had conversations with parents about. >> other challenges, you're saying none of them? >> anything that is illegal or dangerous violates our guidelines. with all of this, as things pop up on the platform, over 94% of things on the -- dangerous challenges have no place on tiktok. >> you react by taking them down, but they exist for the time that they are there. what can you do to prevent those challenges from being there in the first place? >> we are often able to be
proactive in blocking and removing content. something we have seen recently is press reports about alleged challenges that, when fact checkers like snopes and other independent organizations look into them, they find never existed on tiktok in the first place -- they were hoaxes that originated on other platforms. we need to look at the facts and see what content actually exists rather than spreading rumors about alleged challenges. >> we found pass-out videos. we found them. let me ask you about another
instance. they were inundated with tiktok videos about eating disorders. my staff made a tiktok account and spent hours scrolling through the endless videos. tiktok began by showing us videos we cannot show in this room because they were so explicit. another tiktok account was flooded with nothing but sexually obscene videos.
how do you explain to parents why tiktok is inundating them with videos of self-injury and eating disorders? >> i can't speak to what the examples were from your staff, but i can assure you that is not the normal experience that teens or people who use tiktok would get. that kind of content is removed. >> would you support limits on features that lead to addictive use? >> we have already done a number of things proactively in the form of
take a break videos. it is important that we look at these issues and have conversations and foster these conversations with our teenagers, and that's one of the reasons we built these tools. i know it can be daunting for parents of teenagers to have these different tools, and we built this in a way where parents can have additional control of the safety and privacy settings for teenagers. >> i have to go vote and i don't have anyone to preside here for me, so i suggest taking a five-minute recess. i will have some final questions if you would be willing to wait. then we can close here. thank you.