tv Facebook Whistleblower Testifies on Protecting Kids Online CSPAN October 6, 2021 1:01am-2:44am EDT
>> facebook whistleblower frances haugen testified at a senate hearing about her experience at the social media company and how to protect children online. the subcommittee hearing is three hours. >> the meeting and hearing for
the subcommittee will come to order. i am very pleased to welcome my colleagues, and i want to thank ranking member senator blackburn for her cooperation and collaboration. the ranking member who is here, senator wicker, as well as our chairwoman, maria cantwell. senator cantwell, i am sure, will be here shortly. i would like to thank our witness, frances haugen, for being here, and the counsel who are representing her today. i want to give you my heartfelt gratitude for your courage and strength in coming forward, as you have done, standing up to one of the most powerful corporate giants in the history of the
world, without any exaggeration. you have a compelling, credible voice, we have heard already, but you are not here alone. you are armed with documents and evidence, and you speak volumes, as they do, about how facebook has put profits ahead of people. among other revelations, the information you have provided to congress is powerful proof that facebook knew its products were harming teenagers. there is a question, which i hope you will discuss, as to whether there is such a thing as a safe algorithm.
facebook saw teens creating secret accounts that are often hidden from their parents as, in facebook's own words, a unique value proposition, driving up shareholder value at the expense of safety. it doubled down on targeting children, pushing products onto pre-teens that it knows are harmful to our kids' mental health and well-being. instead of telling parents, facebook concealed the facts and stonewalled and blocked this information from becoming public, including to this committee when senator blackburn and i specifically asked the company. and still even now, as of just last thursday, when a facebook witness came forward and told us
when it might decide to disclose additional documents. they continued their tactics even after they knew the damage they caused. they continued to profit from it. the profit was more important than the pain. last thursday, the message from the facebook global head of safety was simple: "this research is not a bombshell," and she repeated the line, "not a bombshell." this research is the very definition of a bombshell. facebook and big tech are facing a big tobacco moment, a moment
of reckoning. i saw big tobacco, and i helped lead the states in that legal action, and i remember very, very well the moment in the course of our litigation when we learned of those files that showed not only that big tobacco knew its product caused cancer, but that they had done the research and concealed the files, and now we knew, and the world knew. big tech now faces that big tobacco, jaw-dropping moment of truth. it is documented proof that facebook knows its products can be addictive and toxic to children, and it's not just that they made money. again, it is that they valued their profit more than the pain that they caused to children and
their families. the damage to self-interest and self-worth inflicted by facebook today will haunt a generation. feelings of inadequacy, rejection, and self-hatred will face our children for years to come. teenagers today looking at themselves in the mirror feel doubt and insecurity. mark zuckerberg ought to be looking at himself in the mirror today, and yet rather than taking responsibility and showing leadership, mr. zuckerberg is going sailing. his new modus operandi is no apologies, no actions, nothing to
see here. mark zuckerberg, you need to come before this committee and explain to frances haugen, to us, to the world, and to the parents of america what you were doing and why you did it. his business model is straightforward: more eyeballs, more dollars. everything facebook does is to add more users and keep them on the apps for longer. in order to hook us, it uses our private information to precisely target us with content and recommendations, assessing what will provoke a reaction to keep us scrolling. far too often, they encourage the most destructive and dangerous behaviors. as we showed on thursday, we created a fake account.
my office and i did, as a teen interested in extreme dieting and eating disorders. instagram latched onto that teenager's initial insecurities and pushed more content and recommendations glorifying eating disorders. that is how instagram's algorithms can push teens into darker and darker places. facebook's own researchers call it instagram's "perfect storm." it is exacerbating downward spirals. facebook has been maximizing profits. facebook's failure to act makes it morally bankrupt. again and again, facebook rejected its own research. last week ms. davis said, "we are looking at it" -- no specific plans, no commitments, only platitudes. these documents you have revealed provided this company with a blueprint, specific recommendations, that could have made facebook and instagram safe. the company repeatedly ignored those recommendations from its own researchers that would have made facebook and instagram safer. facebook researchers have suggested changing their recommendations to stop promoting accounts known to encourage dangerous body comparisons. instead of making meaningful changes, facebook pays lip service, and if they won't act,
and if big tech won't act, congress has to intervene. privacy protection is long overdue. senator markey and i have introduced the kids act, which would ban addictive tactics that facebook uses to exploit children. parents deserve better tools to protect children. i am a firm supporter of reforming section 230; we should consider narrowing its sweeping immunity when algorithms amplify illegal conduct, and perhaps you will expand on that. we have heard compelling recommendations about requiring disclosures of research and independent reviews of these platforms' algorithms, and i plan to pursue these ideas. the securities and exchange commission should investigate your contentions and claims, and
so should the federal trade commission. facebook appears to have misled the public and investors, and if that's correct, it ought to face real penalties as a result of that misleading and deceptive misrepresentation. i want to thank all my colleagues here today, because what we have is a bipartisan congressional roadmap for reform that will safeguard and protect children from big tech. that will be a focus of our subcommittee moving forward, and it will continue to be bipartisan. i will just end on this note -- in the past weeks and days, parents have contacted me with their stories, heartbreaking and spine-chilling stories about children pushed into eating disorders, bullying online, self
injury of the most disturbing kind, and sometimes even taking their lives because of social media. parents are holding facebook accountable because of your bravery, and we need to hold facebook and all of big tech accountable as well. again, my thanks to you. i'm going to enter into the record a letter from 52 state attorneys general and from members of the youth advisory board of san diego, if there is no objection. i will now turn to the ranking member. >> thank you for entering that letter from our state attorneys general into the record. good morning to everyone. it is nice to see people in this hearing room and to be here for the hearing today. we thank you for your appearance
before us today and for giving the opportunity not only for congress but for the american people to hear from you in this setting. mr. chairman, thanks to you and your staff, who have worked with our team to make certain that we have this hearing and this opportunity today, so that we can get more insight into what facebook is actually doing as they invade the privacy not only of adults but of children, and look at the ways that they are in violation of the children's online privacy protection act, which is federal law, and looking at how they are evading that law and working around it. as the chairman said, privacy and online privacy, passing a
federal privacy standard has been long in the works. i put my first one in 2012 and i think it will be this congress and this subcommittee that will lead the way to online privacy, data security, section 230 reforms and of course senator klobuchar always wants to talk about antitrust. and senator markey is down there. when we were in the house, we were probably the only ones who were talking about the need to have a federal privacy standard. as the chairman mentioned, last week we heard from ms. davis, who heads global safety for facebook. and it was surprising to us that what she tried to do was to minimize the information that
was in these documents, to minimize the research, and to minimize the knowledge that facebook had. at one point i even reminded her that the research was not third-party research -- the research was facebook's own research. they knew where the violations were, and they know they are guilty. they know this. the research tells them this. last week, in advance of our hearing, facebook released two studies and said that the wall street journal was all wrong, that it had just gotten it wrong, as if the wall street journal did not know how to read these documents and how to work
through this research. having seen the data that you've presented and the other studies that facebook did not publicly share, i feel pretty confident that it is facebook who has done the misrepresenting. here are some of the numbers that facebook chose not to share. mr. chairman, i think it is important to look at these as we talk about the setting for this hearing, what we learned last week, and what you and i have been learning over the past three years about tech and facebook. 66% of teen girls on instagram and 40% of teen boys experience negative social comparisons. this is facebook's research. 52% of teen girls who experience negative social comparison on instagram said it was caused by
images related to beauty. social comparison is worse on instagram because it is perceived as real life, but based on celebrity standards. social comparison mimics a grief cycle and a downward emotional spiral, encompassing a range of emotions. facebook calls it problematic use. here is what else we know -- facebook is not interested in making significant changes to improve kids' safety on their platforms, at least not when that would result in losing eyeballs on posts or decreasing their ad revenues. in fact, facebook is running
scared, as they know that, in their own words, younger adults are less active and less engaged on facebook and that they are running out of teens to add to instagram. so teens are looking at other platforms like tiktok, and facebook is only making the changes that add to its user numbers and ultimately its profits. follow the money. so what are these changes? allowing users to create multiple accounts that facebook does not delete, and encouraging teens to create second accounts they can hide from their parents. they are also studying younger and younger children, as young as eight, so they can market to them. ms. davis said kids below 13 are not allowed on facebook and
instagram, but we know that they are, because she told us they had recently deleted 600,000 accounts from kids under age 13. how do you get that many underage accounts unless you are turning a blind eye to them in the first place? and in order to clean it up, you had to delete them and say, by the way, we just in the last month deleted 600,000 underage accounts. and speaking of turning a blind eye, facebook turns a blind eye to user privacy. news broke yesterday that the private data of over 1.5 billion -- that's right, 1.5 billion -- facebook users is being sold on a hacking forum. that is its biggest data breach to date. examples like this underscore my strong concerns about
facebook collecting the data of kids and teens and what they are doing with it. facebook turns a blind eye toward blatant human exploitation taking place on its platform: trafficking, forced labor, cartels, the worst possible things one can imagine. big tech companies have gotten away with abusing consumers for too long. it is clear that facebook prioritizes profits over the well-being of children and all users. so as a mother and a grandmother, this is an issue that is of particular concern to me. so we thank you for being here today, ms. haugen, and we look forward to getting to the truth about what facebook is doing with users' data and how they are abusing their privacy and how they show a lack of respect
for the individuals who are on their network. we look forward to your testimony. chrm. blumenthal: thank you, senator blackburn. i don't know whether the ranking member would like to make a -- >> i will just take a moment or two, and i appreciate being able to speak as ranking member. this is a subcommittee hearing. you see some vacant seats. there are also a lot of things going on, so people will be coming and going. i am willing to predict this will have almost 100% attendance by members of the subcommittee because of the importance of the subject matter. thanks for coming forward to share concerns about facebook's business practices with respect to children and teens, and that
is the main topic of our hearing today, protecting kids online. the recent revelations about facebook's mental health effects on children and its plans to target younger audiences are indeed disturbing. i think you are going to see a lot of bipartisan concern about this today and in future hearings. they show how urgent it is for congress to act against powerful tech companies on behalf of children and the broader public. i say powerful tech companies, and they are possessive of immense power. the product is addictive, and people on both sides of the aisle are concerned about this.
i talked to an opinion maker just down the hall a few moments before this hearing, and this person said the tech gods have been demystified now. i think this hearing today, mr. chairman, is part of the process of demystifying big tech. the children of america are hooked on their product. it is often destructive and harmful, and there is knowledge on the part of the leadership of these big tech companies that this is true. ms. haugen, i hope you will have a chance to talk about your work at facebook and compare it to other social media companies. i look forward to hearing your thoughts on how this committee and this congress can ensure greater accountability and
transparency, especially with regard to children. thank you, mr. chairman, and thank you, ms. haugen, for being here. chrm. blumenthal: the witness today is frances haugen. she was the lead product manager on facebook's civic misinformation team. she holds a degree in electrical and computer engineering and an mba from harvard. she made the courageous decision, as all of us here and many around the world know, to leave facebook and reveal the terrible truth about the company that she learned during her tenure there. i think we are all in agreement here in expressing our gratitude and admiration for your bravery in coming forward. thank you, ms. haugen. please proceed.
ms. haugen: good afternoon, chairman blumenthal, ranking member blackburn, and members of the subcommittee. thank you for the opportunity to appear in front of you. my name is frances haugen. i used to work at facebook. i joined because i thought facebook had the potential to bring out the best in us. i'm here today because i believe facebook's products harm children, stoke division, and weaken our democracy. the company's leadership knows how to make facebook and instagram safer but won't make the necessary changes because they have put their profits before people. congressional action is needed. they won't solve this crisis without your help. yesterday we saw facebook get taken off the internet. i don't know why it went down, but for more than five hours facebook wasn't used to deepen divides, destabilize democracies, and make young people feel bad
about their bodies. it also means millions of small businesses weren't able to reach potential customers, and countless photos of new babies weren't celebrated by friends and family around the world. i believe in the potential of facebook. we can have social media we enjoy, that connects us, without tearing apart our democracy, putting our children in danger, and sowing ethnic violence around the world. i have worked at large companies since 2006. my job has largely focused on algorithmic products and recommendation systems like the ones that power the facebook news feed. having worked on four different types of social networks, i understand how complex and nuanced these problems are. however, the choices being made inside of facebook are disastrous for our children, for
our public safety, for our privacy, and for our democracy, and that is why we must demand facebook make changes. during my time at facebook, first working as the lead product manager for civic misinformation and later on counterespionage, i saw facebook repeatedly encounter conflicts between its own profits and our safety. facebook consistently resolved these conflicts in favor of its own profits. the result has been more division, more harm, more lies, more threats, and more combat. in some cases, this dangerous online talk has led to actual violence that harms and even kills people. this is not simply a matter of certain social media users being angry or unstable, or about one side being radicalized against the other. it is about facebook choosing to grow at all costs, becoming an almost trillion-dollar
company by buying its profits with our safety. almost no one outside of facebook knows what happens inside of facebook. the company intentionally hides vital information from the public, the u.s. government, and governments around the world. the documents i provided prove that facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme ideas. the severity of this crisis demands that we break out of our previous regulatory frames. facebook wants to trick you into thinking that privacy protections or changes to section 230 alone will be sufficient. while important, they will not get to the core of the issue, which is that no one truly understands the destructive
traits of facebook except for facebook. we can afford nothing less than full transparency. as long as facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. until the incentives change, facebook will not change. left alone, facebook will continue to make choices that go against the common good, our common good. when we realized big tobacco was hiding the harms it caused, the government took action. when we figured out cars were safer with seatbelts, the government took action. and when we learned opioids were taking lives, the government took action. i implore you to do the same. facebook shapes our perception of the world by choosing the information we see. even those who don't use facebook are impacted by the majority who do. a company with such frightening influence over so many people, over their deepest thoughts, feelings, and
behaviors, needs real oversight. but facebook's closed design means it has no real oversight; only facebook knows how it personalizes your feed for you. at companies like google, any independent researcher can download from the internet the company's search results and write papers about what they find, and they do. but facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system. facebook will tell you privacy means they can't give you data. this is not true. when tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm that, in fact, they posed a greater threat to human health. the public cannot do the same with facebook. we are given no other option than to take their marketing messages on blind faith.
not only does the company hide most of its own data, my disclosures have proved that when facebook is directly asked questions as important as, how do you impact the health and safety of our children, they mislead and misdirect. facebook has not earned our blind faith. this inability to see into its systems and confirm how they work is like the department of transportation regulating cars by only watching them drive down the highway. today, no regulator has solutions for how to fix facebook, because facebook doesn't want us to know about what is causing the problems. otherwise there wouldn't have been a need for a whistleblower. how is the public supposed to assess if facebook is fixing the problems if the public has no
visibility into how they operate? this must change. facebook wants you to believe the problems we are talking about are unsolvable. they want you to believe in false choices. they want you to believe that you must choose between a facebook full of divisive and extreme content or losing one of our most important values, free speech -- that to be able to share photos of your kids with old friends, you must also be inundated with anger-driven virality. i am here today to tell you that is not true. these problems are solvable. a safer, free-speech-respecting, more enjoyable social media is possible, but there is one thing i hope everyone takes away from
these disclosures -- it is that facebook can change but is clearly not going to do so on its own. my fear is that without action, the divisive and extremist behaviors we see today are only the beginning. what we saw in myanmar and now in ethiopia are the opening chapters of a story so terrifying no one wants to read the end of it. congress can change the rules that facebook plays by and stop the many harms it is now causing. we now know the truth about facebook's destructive impact. i came forward at great personal risk because i believe we still have time to act, but we must act now. i am asking you to act. thank you. chrm. blumenthal: thank you for taking that personal risk, and
we will do anything and everything to protect you and stop any retaliation against you and any legal action the company or anyone else may bring. we have made that very clear in the course of these proceedings. i want to ask you about this idea of disclosure you've talked about, looking in effect at a car going down the road, and we will have five-minute rounds of questions and maybe a second round if you are willing to do it. we are here today to look under the hood. that is what we need to do more. in august, senator blackburn and i wrote to mark zuckerberg and asked him straightforward questions about how the company works for our children and teenagers on instagram. facebook dodged, sidetracked, and in
effect misled us. so i am going to ask you a few straightforward questions to break down some of what you have said, and if you can answer them yes or no, that would be great. has facebook's own research ever found that its platforms can have a negative effect on children and teens' mental health or well-being? ms. haugen: many of facebook's internal research reports indicate that facebook causes serious negative harm to a significant portion of teenagers and younger children. chrm. blumenthal: has facebook ever offered features that it knew had a negative effect on children and teens' mental health? ms. haugen: facebook knows that its amplification algorithms can lead children from very innocuous topics like healthy
recipes all the way to anorexia-promoting content over a very short period of time. chrm. blumenthal: has facebook ever found in its research that kids show signs of addiction on instagram? ms. haugen: facebook has studied a pattern they call problematic use, what we might call addiction. it has a very high bar for what it believes problematic use is: you must self-identify that you don't have control over your usage and that it is harming you, your schoolwork, or your physical health. 5% to 6% of 14-year-olds had the self-awareness to admit to both of those questions. it is likely that far more than 5% to 6% of 14-year-olds are addicted to instagram. chrm. blumenthal: last thursday, my colleagues and i asked ms. davis, who was representing
facebook, about how the decision was made whether to permanently pause instagram for kids, and she said, "there's no one person who makes a decision like that. we think about it collaboratively." it's as though she couldn't mention mark zuckerberg's name. isn't he the one who will be making this decision, from your experience in the company? ms. haugen: mark holds a very unique role in the tech industry in that he holds over 55% of all the voting shares for facebook. there are no similarly powerful companies that are as unilaterally controlled. in the end, the buck stops with mark. no one is holding him accountable. chrm. blumenthal: and he is the algorithm designer in chief, correct? ms. haugen: i received an mba
from harvard, and they said we are responsible for the organizations we build. mark built one that is intended to be flat; the metrics make the decision. that itself is a decision, and in the end, if he is the ceo and chairman of facebook, he is responsible for those decisions. chrm. blumenthal: the buck stops with him? ms. haugen: the buck stops with him. chrm. blumenthal: speaking of the buck stopping, you have said that facebook should declare moral bankruptcy. i agree. its actions and its failure to acknowledge its responsibility indicate moral bankruptcy. ms. haugen: there is a cycle inside the company where facebook has struggled for a long time to recruit and retain the number of employees it needs to tackle the large scope of projects it has chosen to take on. facebook is in a cycle where it
struggles to hire, and that causes it to understaff projects, which causes scandals, which makes it harder to hire. part of why facebook needs to come out and say, we did something wrong and made choices we regret, is that the only way we can move forward and heal facebook is by first admitting the truth. that way we can have reconciliation and move forward, by being honest first and declaring moral bankruptcy. chrm. blumenthal: being honest and acknowledging that facebook has caused more pain simply to make more money, and it has profited off spreading disinformation and misinformation and sowing hate. facebook's answer to facebook's destructive impact always seems to be more facebook. we need more facebook, which means more pain and more money for facebook. would you agree? ms. haugen: i don't think at any
point facebook set out to make a destructive platform. i think the challenge is that facebook is an organization where the parts of the organization responsible for growing and expanding the organization are separate from and not in dialogue with the parts that focus on the harms the company is causing. integrity actions, projects that were hard-fought by the teams trying to keep us safe, are undone by new growth projects that counteract those same remedies. i do think there are organizational problems that need oversight, and facebook needs help in order to move forward to a more healthy place. chrm. blumenthal: and whether it is teens led into suicidal thoughts or the genocide of ethnic minorities in myanmar or fanning the flames of division in our own
country or in europe, facebook is ultimately responsible for the immorality of the pain it has caused. ms. haugen: facebook needs to take responsibility for the consequences of its choices and be willing to accept small trade-offs in profit. i think just that act of being able to admit that it is a mixed bag is important. what we saw last week is an example of the kind of behavior we need to support facebook growing out of; instead of just focusing on the good they do, they should admit responsibility. chrm. blumenthal: mark zuckerberg's new policy is no apologies, no admissions, no acknowledgment, nothing to see here, we are going to deflect it and go sailing. i turn to the ranking member. sen. blackburn: thank you for your testimony. i had asked ms. davis about
underage users, and she made the comment, if we find an account of someone who is under 13, we remove them; in the last three months we removed 600,000 accounts of under-13-year-olds. it seems to me there is a problem if you have 600,000 accounts from children who ought not to be there in the first place. what did mark zuckerberg know about the plans to bring kids on as new users and advertise to them? ms. haugen: there are reports within facebook that show analyses where they looked at who joined facebook and instagram, and based on those analyses, facebook likes to say
children lie about their ages to get onto the platform. the reality is, enough kids tell the truth that you can work backwards to figure out approximately the real ages of anyone on the platform. when facebook does cohort analyses retrospectively, it discovers things like 10% to 15% of 10-year-olds may be on facebook or instagram. sen. blackburn: this is why, when it was said to the ceo of instagram, i have been on instagram since i was eight, he replied that he didn't want to know that. it would be for this reason, correct? ms. haugen: a pattern of behavior that i saw at facebook was that often problems were so understaffed that there was an implicit discouragement from having better detection systems.
my last team was on the counterespionage team. at any given time, our team could only handle a third of the cases that we knew about. we knew that if we built even a basic detector, we would have many more cases. sen. blackburn: let me ask you this -- so you look at the way they have the data, but they are choosing to keep that data and advertise from it, sell it to third parties. so what does facebook do? you have 600,000 accounts that ought not to be on there -- ms. haugen: many more. sen. blackburn: right, you delete the accounts, but what happens to the data? do they keep it until those children turn 13, because you are saying they can work
backward and figure out the true age of a user? so what do they do with it? do they delete it, do they store it, do they keep it? ms. haugen: my understanding of facebook's policies, and i did not work directly on that, is that when they delete an account, they delete all of the data. with regard to children under age on the platform, facebook could do substantially more to detect those children and should have to disclose those processes to congress, because there are lots of subtleties in those processes. sen. blackburn: staying with underage kids and online privacy, i want you to tell me how facebook is able to do market research on these
children who are under age, because ms. davis didn't deny this last week. do they bring kids into focus groups with their parents? how do they get that information? she said they got permission from parents. is there a permission slip or a form that gets signed, and then how do they know which kids to target? ms. haugen: there is a bunch to unpack there. most tech companies have systems where they can analyze the data that is on their servers, and most of the focus groups i read about or saw were around messenger kids, which has children on it.
those focus groups appear to be children interacting in person. often companies use sourcing agencies that will go and identify people who meet certain demographic criteria, or they will reach out directly based on data on the platform. so in the case of messenger kids, maybe you would want to study a child that was active and one that was less active. sen. blackburn: so these are children under age 13? ms. haugen: yes. sen. blackburn: and they know it. ms. haugen: for some of them, and i assume they get permission. sen. blackburn: we are still waiting to get a copy of that parental consent form that would involve children. my time has expired. i will save my other questions for a second round. chrm. blumenthal: thank you. senator klobuchar? sen. klobuchar: thank you for shining a light on how facebook
time and time again has put profit over people. when their own research found that more than 13% of teen girls say that instagram made their thoughts of suicide worse, they proposed instagram for kids, which has now been put on pause. when they found out their algorithms are fostering polarization, misinformation, and hate, and that they allowed 99% of their violent content to remain unchecked on their platform, including in the lead-up to the january 6 insurrection, what did they do? now, as we know, mark zuckerberg is going sailing and saying no apologies. i think the time has come for action, and i think you are the catalyst for that action. you have said privacy legislation is not enough. i completely agree with you. i think you know we have not done anything to update privacy laws, federal privacy laws,
nothing, zilch, in any major way. why? because there are lobbyists all around this building hired by the tech companies. facebook and the other tech companies are throwing a bunch of money around this town and people are listening to them. we have done nothing significant, but we are working on a bipartisan basis in the antitrust subcommittee to get something done on consolidation, to understand what allows the dominant platforms to control all of this, to, like the bullies in the neighborhood, buy out the companies that compete with them. i asked the safety head of facebook, who testified last week, how they estimate the lifetime value of a user for kids who
start using their products before age 13. she evaded the question and said that is not the way we think about it. is that right, or is it your experience that facebook estimates and puts a value on how much money it gets from users in general? is that a motivating force for them? ms. haugen: based on what i saw in terms of the allocation of integrity funding, one of the things exposed was that 87% of all misinformation spending is on english, even though only about 9% of users are english speakers. it seems that facebook invests more in users who make them more money, even though the danger may not be evenly distributed. sen. klobuchar: does it make sense that getting a younger person hooked on social media at a young age makes them more profitable over the long term? ms. haugen: facebook's internal documents talk about the importance of getting younger
users, tweens, on instagram, like instagram kids, because they know children bring their parents online and things like that, and they understand the value of younger users for the long-term success of the platform. sen. klobuchar: facebook estimated it was making $51.58 per user last quarter, and when i asked how much of that was from users under 18, she wouldn't answer. do you think teens are profitable? ms. haugen: i would assume so, based on advertising for television. i am sure they are some of the more profitable users, but i did not work directly on that. sen. klobuchar: another issue has been eating disorders. studies have found that eating disorders have the highest mortality rate of any mental illness for women. i led a bill on this that we
passed into law, and i am concerned that the algorithms they have push outrageous content, promoting anorexia and the like. i know it is personal to you. do you think the algorithms push some of this content to young girls? ms. haugen: facebook knows that engagement-based ranking amplifies preferences, and they have done something called a proactive incident response, where they take things they have heard, for example, can you be led by the algorithms to anorexia content, and they confirmed that yes, this happens to people. facebook knows they are leading young users to anorexia content. sen. klobuchar: do you think they are designing their products to be addictive? ms. haugen: facebook has a long
history of having a successful growth division, where they take tiny tweaks and constantly try to optimize growth. those kinds of tweaks could be construed as things that facilitate addiction. sen. klobuchar: we have seen the same kind of content in the political world. you brought up other countries and what has been happening there. on 60 minutes, you said facebook implemented safeguards to reduce misinformation ahead of the 2020 election but turned off those safeguards right after the election. you know that the insurrection happened january 6. do you think facebook turned off the safeguards because they were costing money, because they were reducing profits? ms. haugen: facebook has been emphasizing a false choice. they said the safeguards in place before the election implicated free speech. the choices that were happening on the platform were really about how reactive the platform was. in the run-up to the election, they knew those defaults were dangerous. because they wanted that growth back after the election, they returned to the original defaults. the fact that they had to break the glass and turn the safeguards back on, i think that is deeply problematic. sen. klobuchar: i agree. thank you for your bravery in coming forward. chrm. blumenthal: -- sen. thune: i have been arguing for some time that it is time for congress to act. i think the question is, what is the right way to do it, consistent with our right to free speech. this committee doesn't have jurisdiction over the antitrust issue, but i'm not averse to looking into the monopolistic nature of facebook. that is a real issue that needs
to be examined and addressed. there are a couple of things i think we can do. i have a piece of legislation with cosponsors called the filter bubble transparency act. it would give users the option to engage with social media platforms without being manipulated by the secret formulas that essentially dictate the content you see when you open up an app or log onto a website. we also need to hold big tech accountable by reforming section 230. one of the best opportunities to do that in a bipartisan way is the platform accountability and consumer transparency act. that bill, in addition to stripping section 230 protections for content that is illegal, would also increase transparency and due process for users.
importantly, in the context we are talking about today, a hearing with a big tech whistleblower, the pact act would explore the viability of a program for big tech employees to blow the whistle on wrongdoing inside the companies where they work. we should encourage employees in the tech sector, like you, to speak up about questionable practices at big tech companies, so we can ensure americans are aware of how social media platforms are using artificial intelligence to keep them hooked on the platforms. let me just ask you about the information you provided that facebook conducts what is called engagement-based ranking, which you have described as very dangerous. could you talk more about why engagement-based ranking is dangerous, and do you think congress should seek to pass legislation like the filter bubble transparency act that would give users the ability to avoid engagement-based ranking
altogether? ms. haugen: facebook is going to say, you don't want to give up engagement-based ranking. you will not like facebook as much if we are not picking out the content for you. that is just not true. facebook likes to present things as false choices. let's say we ordered by time, like imessage or other forms of social media that are chronologically based. they will say you will get spammed, you will not enjoy your feed. the reality is those experiences have lots of permutations. there are ways where computers don't regulate what we see; we socially regulate what we see. they don't want us to have that conversation. facebook knows that when they pick out the content we focus on, we spend more time on their platform, and they make more money. the dangers of engagement-based
ranking are that facebook knows content that elicits an extreme reaction is more likely to get a click, comment, or re-share. those clicks, comments, and re-shares aren't even necessarily for your benefit. they know other people will produce more content if they get likes, comments, and re-shares. they prioritize that content so you will give little hits of dopamine to your friends, so your friends will create more content. they have run experiments on people where they have confirmed this. sen. thune: the information you provided to the wall street journal said facebook altered its algorithm to boost these msi, meaningful social interactions, and that the algorithm rewarded outrage and sensationalism. facebook would say its algorithms are used to connect
individuals with friends and family in ways that are largely positive. do you believe that facebook's algorithms make its platform a better place for most users? should consumers have the option to use facebook and instagram without being manipulated by algorithms designed to keep them engaged? ms. haugen: i strongly believe -- i have spent most of my career working on engagement-based ranking. facebook says, we can do engagement-based ranking safely because we have ai; the artificial intelligence will find the bad content that we know our engagement-based ranking is promoting. they know it is dangerous, but they say the ai will save us. facebook's own research says they cannot adequately identify dangerous content. as a result, those dangerous algorithms they admit are picking up the extreme sentiment, the division. they cannot protect us from the
harm they know exists in their own system. should people have the option of choosing not to be manipulated? i think if we had appropriate oversight, or if we reformed 230 to make facebook responsible for its intentional ranking decisions, i think they would get rid of engagement-based ranking. it is causing teenagers to be exposed to more anorexia content. it is pulling families apart. in places like ethiopia, it is fanning ethnic violence. i encourage reforming these platforms, not picking and choosing individual ideas, but making the platforms safer, less twitchy, less viral. that is how we solve these problems. sen. thune: i would simply say, let's get to work here. chrm. blumenthal: i agree.
sen. schatz: thank you for your courage in coming forward. was there a moment when you came to the conclusion that reform from the inside was impossible and you decided to be a whistleblower? ms. haugen: there was a long series of moments where i became aware that facebook consistently chose to prioritize its profits over public safety. the moment i realized we needed to get help from the outside, that the only way these problems would be solved is by solving them together and not solving them alone, was when civic integrity was dissolved following the election. it felt like a betrayal of the promises facebook had made to people who had sacrificed a great deal to keep the election safe, to dissolve our community and integrate it into other parts of the company. sen. schatz: i know the response
was that they distributed the duty. that is an excuse, right? ms. haugen: i cannot see into the hearts of other men. sen. schatz: let me put it this way: did that work? ms. haugen: the people i worked with, maybe 75% of my pod of product managers, most of them came from civic integrity. all of us left the inauthentic behavior pod for other parts of the company, or left the company entirely, over the same period. within six months, we had clearly lost faith that those changes were coming. sen. schatz: you said in your opening statement that they know how to make facebook and instagram safer. say you are now the chief executive officer and chairman of the company. what changes would you immediately institute? ms. haugen: i would establish a policy of how to share information and research from inside the company with appropriate oversight bodies. i would give proposed legislation to congress saying, here is what appropriate oversight would look like. i would actively engage with academics to make sure that the people confirming whether facebook's marketing messages are true have the information they need. i would immediately implement the soft interventions that were identified to protect the 2020 election, things like requiring someone to click on a link before re-sharing it. companies like twitter have found that significantly reduces misinformation. sen. schatz: i want to pivot back to instagram's targeting of kids. we all know they announced a pause.
that reminds me of what they announced when they were going to issue a digital currency. they got beaten back by the u.s. senate banking committee and said never mind. now they are coming back around hoping nobody notices, and they are going to try to issue a currency. for the moment, the business model appears to be gobble up everything, do everything; that is the strategy. do you believe they are actually going to discontinue instagram kids, or are they just waiting for the dust to settle? ms. haugen: i would be sincerely surprised if they do not continue working on instagram kids. i would be amazed if a year from now we don't have this conversation again. sen. schatz: why? ms. haugen: facebook understands that if they want to continue to grow, they have to find new users. they have to make sure that the next generation is just as engaged with instagram as the
current one, and the way they will do that is by making sure children establish habits before they have good self-regulation. sen. schatz: by hooking kids? ms. haugen: by hooking kids. i would like to emphasize that one of the documents we sent in examined the rates of problematic use by age, and that peaked with 14-year-olds. it is just a fact that teenagers do not have good self-regulation. they say, i feel bad when i use instagram and yet i cannot stop. we need to protect the kids. sen. schatz: i have a long list of misdirections and outright lies from the company. i don't have time to read them, and you are as intimate with all of these deceptions as i am, so i will jump to the end. if you were a member of this panel, would you believe what facebook is saying? ms. haugen: facebook has not
earned the right to have our blind trust. one of the most beautiful things i heard from this committee last week was, trust is earned, and facebook has not earned our trust. sen. schatz: thank you. chrm. blumenthal: thank you. senator moran is next, and then we have been joined by the chair. we will break at about 11:30, if that is ok, because we have a vote, and then we will reconvene. >> as senator thune said, let's go to work. >> our differences are very minor in the face of the revelations we have now seen. i hope we can move forward. >> i share that view. thank you.
sen. moran: thank you for your testimony. what examples do you know of? we have talked about children, teenage girls specifically. what other examples do you know about where facebook or instagram knew its decisions would be harmful to its users but still proceeded with the plan and executed those harmful behaviors? ms. haugen: facebook's research is aware that -- they know severe harm is happening to children. in the case of bullying, facebook knows that instagram dramatically changes the experience of high school. when i was in high school, most kids could go home and
reset for 16 hours. for kids who are bullied on instagram, the bullying follows them home into their bedroom. the last thing they see before they go to bed is someone being cruel to them, or the first thing they see in the morning is someone being cruel to them. kids are learning that their own friends are cruel to them. think about how this could impact their domestic relationships. facebook knows that parents, because they did not experience these things themselves, give their children bad advice. facebook's own research is aware that children are struggling with these things because they can't even get support from their own parents. i don't understand how facebook
could know all of these things and not escalate it for help and support with all of these problems. sen. moran: are there other practices that are known to be harmful? ms. haugen: facebook is aware that the choices it made with meaningful social interactions, an engagement-based ranking that did not care if you bullied someone, that still counted as meaningful. they know that change directly changed publishers' behavior. companies wrote in and said, the content that is most successful on our platform is the most negative; you have a problem with your ranking. and they did nothing. politicians are forced to take extreme positions, because those are the ones that get distributed on facebook.
that is a huge negative impact. they have admitted in public that engagement-based ranking is dangerous without integrity and security systems, yet they have not rolled out those systems for most of the languages in the world. that is causing things like ethnic violence in ethiopia. sen. moran: what is the magnitude of facebook's revenues or profits that come from the sale of user data? ms. haugen: i haven't worked on that; i am not aware. sen. moran: what regulations or legal actions do you think would have the most consequence or be feared most by facebook, instagram, or aligned companies? ms. haugen: i encourage reforming section 230 to exempt decisions about algorithms. modifying 230 around content is very complicated, because user-generated content is something companies have less control over. they have 100% control over their algorithms. facebook should not get a free pass on the choices it makes to prioritize growth, virality, and reactiveness over public safety. they should not get a free pass, because they are paying for their profits right now with our safety. i also believe there needs to be a dedicated oversight body. right now, the only people in the world trained to analyze these experiments are people who grew up inside of facebook or other social media companies. there needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this, and have a place to work on things like regulation, to bring that knowledge out to oversight bodies that have the right to do oversight. sen. moran: a regulatory agency
within the federal government? ms. haugen: yes. chrm. blumenthal: senator cantwell. sen. cantwell: thank you for holding this hearing. i want to continue on that point. the privacy act that i introduced along with several of my colleagues actually does have ftc oversight of algorithm transparency in some instances. i hope you look at that and tell us what other areas you think we should add to that level of transparency. clearly, that is the issue at hand here. thank you for your willingness to come forward. the documentation that you say exists gives a level of transparency about what is going on that people have not been able to see. your information says it went up to the highest levels at facebook, that they purposely knew their algorithms were continuing to
have misinformation and hate information, and that when presented with information about this terminology, downstream msi, meaningful social interactions, they knew it was a choice: they could stop this wrong information, this hate information about the rohingya, or they could continue to get higher click-through rates. i know you know that on a page, if you click through to the next page, i'm pretty sure there's a lot more engagement than if you didn't click through. you're saying the documents exist, at the highest level at facebook, with information discussing these choices, and that people chose, even though they knew it was misinformation, to continue to choose profit. ms. haugen: we submitted documents to congress. mark zuckerberg was presented
with these options. a hard intervention is something like taking a piece of content or a user off facebook. soft interventions are about making different choices to make the platform less viral, less twitchy. mark was presented with these options and chose to not remove downstream msi in april 2020, even in isolated at-risk countries, if it had any impact on the overall msi metric. sen. cantwell: which means, in turn, less money? is there another reason why you would do it, other than that it really would affect your numbers? ms. haugen: i don't know for certain. we struggled when reading these minutes: how is this possible? we had just read 100 pages on how downstream msi expands hate speech and graphic violent
content. why wouldn't you get rid of this? the best theory we came up with, and this was just our interpretation, is that people stay or leave the company based on what they get paid, and if you hurt msi, a bunch of people were not going to get their bonuses. sen. cantwell: i'm personally very frustrated, because we presented information to facebook from one of my own constituents in 2018 talking about this issue with the rohingya, pleading with the company. we pleaded with the company, and they continued to not address this issue. now you are pointing out that these same algorithms are still being used, and they know darn well that it is causing and inciting violence. they are, today, choosing profit over this information. ms. haugen: in the summer of
last year, they turned off downstream msi only for civic and health content, but facebook's own algorithms are bad at finding that content; they miss 80% or 90% of even that sensitive content. and in countries where they don't have integrity systems, in the case of ethiopia, there are 100 million people and six languages, and facebook only supports two of those languages. this strategy of focusing on language-specific systems to save us is doomed to fail. sen. cantwell: i'm sending a letter to facebook. they better not delete any information about how they proceeded in light of your information. aren't we also now talking about advertising fraud? aren't you selling something to advertisers that is not really what they are getting?
we are trying to say journalism basically has to have a different standard, a public interest standard, while these guys are a social media platform that doesn't have to live with that. the consequence, we see it: people are coming back to local journalism, advertisers saying, we want to be with a trusted brand. i think your finding is an interesting one, and we have to look at the other issues. do they defraud advertisers, telling them one thing about where their ads will appear when in reality it was something different, based on a model? ms. haugen: we have multiple examples of questions and answers for the sales staff, where advertisers asked, should we come back to facebook after the insurrection? facebook said in the talking
points that they gave to advertisers, we are doing everything in our power. sen. cantwell: that was not true. thank you, mr. chairman. chrm. blumenthal: if you want to make your letter available, i would be glad to join you on it myself. i thank you for suggesting it. senator lee. sen. lee: thank you for joining us this week. it is very helpful. last week we had another witness from facebook come testify before this committee. she focused on the extent to which facebook targeted ads toward children, including ads that were sexually suggestive or geared
toward adult-themed products. while i appreciated her willingness to appear, i did not get the clearest answers in response to some of those questions, so i'm hoping you could shed some light on some of those issues related to facebook's advertising process. i want to point you to a quote that i got from ms. davis last week. here's what she said: when we do ads to young people, there are only three things an advertiser can target around: age, gender, location. we prohibit certain ads to young people, including weight loss ads. we don't allow tobacco ads at all. we don't allow them to children. since that exchange happened last week, there are a number of individuals, including a group
called the tech transparency project, that have indicated that the testimony was inaccurate. they noted that it conducted an experiment just last month. the goal was to run a series of ads targeted to children ages 13-17, to users in the united states. i want to emphasize that ttp did not wind up running these ads; they stopped them from being distributed to the users. but facebook approved them, and approved them for an audience of up to 9.1 million users. this is the first one i wanted to showcase. it is a colorful graphic encouraging kids to throw a skittles party like no other, which, as the graphic indicates and as the slang independently suggests, involves getting together randomly to abuse prescription drugs. the second graphic displays content specifically designed to encourage and promote anorexia, and the ad promotes it. the third one invites children to find a partner online: you look lonely, find your partner now to make a love connection. this would be an entirely different kettle of fish if it
were targeted to an adult audience, but it is targeted to 13-to-17-year-olds. particularly when targeted to children -- and just to be clear, ttp did not end up pushing the ads out -- this says something. one could argue it proves facebook is allowing, and perhaps facilitating, the targeting of these ads to our nation's children. could you please explain to me how these ads, with this target audience, could possibly be approved by facebook? is ai involved in that?
ms. haugen: i do not work directly on the ad approval system. facebook has a deep focus on scale, which is partly why they rely on ai so much. it is possible that none of those ads were seen by a human. we know from repeated documents within my disclosures that facebook's ai systems only catch a minority of offending content; at most they will ever catch 10%-20%. in the case of children, even for drug paraphernalia ads, it is likely that if they rely on computers and not humans, they will never catch more than 10%-20%. sen. lee: i have one follow-up
question. facebook claims it only targets ads to teens based on age, gender, and location, even though these examples seem to contradict that. let's set that aside for a minute and accept that they are not targeting ads based on specific interest categories. do they still collect interest data on teenagers, even if they are not at that moment targeting ads at teens based on those categories? ms. haugen: it is important to differentiate between what targeting advertisers specify and what targeting facebook may learn for an ad. facebook's systems will likely extract out features; partying is a concept they could learn. i am very suspicious of the claim that personalized ads are not being delivered to teens, because the
algorithm learns correlations. they learn interactions, so where your party ad performs well, facebook has a ranking model in the background that says, this person wants more party-related content. sen. lee: thank you, that is very helpful. what that suggests to me is that while they are saying they are not targeting teens with those ads, the algorithm might do some of that work for them, which might be why they collect the data. ms. haugen: i cannot say whether or not that is the intention, but it is difficult to understand these algorithms today, and over and over we have seen these biases unintentionally learned. sen. lee: thank you. chrm. blumenthal: senator
markey. sen. markey: our nation owes you a huge debt of gratitude for the courage you are showing, so i thank you. would you say facebook actively seeks to market to children and teens on its platforms? ms. haugen: facebook definitely markets to children under 18 on instagram. sen. markey: why do they care about tweens? according to an internal memo, they are a valuable but untapped audience. facebook only cares about
children to the extent that they are of monetary value. facebook's global head of safety, ms. davis, told me that facebook does not allow the targeting of certain harmful content to teens. she stated, we don't allow weight loss ads to be shown to people under the age of 18. yet a recent study found weight loss ads being promoted to children as young as 13, including one featuring a young woman's waist. based on your time at facebook, do you think facebook is telling the truth? ms. haugen: they are focused on scale over safety, and it is likely they are using artificial intelligence to identify harmful ads, without allowing oversight into the actual effectiveness of those safety systems.
sen. markey: did you raise this issue with your supervisors? ms. haugen: i did not work on anything involving teen mental health. sen. markey: ms. davis also testified last week that we don't allow tobacco ads at all, we don't allow alcohol ads to minors. researchers have found that facebook does allow such targeting of teens. do you think facebook is telling the truth? ms. haugen: i do not have context on that issue, but if they are using artificial intelligence to catch those ads, unquestionably ads are making their way through. sen. markey: from my perspective, listening to you and your courageous revelations, time and time again facebook says one thing and does another.
facebook fails to abide by the commitments it has made, and time and time again facebook lies about what they are doing. yesterday, facebook had a platform outage, but for years it has had a principles outage. facebook is just like big tobacco, enticing young kids with that first cigarette -- that first social media account, designed to hook kids as users for life. facebook uses harmful features that quantify popularity, push manipulative marketing, and amplify harmful content to teens. yet facebook would not even commit to making changes:
giving young teens under the age of 16 and their parents control of their information, or banning targeted ads to children. ms. haugen: i support all of those actions. sen. markey: finally, we have introduced the algorithmic justice and online platform transparency act, which would, one, open the hood on facebook and big tech's algorithms so we know how facebook is using our data to decide what content we see, and two, ban discriminatory algorithms, like those showing housing ads to white people but not black people in our country. should congress pass that bill? ms. haugen: algorithmic bias issues are a major issue for our democracy. i became very aware of these challenges during my time there. like i mentioned before, it is difficult to understand how these algorithms actually act and perform. facebook is aware of complaints today that its systems do not give african-americans the same distribution as white people. until we have transparency, and the ability to confirm for ourselves whether facebook's marketing messages are true, we will not have a system that is compatible with democracy. sen. markey: i thank senator lee. i agree with him, and i am writing to facebook asking them to explain
the discrepancy. i think facebook is lying about targeting 13-to-15-year-olds. my message for mark zuckerberg is this: your time of invading our privacy, promoting toxic content, and preying on children and teens is over. congress will be taking action. you can work with us or not work with us, but we will not allow your company to harm our children, our families, and our democracy. thank you, we will act. chrm. blumenthal: thank you. we will turn to senator blackburn, and then we will take a break. i know there is some interest in another round of questions. then we will turn to senator lujan.
sen. blackburn: mr. chairman, i have to go sit in the chair starting at noon, so i have just one question. this relates to what senator markey was asking. does facebook ever employ child psychologists or mental health professionals to deal with these children online issues that we are discussing? ms. haugen: facebook has many researchers with phds. i assume some of them -- i know some of them have psychology degrees, though i am not sure if they are child specialists. facebook also works with external agencies that specialize in protecting children's rights online. chrm. blumenthal: senator lujan. after those questions we will take a break and come back at noon.
sen. lujan: during last week's hearing, the company contested whether its own internal research even exists. yes or no: does facebook have internal research indicating that instagram harms teens, particularly harming perceptions of body image, which disproportionately affects young women? ms. haugen: facebook has extensive research on this. sen. lujan: thank you for confirming these reports. will facebook make the data set, minus any personal information, available to this committee? do you believe it is important for transparency and safety that facebook release the internal
research and the core data set to allow independent analysis? ms. haugen: i believe it is important for our democracy that these documents be disclosed to the public on a regular basis, and we have to have privacy-sensitive data sets that allow independent researchers to confirm whether or not facebook's marketing messages are true. sen. lujan: should facebook make its internal primary research available, not just secondary slide data but the underlying data, by default? could this be done in a way that respects user privacy? ms. haugen: i believe that in collaboration with academics and other researchers, we could develop ways of exposing radically more data than is available today. it is important to understand how facebook shapes the information we get to see. sen. lujan: is facebook capable of making the right decisions on its own, or is regulation needed
to create real change? ms. haugen: until incentives change, we cannot expect facebook to change. we need action from congress. sen. lujan: i asked facebook's witness about shadow profiles, and she said no data is ever collected on children under 13 because they are not allowed to make accounts. this ignores the issue. instead of seeing this as a problem to be solved, facebook views it as a business opportunity. does facebook conduct research on children under 13, examining the business opportunities they represent for facebook's profit? ms. haugen: facebook has to publish this research. children are on the platform in far greater numbers than anyone is aware. i am aware facebook is doing research on children under the age of 13. those studies are included in the documents. sen. lujan: you have shared
concerns about how senior management at facebook has prioritized growth over user harm and safety. i have a few questions on facebook's decision-making. last week i asked facebook's witness whether facebook had ever found a change to its platform that could potentially inflict harm on users, yet moved forward because the change would grow users or increase revenue. her answer was that it had not been her experience at all at facebook, that that is just not how they would approach it. has facebook ever found a feature on its platform that harmed its users, but the feature moved forward because it would grow users or increase revenue? ms. haugen: facebook likes to paint these issues as complicated. there are a lot of simple issues, like requiring someone to click through a link before they re-share it. in some countries
re-shares make up 35% of all the content people see. sen. lujan: did these decisions ever come from mark zuckerberg directly or from other management at facebook? ms. haugen: we have a few choice documents in which mark zuckerberg chose metrics defined by facebook, like meaningful social interactions. sen. lujan: this is the reference you shared earlier with senator cantwell, about the 2020 election. ms. haugen: the soft interventions. sen. lujan: facebook was able to count on the silence of its workforce for a long time. facebook moderators have called out a culture of fear and secrecy that prevented them from
speaking out. what is the culture at facebook around whistleblowing and accountability? ms. haugen: facebook has a culture that emphasizes insularity as the path forward, that if information is shared with the public, it will just be misunderstood. i believe that relationship has to change. the only way we will solve these problems is by solving them together. we will have more democratic solutions if we do it collaboratively. sen. lujan: is there a senior-level executive, like an inspector general, who is responsible for ensuring that complaints from employees are taken seriously and that employees' legal, ethical, and moral concerns receive consideration, with a real possibility of instigating change to company policies? ms. haugen: i'm not aware of that role. the company is large. sen. lujan: it is my understanding there is a gentleman by the name of roy
austin, who is the vice president of civil rights and who has described himself as an inspector general. he does not have the authority to make these internal concerns public. the oversight board was created by facebook; it was not created for employees to raise concerns. this is another area i believe we have to act on. i thank you for coming forward today. ms. haugen: my pleasure, happy to serve. chrm. blumenthal: the committee is in recess.