Facebook Whistleblower Testifies on Protecting Kids Online  CSPAN  October 5, 2021 5:00pm-6:20pm EDT

5:00 pm
public service, along with these other television providers, giving you a front row seat to democracy. >> this morning, a senate subcommittee heard testimony from a facebook whistleblower about her experience at the social media company and how best to protect children online with privacy regulations. we begin with her opening statement. ms. haugen: i joined facebook because i think facebook has the potential to bring out the best in us. i'm here today because i believe facebook's products harm children,
5:01 pm
stoke division and weaken our democracy. the leadership knows how to make facebook and instagram safer but won't make the changes because they have put their astronomical profits before people. congressional action is needed. they won't solve this crisis without your help. yesterday, we saw facebook get taken off the internet. i don't know why it went down, but i know that for more than five hours, facebook was not used to deepen divides, destabilize democracies and make young girls and women feel bad about their bodies. it also means millions of small businesses were not able to reach potential customers and countless photos of new babies weren't joyously celebrated with families and friends around the world. i believe in the potential of facebook. we can have social media we enjoy that connects us without tearing our democracy apart, putting our children in danger, and sowing ethnic
5:02 pm
violence around the world. we can do better. i have worked as a product manager at large tech companies since 2006, including google and facebook. my job has largely focused on algorithmic products, like google plus search, and recommendation systems like the ones that power facebook. having worked at four different types of social networks, i understand how complex and nuanced these problems are. however, the choices being made inside facebook are disastrous for our children, our public safety, for our privacy and our democracy. that is why we must demand facebook make changes. during my time at facebook, serving as a product manager for misinformation and counterespionage, i saw facebook repeatedly encounter conflict between its own profits and our safety. facebook
5:03 pm
consistently resolved these conflicts in favor of its own profits. the result has been more division, more harm, more lies, more threats, and more combat. in some cases, this dangerous online talk has led to actual violence that has harmed and even killed people. this is not simply a matter of certain social media users being angry or unstable or one side being radicalized against the other. it is facebook choosing to grow at all costs, becoming an almost trillion dollar company by buying its profits with our safety. during my time, i came to realize a devastating truth -- almost no one outside of facebook knows what happens inside of facebook. the company intentionally hides vital information from the public, from the u.s. government and governments around the world. the documents i've provided to congress prove facebook has repeatedly misled the public about what its own research
5:04 pm
reveals about the safety of children and the spreading of divisive and extreme messages. i came forward because i believe every human being deserves the dignity of the truth. the severity of this crisis demands we break out of our previous regulatory frames. facebook wants to trick you into thinking privacy protections or changes to section 230 alone will be sufficient. while important, they won't get to the core of the issue, which is that no one understands the choices made by facebook except facebook. we can afford nothing less than full transparency. as long as facebook is operating in the shadows, hiding research from public scrutiny, it is unaccountable. until incentives change, facebook will not change. left alone, facebook will continue to make choices that go against the common good, our common good.
5:05 pm
when we realized big tobacco was hiding the harms it caused, the government took action. when we figured out cars were safer with seatbelts, the government took action. and when opioids were taking lives, the government took action. i implore you to do the same here. today, facebook shapes our perception of the world by choosing the information we see. even those who don't use facebook are impacted by those who do. facebook's closed design means it has no real oversight. only facebook knows how it personalizes the feed for you. at other companies like google, any independent researcher can download the company's search results from the internet and write papers about what they find, and they do.
5:06 pm
facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of the system. facebook will tell you privacy means it cannot give you data. when tobacco companies claimed filtered cigarettes were safer for consumers, scientists could independently invalidate these messages and confirm they posed a greater threat to human health. the public cannot do the same with facebook. we have been given no other option than to take their messages on blind faith. not only does the company hide most of its own data, my disclosures have proved that when facebook is directly asked questions as important as, how do you impact the health and safety of our children, they mislead and misdirect. facebook has not earned our blind faith. this inability to see into facebook's systems and confirm how
5:07 pm
they work as communicated is like the department of transportation regulating cars by only watching them drive down the highway. today, no regulator has a menu of solutions for how to fix facebook, because facebook did not want them to know enough about how it causes the problems. otherwise there would be no need for whistleblowers. the public has no visibility into how facebook operates. this must change. facebook wants you to believe the problems we are talking about are unsolvable. they want you to believe in false choices. they want you to believe you must choose between a facebook full of divisive and extreme content or losing the most important value our country was founded upon, free speech. that you must choose between
5:08 pm
public oversight of facebook's choices and your personal privacy. that to share fun photos of your kids with old friends, you must be inundated with anger-driven virality. they want you to believe this is just part of the deal. i'm here today to tell you that is not true. these problems are solvable. a safer, free speech-respecting, more enjoyable social media is possible, but there is one thing i hope everyone takes away from these disclosures -- it is that facebook can change, but it is clearly not going to do so on its own. my fear is that without action, the divisive and extremist behaviors we see today are only the beginning. what we saw in myanmar and are now seeing in ethiopia are only the opening chapters of a story so terrifying no one wants to read the end of it. congress can change the rules
5:09 pm
facebook plays by and stop the many harms it is now causing. we now know the truth about facebook's destructive impact. i really appreciate the seriousness with which the members of congress and the securities and exchange commission are approaching these issues. i came forward at great personal risk because i believe we still have time to act. but we must act now and i'm asking you, our elected representatives, to act. thank you. sen. blumenthal: thank you for taking that personal risk. we will do anything and everything to protect you and stop any retaliation against you, and any legal action the company may bring to bear on you or anyone else. we have made that very clear in the course of these proceedings. i want to ask about this idea of disclosure you have talked about,
5:10 pm
looking at a car going down the road. we are going to have five-minute rounds of questions, maybe more if you are willing to do it. we are here today to look under the hood. that is what we need to do more of. in august, senator blackburn and i wrote to mark zuckerberg and asked him straightforward questions about how the company affects our children and teenagers on instagram. facebook dodged, sidetracked, and in effect misled us. so i am going to ask you a few straightforward questions to break down some of what you have said, and if you can answer them yes or no, that would be great. has facebook's own research ever found that its platforms can have a negative effect on children and teens' mental health or well-being? ms. haugen: many of facebook's
5:11 pm
internal research reports indicate that facebook causes serious negative harm to a significant portion of teenagers and younger children. chrm. blumenthal: has facebook ever offered features that it knew had a negative effect on children and teens' mental health? ms. haugen: facebook knows that its amplification algorithms can lead children from very innocuous topics, like healthy recipes, all the way to anorexia-promoting content over a very short period of time. chrm. blumenthal: has facebook ever found in its research that kids show signs of addiction on instagram? ms. haugen: facebook has studied a pattern they call problematic use, which we might call addiction. it has a very high bar for what it
5:12 pm
believes it is. it says you self-identify that you don't have control over your usage and that it is harming your health, your schoolwork or your physical health. 5% to 6% of 14-year-olds had the self-awareness to admit to both those questions. it is likely that far more than 5% to 6% of 14-year-olds are addicted to instagram. chrm. blumenthal: last thursday, my colleagues and i asked ms. davis, who was representing facebook, about how the decision was made whether to permanently pause instagram for kids, and she said "there's no one person who makes a decision like that. we think about it collaboratively." it's as though she couldn't mention mark zuckerberg's name. isn't he the one who will be making this decision, from your
5:13 pm
experience in the company? ms. haugen: mark holds a very unique role in the tech industry in that he holds over 55% of all the voting shares for facebook. there are no similarly powerful companies that are as unilaterally controlled. in the end, the buck stops with mark. no one is holding him accountable. chrm. blumenthal: and he is the algorithm designer in chief, correct? ms. haugen: i received an mba from harvard and they said we are responsible for the organizations we build. mark built one that is intended to be very flat; the metrics make the decisions. that itself is a decision, and in the end, since he is the ceo and chairman of facebook, he is responsible for those decisions. chrm. blumenthal: the buck stops with him? ms. haugen: the buck stops with
5:14 pm
him. chrm. blumenthal: speaking of the buck stopping, you have said that facebook should declare moral bankruptcy. i agree. its actions and its failure to acknowledge its responsibility indicate moral bankruptcy. ms. haugen: there is a cycle inside the company where facebook has struggled for a long time to recruit and retain the number of employees it needs to tackle the large scope of projects it has chosen to take on. facebook is in a cycle where it struggles to hire, and that causes it to understaff projects, which causes scandals, which makes it harder to hire. part of why facebook needs to come out and say, we did something wrong, we made choices we regret, is that the only way you can move forward and heal facebook is by first admitting the truth. that way we can have reconciliation and move forward, by first being honest and declaring moral bankruptcy.
5:15 pm
chrm. blumenthal: being honest and acknowledging that facebook has caused more pain simply to make more money, and that it has profited off spreading disinformation and misinformation and sowing hate. i think facebook's answer to facebook's destructive impact always seems to be more facebook. we need more facebook, which means more pain and more money for facebook. would you agree? ms. haugen: i don't think at any point facebook set out to make a destructive platform. i think it is a challenge that facebook is an organization where the parts of the organization responsible for growing and expanding the company are separate from the parts that focus on the harms the company is causing. integrity actions, projects that were hard-fought by the teams trying to keep us safe, are undone
5:16 pm
by new growth projects that counteract those same remedies. i do think there are organizational problems that need oversight, and facebook needs help in order to move forward to a more healthy place. chrm. blumenthal: and whether it is teens led into suicidal thoughts or the genocide of ethnic minorities in myanmar or the fanning of the flames of division in our own country or in europe, facebook is ultimately responsible for the immorality of the pain it has caused. ms. haugen: facebook needs to take responsibility for the consequences of its choices and be willing to accept small trade-offs in profit. i think just the act of being able to admit it is a mixed bag is important. what we saw last week is an example of the kind of behavior we need to help facebook
5:17 pm
grow out of: instead of just focusing on the good they do, admit responsibility. chrm. blumenthal: mark zuckerberg's new policy is no apology, no admissions, no acknowledgment, nothing to see here, we are going to deflect it and go sailing. i turn to the ranking member. sen. blackburn: thank you for your testimony. i had asked ms. davis about underage users and she made the comment, if we find an account of someone who is under 13, we remove them. in the last three months we removed 600,000 accounts of under-13-year-olds. it seems to me there is a problem
5:18 pm
if you have 600,000 accounts from children who ought not to be there in the first place. what did mark zuckerberg know about the plans to bring kids on as new users and advertise to them? ms. haugen: there are reports within facebook showing analyses of who joined facebook and instagram. facebook likes to say children lie about their ages to get onto the platform. the reality is, enough kids tell the truth that you can work backwards to figure out approximately the real ages of anyone on the platform. facebook does cohort analyses retrospectively and discovers things like 10% to 15%
5:19 pm
of 10-year-olds may be on facebook or instagram. sen. blackburn: this is why, when it was said to the ceo of instagram, i have been on instagram since i was eight, he replied he didn't want to know that. it would be for this reason, correct? ms. haugen: a pattern of behavior that i saw at facebook was that often problems were so understaffed that there was discouragement from having better detection systems. my last team was the counterespionage team. at any given time our team could only handle a third of the cases that we knew about. we knew that if we built even a basic detector, we would have many more cases. chrm. blumenthal: let me ask you this -- so you look at the way they have the data, but they are
5:20 pm
choosing to keep that data and advertise from it, sell it to third parties. so what does facebook do? you have 600,000 accounts that ought not to be on there -- ms. haugen: many more. chrm. blumenthal: right, you delete the accounts, but what happens to the data? do they keep it until those children reach age 13, because you are saying they can work backward and figure out the true age of a user? so what do they do with it? do they delete it, do they store it, do they keep it? ms. haugen: my understanding of facebook's policies, and i did not work directly on that, is that when they delete an account they delete all of the data. with regard to children under
5:21 pm
age on the platform, facebook should do substantially more to detect those children, and should have to disclose those processes to congress, because there are lots of subtleties in those processes. sen. blackburn: staying with underage kids and online privacy, i want you to tell me how facebook is able to do market research on these children that are underage, because ms. davis didn't deny this last week. do they bring kids into focus groups with their parents? how do they get that information? she said they got permission from parents. is there a permission slip or a
5:22 pm
form that gets signed, and then how do they know which kids to target? ms. haugen: there is a bunch to unpack there. most tech companies have systems where they can analyze the data that is on their servers, and most of the focus groups i read about or saw were around messenger kids, which has children on it. those focus groups appear to be children interacting in person. often companies use sourcing agencies that will go and identify people who meet certain demographic criteria, or will reach out directly based on data on the platform. so in the case of messenger kids, maybe you would want to study a child that was active and one that was less active. sen. blackburn: so these are
5:23 pm
children under age 13? ms. haugen: yes. sen. blackburn: and they know it. ms. haugen: for some of them, and i am assuming they get permission. sen. blackburn: we are still waiting to get a copy of that parental consent form that would involve children. my time is expired. i will save my other questions for a second round. chrm. blumenthal: thank you. senator klobuchar? sen. klobuchar: thank you for shining a light on how facebook time and time again has put profit over people. when their own research found that more than 13% of teen girls say that instagram made their thoughts of suicide worse, they proposed instagram for kids, which has now been put on pause. when they found out their algorithms were fostering polarization, misinformation, and hate, and that they allowed 99% of their violent content to
5:24 pm
remain unchecked on their platform, including in the lead-up to the january 6 insurrection, what did they do? now, as we know, mark zuckerberg is going sailing and saying no apologies. i think the time has come for action. i think you are the catalyst for that action. you have said privacy legislation is not enough. i completely agree with you. i think you know we have not done anything to update federal privacy laws, nothing, zilch, in any major way. why? because there are lobbyists all around this building hired by the tech companies. facebook and the other tech companies are throwing a bunch of money around this town and people are listening to them. we have done nothing significant, but we are on a bipartisan basis
5:25 pm
working on the antitrust subcommittee to get something done on consolidation, which we understand allows the dominant platforms to control all this, to be the bullies in the neighborhood and buy out the companies that compete with them. the time for action is now. i will start with something i asked the facebook head of safety who testified last week. i asked her how they estimate the lifetime value of a user for kids who start using their products before age 13. she evaded the question and said that is not the way we think about it. is that right, or is it your experience that facebook estimates and puts a value on how much money they get from users in general? is that a motivating force for them? ms. haugen: based on what i saw in terms of allocation of
5:26 pm
integrity spending. one of the things exposed was that 87% of all misinformation spending is on english content, while only 9% of users are english speakers. it seems that facebook invests more in users who make them more money, even though the danger may not be evenly distributed. sen. klobuchar: does it make sense that getting a person hooked on social media at a young age makes them more profitable over the long term? ms. haugen: facebook's internal documents talk about the importance of getting younger users, tweens, onto instagram, like instagram kids, because they know children bring their parents online and things like that, and they understand the value of younger users for the long-term success of the platform. sen. klobuchar: facebook reported $51.58 per
5:27 pm
user in revenue last quarter, and when i asked how much of that was for users under 18, she wouldn't answer. do you think teens are profitable? ms. haugen: i would assume so. based on advertising for television, you get higher rates for customers who don't yet have preferences or habits. i am sure they are some of the more profitable users, but i did not work directly on that. sen. klobuchar: another issue has been eating disorders. studies have found that eating disorders have the highest mortality rate of any mental illness for women. i led a bill on this that we passed into law, and i am concerned that the algorithms they have push outrageous content promoting anorexia and the like. i know it is personal to you. do you think the algorithms push some of this content to young girls? ms. haugen: facebook knows that
5:28 pm
engagement-based ranking amplifies preferences, and they have done something called proactive incident response, where they take things they have heard, for example, can you be led by the algorithms to anorexia content, and they confirmed that yes, this happens to people. facebook knows they are leading young users to anorexia content. sen. klobuchar: do you think they are designing their products to be addictive? ms. haugen: facebook has a long history of having a successful growth division where they make tiny tweaks and are constantly trying to optimize for growth. those kinds of tweaks could be construed as things that facilitate addiction. sen. klobuchar: we have seen the same kind of content in the political world. you brought up other countries and what has been happening there. on 60 minutes, you said facebook implemented safeguards to reduce misinformation ahead of the 2020
5:29 pm
election, but turned off those safeguards right after the election. and you know that the insurrection happened on january 6. do you think facebook turned off the safeguards because they were costing money, because they were reducing profits? ms. haugen: facebook was emphasizing a false choice: that the safeguards in place before the election implicated free speech. the choices that were happening on the platform were really about how reactive and viral the platform was. because they wanted the growth and the acceleration of the platform back after the election, they returned to their original defaults. the fact that they had to break the glass and turn the safeguards back on is problematic. sen. klobuchar: i agree. thank you for your bravery in
5:30 pm
coming forward. sen. thune: i have been arguing for some time that it is time for congress to act. i think the question is, what is the right way to do it, consistent with our right to free speech. there are a couple of things we can do. senators blackburn and blumenthal are both cosponsors of legislation called the filter bubble transparency act. it would give users the option to engage with social media platforms without being manipulated by the secret formulas that dictate the
5:31 pm
content you see when you open up an app or log on to a website. we also need to hold big tech accountable by reforming section 230. one of the best opportunities to do that in a bipartisan way is the platform accountability and consumer transparency act, legislation i have cosponsored, which in addition to changes to section 230 would also increase transparency around content moderation. importantly, since we are talking with a major big tech whistleblower, it would explore the viability of a program for big tech employees to blow the whistle on wrongdoing inside the companies where they work. we should encourage employees in the tech sector like you to speak up about questionable practices at big tech companies. we can ensure americans are aware of how social media platforms are using artificial
5:32 pm
intelligence to keep them hooked on the platforms. let me just ask you: the information you provided shows that facebook conducts what is called engagement-based ranking, which you have described as very dangerous. could you talk more about why engagement-based ranking is dangerous, and do you think congress should seek to pass legislation like the filter bubble transparency act that would give users the ability to avoid engagement-based ranking altogether? ms. haugen: facebook is going to say, you don't want to give up engagement-based ranking. you will not like facebook as much if we are not picking out the content for you. that is just not true. facebook likes to present things as false choices. let's say we ordered by time, like imessage or other forms of social media that are
5:33 pm
chronologically based. facebook will say, you will get spammed, you will not enjoy your feed. the reality is those experiences have lots of permutations. there are ways where, instead of computers regulating what we see, we socially regulate what we see. they don't want to have that conversation. facebook knows that when they pick out the content we focus on, we spend more time on their platform, and they make more money. the danger of engagement-based ranking is that facebook knows content that elicits an extreme reaction is more likely to get a click, comment, or re-share. those clicks and comments aren't even necessarily for your benefit. they know other people will produce more content if they get the likes, comments, and re-shares. they prioritize that content so you will give little hits of dopamine to your friends, so they will create more content.
5:34 pm
they have run experiments on people where they have confirmed this. sen. thune: the information you provided to the wall street journal said facebook altered its algorithm to boost meaningful social interactions, or msi. rather than strengthening bonds between families and friends, the algorithm rewarded more outrage and sensationalism. facebook would say its algorithms are used to connect individuals with friends and family in ways that are largely positive. do you believe that facebook's algorithms make its platform a better place for most users? should consumers have the option to use facebook and instagram without being manipulated by algorithms designed to keep them engaged? ms. haugen: i strongly believe -- i have spent most of my career working on engagement-based ranking.
5:35 pm
when i say these things, i am basically damning 10 years of my own work. on engagement-based ranking, facebook says, we can do it safely because we have ai. the artificial intelligence will find the bad content that we know our engagement-based ranking is promoting. they have written blog posts on how they know it is dangerous, but the ai will save us. facebook's own research says they cannot adequately identify dangerous content, and as a result those dangerous algorithms they admit are picking up extreme sentiment and division. they cannot protect us from the harm they know exists in their own system. i don't think it's just a question of should people have the option of choosing to not be manipulated. i think if we had appropriate oversight, or if we reformed 230 to make facebook responsible for the consequences of their intentional ranking decisions, they would get rid of engagement-based ranking. it is causing teenagers to be exposed to more anorexia content.
5:36 pm
it is pulling families apart. in places like ethiopia it is fanning ethnic violence. i encourage reform of these platforms, not picking and choosing individual ideas, but making the platforms themselves safer, less twitchy, less viral. that is how we solve these problems. sen. thune: i would simply say let's get to work here. chrm. blumenthal: i agree. sen. schatz: thank you for your courage in coming forward. was there a moment when you came to the conclusion that reform from the inside was impossible and you decided to be a whistleblower? ms. haugen: there was a long series of moments where i became aware that facebook consistently chose to prioritize its profits
5:37 pm
over public safety. the moment i realized we needed to get help from the outside, that the only way these problems would be solved is by solving them together and not solving them alone, was when civic integrity was dissolved following the election. it felt like a betrayal of the promises facebook had made to people who sacrificed a great deal to keep the election safe, to dissolve our community and integrate it into other parts of the company. sen. schatz: i know the response was that they distributed the duties. that is an excuse, right? ms. haugen: i cannot see into the hearts of other men. sen. schatz: let me put it this way: it didn't work, right? ms. haugen: the people who i worked with -- maybe 75% of my pod of product managers -- most of them came from civic integrity.
5:38 pm
all of us left the inauthentic behavior pod for other parts of the company, or left the company entirely, over the same period. six months after the reorganization, we had clearly lost faith that those changes were coming. sen. schatz: you said in your opening statement that they know how to make facebook and instagram safer. thought experiment: you are now the chief executive officer and chairman of the company. what changes would you immediately institute? ms. haugen: i would establish a policy of how to share information and research from inside the company with appropriate oversight bodies. i would give proposed legislation to congress saying, here's what appropriate oversight would look like. i would actively engage with academics to make sure that the people who are confirming whether facebook's marketing messages are true have the information they
5:39 pm
need. i would immediately implement the soft interventions that were identified to protect the 2020 election, things like requiring someone to click on a link before re-sharing it. companies like twitter have found that significantly reduces misinformation. no one is censored by being forced to click on a link before sharing it. sen. schatz: i want to pivot back to instagram's targeting of kids. we all know they announced a pause. that reminds me of what they announced when they were going to issue a digital currency. they got beat up by the u.s. senate banking committee and said never mind. now they are coming back around hoping nobody notices that they are going to try to issue a currency. for the moment, the business model appears to be gobble up everything, do everything. that is the
5:40 pm
strategy. do you believe they are actually going to discontinue instagram kids, or are they just waiting for the dust to settle? ms. haugen: i would be sincerely surprised if they do not continue working on instagram kids. i would be amazed if we don't have this conversation again. sen. schatz: why? ms. haugen: facebook understands that if they want to continue to grow, they have to find new users. they have to make sure the next generation is just as engaged with instagram as the current one. the way they will do that is by making sure children establish habits early. sen. schatz: by hooking kids? ms. haugen: by hooking kids. i would like to emphasize that one of the documents we sent in examined the rates of problematic use by age, and that peaked with 14-year-olds. it is just like cigarettes. teenagers do not have good self-regulation. kids say, i feel bad when i use instagram
5:41 pm
and yet i cannot stop. we need to protect kids. sen. schatz: i have a long list of misdirections and outright lies from the company. i don't have time to read them. you are as intimate with all of these deceptions as i am. i will jump to the end. if you were a member of this panel, would you believe what facebook is saying? ms. haugen: facebook has not earned the right to our blind trust. one of the most beautiful things i heard in this committee last week was that trust is earned, and facebook has not earned our trust. sen. schatz: thank you. chrm. blumenthal: thank you. senator moran is next, and then we have been joined by the chair. we will break at about 11:30 if that is ok, because we have a
5:42 pm
vote. then we will reconvene. >> as senator thune said, let's go to work. >> our differences are very minor in the face of the revelations that we have now seen. i hope we can move forward. sen. moran: i share that view, thank you. thank you for your testimony. we have talked about children, teenage girls specifically. what other examples do you know of where facebook or instagram knew its actions would be harmful to its users but still proceeded with the plan and executed those harmful behaviors? ms. haugen: facebook's research
5:43 pm
is aware that severe harm is happening to children. for example, in the case of bullying, facebook knows that instagram dramatically changes the experience of high school. when i was in high school -- >> you looked at me and changed your wording. ms. haugen: most kids could go home and reset for 16 hours. for kids who are bullied on instagram, the bullying follows them home into their bedroom. the last thing they see before they go to bed is someone being cruel to them, or the first thing they see in the morning is someone being cruel to them. kids are learning that their own friends are cruel to them.
5:44 pm
think about how this could impact their domestic relationships. facebook knows that because parents never experienced this, they give children bad advice. they say things like, why don't you just stop using it. facebook's own research is aware that children are struggling with these things because they can't even get support from their own parents. i don't understand how facebook could know all of these things and not escalate it for help and support with all of these problems. sen. moran: are there other practices that are known to be harmful but yet are pursued? ms. haugen: facebook is aware that the choices it made in meaningful social interactions,
5:45 pm
or engagement-based ranking that did not care if you bullied someone; that still counted as meaningful. they know that change directly changed publishers' behavior. companies like buzzfeed wrote in and said, the content that is most successful on our platform is the content we are most ashamed of. you have a problem with your ranking. and they did nothing. politicians are forced toward extreme positions because those are the ones that get distributed on facebook. that is a huge negative impact. they have admitted in public that engagement-based ranking is dangerous without integrity and security systems, but they have not rolled out those systems to most of the languages in the world. that is causing things like ethnic violence in ethiopia. sen. moran: what is the magnitude of facebook's revenues or profits that come from the sale of user data?
5:46 pm
ms. haugen: i did not work on that; i am not aware. sen. moran: what regulations or legal actions do you think would have the most consequence or be feared most by facebook, instagram, or aligned companies? ms. haugen: i encourage reforming section 230 to exempt decisions about algorithms. modifying 230 around content is complicated, because user-generated content is something companies have less control over. they have 100% control over their algorithms. facebook should not get a free pass on choices it makes that prioritize growth, virality, and reactiveness over public safety. they should not get a free pass. they are paying for their profits right now with our safety. i also believe there needs to be a dedicated oversight body.
5:47 pm
right now, the only people in the world who are trained to analyze these experiments are people who grew up inside of facebook or other social media companies. there needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this, and have a place to work on things like regulation, bringing that knowledge out to oversight bodies that have the right to do oversight. sen. moran: a regulatory agency within the federal government? ms. haugen: yes. chrm. blumenthal: senator cantwell. sen. cantwell: thank you for holding this hearing. my colleagues have brought up a lot of important issues. i want to continue in that vein. the privacy act that i introduced along with several of my colleagues actually does have ftc oversight of algorithmic transparency. i hope you look at that and tell us what other areas you think we should add to that level of
5:48 pm
transparency. clearly, that is the issue at hand here. thank you for your willingness to come forward. the documentation that you say exists gives a level of transparency about what is going on that people have not been able to see. your information, which you say went up to the highest levels, is that they purposefully knew their algorithms were continuing to spread misinformation and hate information, and that when presented with choices about this -- the terminology is downstream msi -- they knew it was a choice between continuing that wrong information and hate information, about the rohingya for example, or continuing to get higher rates of engagement. i know you said you don't know about profit, but on a page, if
5:49 pm
you click through to the next page, there's a lot more ad revenue than if you didn't click through. you're saying the documents exist at the highest level at facebook, discussing these choices, and that people chose, even though they knew it was misinformation, to continue to choose profit. ms. haugen: we submitted documents to congress. mark zuckerberg was presented with a choice between soft and hard interventions. a hard intervention is taking a piece of content or a user off facebook. soft interventions are about making different choices to make the platform less viral, less twitchy. mark was presented with these options and chose to not remove downstream msi in april 2020, even isolated just in at-risk countries, countries at risk of violence, if it had any impact on the overall msi metric.
5:50 pm
sen. cantwell: which means, in turn, less money? is there another reason why they would not do it, other than that it really would affect their numbers? ms. haugen: i don't know for certain. jeff horwitz, who reports for the wall street journal, and i struggled with this. how is this possible? we just read 100 pages on how downstream msi expands hate speech and graphic violent content. why wouldn't you get rid of this? the best theory that we came up with, and this was just our interpretation, is that people's bonuses are tied to msi. people stay or leave the company based on what they get paid, and if you hurt msi, a bunch of people were not going to get their bonuses. sen. cantwell: you are saying this continues today? i'm personally very frustrated
5:51 pm
because we presented information to facebook from one of my own constituents in 2018 talking about this issue with the rohingya, pleading with the company. we pleaded with the company and they continued to not address this issue. now you are pointing out these same algorithms are being used, and they know darn well that it is causing and inciting violence. they are today choosing profit over acting on this information. ms. haugen: when rioting began in the united states in the summer of last year, they turned off downstream msi, but only for content their systems could identify. facebook's own algorithms are bad at finding this content; it remains in its raw form for 80% or 90% of even that sensitive content. there are countries where they don't have integrity systems at all. in the case of ethiopia, there are 100 million people and six languages. facebook only supports two of
5:52 pm
those languages. this strategy of focusing on language-specific systems to save us is doomed to fail. sen. cantwell: i'm sending a letter to facebook. they better not delete any information about how they proceeded in light of your information. aren't we also now talking about advertising fraud? aren't you selling something to advertisers that is not really what they are getting? we know about this because of the newspaper issues. we are trying to say journalism basically has to meet a different standard, a public interest standard. these guys are a social media platform that doesn't have to live by that. the consequence is, as we see it, people are coming back to local journalism. advertisers are saying, we want to be with a trusted brand,
5:53 pm
we don't want to be on your website. i think your finding is an interesting one. we have to look at what the other issues are. do they defraud advertisers, telling them one thing when in reality it was something different, based on a model? ms. haugen: we have multiple examples of question-and-answers for the sales staff where advertisers asked, should we come back to facebook after the insurrection? facebook said in the talking points that they gave to advertisers, we are doing everything in our power; we take down all the hate speech when we find it. sen. cantwell: that was not true. thank you, mr. chairman. chrm. blumenthal: if you want to make your letter available, i would be glad to join you myself. i thank you for suggesting it. senator lee.
5:54 pm
sen. lee: thank you for joining us this week. it is very helpful. we are grateful you are willing to make yourself available. last week we had another witness from facebook, ms. davis, testify before this committee. she focused on the extent to which facebook targets ads towards children, including ads that are sexually suggestive or geared toward adult-themed products. while i appreciated her willingness to be here, i did not get the clearest answers in response to some of those questions. i'm hoping you could shed some light on some of those issues related to facebook's advertising process. i want to read you a quote i got from ms. davis last week.
5:55 pm
here's what she said: when we do ads to young people, there are only three things that an advertiser can target around -- age, gender, location. we prohibit certain ads to young people, including weight loss ads. we don't allow tobacco ads at all. we don't allow them to children. since that exchange happened last week, a number of individuals, including a group called the technology transparency project, have indicated that the testimony was inaccurate. they noted that the group conducted an experiment just last month. the goal was to run a series of ads that would be targeted to children ages 13-17 among users in the united states.
5:56 pm
i want to emphasize that ttp did not wind up running these ads. they stopped them from being distributed to the users, but facebook did in fact approve them, for an audience of up to 9.1 million users, all of whom were teens. i brought a few of these to show you. this first one is a colorful graphic encouraging kids to throw a skittles party like no other. as the graphic indicates, and as the slang jargon independently suggests, this involves getting together randomly to abuse prescription drugs. the second graphic displays a tip to encourage and promote
5:57 pm
anorexia. it's on there. the language independently promotes that, and the ad also promotes it, insofar as it suggests these are images you ought to look at when you need motivation to be more anorexic. the third one invites them to find a partner online, to make a love connection: you look lonely, find your partner now to make a love connection. this would be an entirely different kettle of fish if it were for an adult audience. it is not. it is targeted to 13-17-year-olds. i don't support, and ttp does not support, these messages, particularly when targeted to impressionable children. just to be clear, ttp did not end up pushing the ads out, but it did in fact receive facebook's approval.
5:58 pm
this says something. one could argue it proves facebook is allowing and perhaps facilitating harmful ads to our nation's children. could you please explain to me how these ads, with a target audience of 13-17-year-old children, could possibly be approved by facebook? is ai involved in that? ms. haugen: i did not work directly on the ad approval system. what was resonant for me about your testimony is that facebook has a deep focus on scale. scale is, can we do things cheaply for a huge number of people. it is partly why they rely on ai so much. it is possible that none of those ads were seen by a human. the reality is, we've seen from repeated documents within my
5:59 pm
disclosures that facebook's ai systems only catch a minority of offending content. best case scenario, in the case of something like hate speech, at most they will ever get 10% to 20%. in the case of children and drug paraphernalia ads, if they rely on computers and not humans, they will likely never get more than 10% to 20%. sen. lee: i have one follow-up question. it should be easy to answer. facebook claims it only targets ads based on age, gender, and location, even though these examples seem to contradict that. let's set that aside for a minute and accept that they are not targeting ads based on specific interest categories. does facebook still collect data on teenagers even if they are not at that moment targeting ads
6:00 pm
at teens based on those interest categories? ms. haugen: it is important to differentiate between what targeting advertisers specify and what targeting facebook may learn. let's imagine you had some text on an ad. facebook will likely extract out features; it would learn partying as a concept. i am very suspicious of the claim that personalized ads are not being delivered, because the algorithms learn correlations. they learn interactions where your party ad may go to kids interested in partying, because facebook has a ranking model in the background that says this person wants more party-related content. sen. lee: thank you, that is very helpful.
6:01 pm
what that suggests to me is that even though they say they are not targeting teens with those ads, the algorithm might do some of that work, which might be why they collect the data. ms. haugen: i cannot say whether or not that is the intention. it is difficult to understand these algorithms today, and these biases are unintentionally learned. it is hard to disentangle these factors as long as you have engagement-based ranking. sen. markey: you are a 21st-century american hero, warning our country of the danger for young people and our democracy. our nation owes you a huge debt of gratitude for the courage you are showing. so i thank you.
6:02 pm
would you agree facebook actively seeks to attract children and teens to its platform? ms. haugen: facebook definitely markets to children under 18 on instagram. an internal document reads, why do we care about tweens? they are a valuable but untapped audience. facebook only cares about children to the extent that they are of monetary value. sen. markey: last week, the global head of safety at facebook told me that facebook does not allow targeting of certain harmful content to teens. ms. davis stated, we don't allow weight loss ads to be shown to people under the age of 18. yet a recent study found that
6:03 pm
facebook permitted targeting of teens as young as 13 with ads that showed a woman's thin waist and pictures that glorified anorexia. do you think facebook is telling the truth? >> i think facebook has focused on scale over safety. it is likely they are using artificial intelligence to try to identify harmful ads without allowing the public to see what the effects of those safety systems are. >> you unearthed facebook's research on the safety of teens. did you raise that with your supervisors? >> i did not work directly on anything involving teen mental health. this research is available to anyone in the company. >> ms. davis also testified that we don't allow tobacco ads and we don't allow alcohol ads to minors. however, researchers recently
6:04 pm
showed that facebook allows targeting of teens with ads on vaping. based on your time at facebook, do you think facebook is telling the truth? >> i do not. i don't have full context on that, but i assume that if they are using artificial intelligence to catch those ads, some ads are making their way through. >> from my perspective, listening to you and your courageous revelations, time and time again facebook says one thing and does another. time and time again, facebook fails to abide by the commitments it has made. facebook lies about what it is doing. yesterday, facebook had a platform outage. for years, it has had a principles outage. its only real principle is profit. facebook's platforms are not safe for young people. facebook is like big tobacco, enticing young kids with the
6:05 pm
first cigarette -- the first social media account, designed to hook kids as users for life. your whistleblowing shows the harmful features that quantify popularity, push manipulative influencer marketing, and amplify harmful content to teens. last week, in this committee, facebook wouldn't even commit to not using these features on 10-year-olds. facebook is built on a computer code of misconduct. senator blumenthal and i have introduced the kids internet design and safety act, the kids act. you have asked us as a committee to act. facebook has scores of lobbyists in this city right now, coming in after this hearing to tell us we can't act, and they have been successful for a decade in blocking this committee from acting.
6:06 pm
let me ask you a question. the kids internet design and safety act -- here's what the legislation does. it includes outright bans on children's app features that quantify popularity with likes and follower counts, promote influencer marketing, and amplify toxic posts, and it would prohibit facebook from using its algorithms to promote toxic posts. should we pass that legislation? >> i strongly encourage reforms that push us toward human-scale social media, not algorithms that push us away from our friends and family.
6:07 pm
>> so you agree that congress has to act on these provisions, to stop facebook from using its algorithms to harm kids. >> i do believe congress must act to protect children. >> children and teens need privacy online. i am the author of the children's online privacy protection act of 1998. it only covers kids under 13 because the industry stopped me from making it age 16 in 1998, because it was already their business model. we need to update that law for the 21st century. tell me if this should pass: create an eraser button so young people can tell websites to delete the data collected about them; give young teens under the age of 16 and their parents control of their information; and ban targeted ads to
6:08 pm
children. >> i support all those actions. >> finally, i have introduced the algorithmic justice and online platform transparency act, which would open the hood on facebook's algorithms so we know how facebook is using our data to decide what content we see, and would ban discriminatory algorithms that harm vulnerable populations online, like showing employment and housing ads to white people but not to black people. should congress pass that bill? >> algorithmic bias issues are a major problem for our democracy. during my time at facebook, i became very aware of the challenges of, like i mentioned, it is difficult for us to understand how these
6:09 pm
algorithms act and perform. facebook is aware of complaints by african-americans saying that the platform doesn't give african-americans the same distribution as white people. until we have transparency, and the ability to confirm facebook's messages are true, we will not have a system that is compatible with democracy. >> i thank senator lee. i wrote facebook asking them to explain that discrepancy, because i think facebook is lying about targeting that age group. here is my message for mark zuckerberg: your time of promoting toxic content and preying on children and teens is over. congress will be taking action. you can work with us or not work with us. we will not allow your company to harm our children and our
6:10 pm
families and our democracy any longer. thank you. we will act. >> thank you. we are going to turn to senator blackburn. then, we will take a break. i know there is interest in another round of questions. >> we have senator cruz and senator scott. >> we will come back after the break. >> i have to go to sit in the chair starting at noon. >> why don't we turn -- >> i have one question. this relates to what senator markey was asking. has facebook ever employed
6:11 pm
child psychologists or mental health professionals to deal with these children's online issues that we are discussing? >> facebook has many researchers with phds. i know that some have psychology degrees. i'm not sure if they are child specialists. facebook also works with outside specialists who are experts in children's rights online. >> then at the conclusion, we will take a break at noon. >> thank you, and i appreciate the indulgence of the committee. last week, the committee heard directly from ms. davis, the global head of safety for facebook. during the hearing, the company contested its own internal research as though it does not exist. yes or no, does facebook have internal research indicating
6:12 pm
that instagram harms teens, particularly harming perceptions of body image, which disproportionately affects young women? >> yes. >> thank you for confirming. last week, i requested that facebook make the basis of this research, without any personally identifiable information, available to this committee. do you believe it is important for transparency and safety that facebook release the basis of this internal research to allow for independent analysis? >> i believe it is vitally important for our democracy that we establish mechanisms where facebook's internal research must be regularly disclosed, and we need to have privacy-sensitive data sets that allow independent researchers to confirm whether or not facebook's marketing messages are actually true. >> beyond this, should facebook make internal research, not just
6:13 pm
slide decks but the underlying data, public by default? can this be done in a way that respects user privacy? >> i believe that in collaboration with academics and other researchers, we can develop privacy-conscious ways of exposing radically more data than what is available today. it's important for us to understand how the algorithms work, how facebook gets to shape the information we see, and that we have this information publicly available for scrutiny. >> is regulation needed to create real transparency at facebook? >> until incentives change, facebook will not change. we need action from congress. >> last week, i asked ms. davis about shadow profiles for children, and she answered that no data is ever collected on children under 13 because they are not allowed to make accounts. this ignores the issue. facebook knows children use their platform. instead of seeing this as a
6:14 pm
problem to be solved, facebook views it as a business opportunity. does facebook conduct research on children under 13, examining the business opportunities of connecting these young children to facebook's products? >> i want to -- >> you have shared your concerns about how senior management of facebook has continuously prioritized revenue over user harm and safety. i have a few questions on facebook's decision-making. last week, i asked ms. davis whether facebook has ever found that a change to its platform would potentially inflict harm on users but moved forward because the change would grow revenue or users.
6:15 pm
she responded, it has not been my experience at all at facebook; that is just not how we approach it. yes or no, has facebook ever found that a feature harmed its users but the feature moved forward because it would grow users or increase revenue? >> facebook likes to paint these issues as complicated. there are a lot of simple issues, like requiring someone to click through a link before re-sharing it. in some countries, re-shares make up 35% of all the content people see. >> did these decisions ever come from mark zuckerberg directly or from other management at facebook? >> we have a few choice documents showing that mark zuckerberg chose metrics defined by facebook, like meaningful social
6:16 pm
interactions. >> this is the reference you shared earlier with senator cantwell, in 2020. >> yes, the soft interventions. >> facebook was able to count on the silence of its workforce for a long time. facebook moderators have called out a culture of fear and secrecy that prevented
6:17 pm
them from speaking out. what is the culture at facebook around whistleblowing and accountability? >> facebook has a culture that emphasizes that insularity is the path forward, that if information is shared outside, it would just be misunderstood. i believe that relationship has to change. the only way we will solve these problems is by solving them together. we will have more democratic solutions if we do it collaboratively. >> is there a senior-level executive, like an inspector general, who is responsible for ensuring complaints from employees are taken seriously and that employees' legal, ethical, and moral concerns receive consideration with a real possibility of instigating change to company policies? >> i'm not aware of that role. the company is large. >> it is my understanding there is a gentleman by the name of roy austin, the vice president of civil rights, who described himself as an inspector general. he does not
6:18 pm
have the authority to make these internal conflicts public. the oversight board was created by facebook; it was not created for employees to raise concerns. this is another area i believe we have to act on. i thank you for coming forward today. >> my pleasure, happy to serve. chrm. blumenthal: the committee is in recess. [inaudible conversations] >> you can watch the entire hearing about the potential
6:19 pm
harmful effects facebook and instagram could have on children, based on facebook's own data, coming up tonight at 8:00 eastern on c-span. it is also available on our new video app. >> c-span is your unfiltered view of government. >> the world changed in an instant. internet traffic soared and we never slowed down. schools and businesses went virtual and we powered a new reality. we are built to keep you ahead. >> mediacom supports c-span as a public service. >> president biden traveled to michigan to speak about the bipartisan infrastructure bill and his build back better
