Pallone, for his five minutes of questions. >> Thank you, Chairwoman Eshoo. I just want to ask each of you to focus on the Biden administration's FRS proposal. There are many who are opposed to the class-wide scheduling of fentanyl-related substances in the proposal the administration has put forward to Congress. Basically, what would each of you say to those who have expressed concerns about why this proposal deserves their support, and what happens if the current emergency scheduling order for fentanyl-related substances expires? I would like each of you to spend a minute or so answering that question, if you will. I guess I'll start with Mr. Milione. >> Thank you. At DEA we're focused on going after those who are causing this terrible harm in the country and helping those who are harmed. Fentanyl, without question, is the most significant drug threat that we're facing. It is much better to be proactive in dealing with this. These drugs are deadly. They are temporarily scheduled. We need to get them permanently scheduled so that -- we can stop
Eshoo. >> Thank you for your courage. What you have done is a great act of public service. The document you disclosed relates to a project Facebook researchers set up to observe the platform's recommendations. I only have five minutes, so maybe you can do this in a minute or so. Can you briefly tell us what the research found as it relates to Facebook's algorithms leading users down rabbit holes of extremism? >> Facebook has found over and over again, on the right and on the left, with children, that you can take a blank account with no friends and no interests, and follow centrist interests -- follow Donald Trump and Melania, follow Fox News, or follow Hillary and MSNBC -- and just by clicking on the content Facebook suggests, the content will get more and more extreme. On the left, within three weeks you get to "let's kill Republicans." It's crazy. On the right, within a couple of days you get to QAnon, and to white genocide within a couple of weeks. There is only the Facebook system amplifying and amplifying, and it happens because that content is the content you are most likely to engage with, even though when you are surveyed, you say you don't like it. >> Thank you. It is wonderful to see you again. Thank you for testifying today. Your testimony