
tv   Hearing on Police Use of Facial Recognition Technology  CSPAN  July 17, 2021 3:01pm-6:29pm EDT

3:01 pm
it has become increasingly complex. >> he oversaw cybersecurity during the trump administration. he discusses recent ransomware attacks and other cyber attacks. today on c-span. a house judiciary subcommittee held a hearing on the pros and cons of police using facial recognition technology. witnesses included a former member of president obama's 21st century policing task force and an official with the government accountability office. this is three and a half hours.
3:02 pm
>> welcome to an important hearing. its subject must be matched against a document that has long been important, the constitution of the united states of america. facial recognition technology: proponents espouse its potential benefits of modernized policing, safer borders, and ultimately a more efficient criminal justice system. at the same time, facial recognition systems have clear potential flaws, having been found to be inaccurate for certain groups of people. the technology, and how it has been used in prosecution, raises
3:03 pm
constitutional concerns that call into question the tenets of fairness and due process. the criminal consequences implicated by these elements of facial recognition technology warrant closer examination by this body. congress has found itself at a disadvantage. the information on how law enforcement agencies have adopted facial recognition technology remains underreported or nonexistent. today, we are bringing facial recognition technology out of the shadows and into the light. this dialogue must start with a simple question about whether this technology is sufficiently accurate to justify its use by the police and whether the risk of depriving someone of their liberty unjustly is too great.
3:04 pm
to add untested facial recognition technology to our policing would only serve to exacerbate the issues still plaguing our criminal justice system. large-scale adoption of this technology would inject further inequity into a system at a time when we should be moving to make the criminal justice system more equitable for americans, and we are working very hard to pass the george floyd justice in policing act. it was passed out of this committee many months ago. there are many unknowns, but we can be certain that most if not all facial recognition systems are less accurate for people of color and women, and for the most part, we can be confident the darker the skin tone, the higher the error rate. error rates have been up to 34% higher for darker skinned women than lighter skinned men. it is not just sometimes wrong,
3:05 pm
it can be wrong up to a third of the time. additionally, a 2019 study by the national institute of standards and technology found empirical evidence that most facial recognition algorithms exhibit demographic differentials that can worsen their accuracy based on a person's age, gender or race. according to the study, systems vary widely in their accuracy depending on the algorithms and types of search. america is a multicultural, multiethnic nation. this is particularly important as it relates to these findings. asian and african-american people were up to 100 times more likely to be misidentified than white men. native americans had the highest false positive rate of all
3:06 pm
ethnicities, and women were more likely to be falsely identified than men. the elderly and children were more likely to be misidentified than those in other age groups, and that plays a strong role in trying to find missing persons. middle-aged white men generally benefited from the highest accuracy rates. in spite of these flaws, facial recognition technology is being rolled out across the country and has proven valuable to law enforcement in many instances. from the large to the small, facial recognition systems are being quietly incorporated into american policing. you have small departments in the united states that may not have the training and are using these technologies. we have heard the success stories, like it being used to identify the shooter in the tragic capital gazette massacre and help identify victims of sex
3:07 pm
trafficking by matching missing persons photos. facial recognition technology has been used to identify several of the insurrectionists of january 6. while we must applaud the use of new technology for the apprehension of domestic terrorists, this must be weighed against the fact that many of these systems, privately or government owned, operate with minimal transparency, limited oversight and faulty information security. with 18,000 different police departments across america of varying capability, it is noted that they are using technology with minimal transparency, limited oversight and questionable security. individuals are testing this private software without the knowledge or approval of their agencies. many communities find facial recognition systems in their neighborhoods. it must involve community
3:08 pm
consideration, technological evaluation and sober legal analysis. as these trends have developed, the federal government has been largely absent. what we don't know towers over what we do, and there needs to be change. little thought has been put into the widespread adoption of a technology that can materially offer a benefit to law enforcement. this is the issue this committee hopes to correct today and going forward. this hearing is not meant to prejudge facial recognition technology or its value to law enforcement; rather, it is an important first step to proactively engage with this technology and adopt and adapt our laws accordingly. as i indicated, it is important that we also work on productive and constructive legislation as needed. i look forward to engaging with the witnesses about the benefits and pitfalls inherent in facial recognition technology and forging a way forward toward
3:09 pm
transparency and accountability in its use. it is now my pleasure to recognize the ranking member of the subcommittee, the gentleman from arizona, for his opening statement. >> i thank you for yielding time to me and i thank you for holding this very important hearing on facial recognition technology and its use by law enforcement. there is so much of what you just said that i can associate myself with. there are some things that i think we can work together on, on a bipartisan basis, hopefully with all of the members of the united states congress. facial recognition technology is a powerful tool when used properly. when used improperly and relied upon too heavily, it raises serious concerns and has some real-world negative consequences. today, we will hear from a
3:10 pm
witness who was detained by the detroit police department. facial recognition technology is not perfect, and honestly it's far from it. in order to be reliable, images must be captured in the most favorable conditions. facial recognition algorithms perform poorly on the faces of women, people of color, the elderly and children. i am also concerned about the potential for first and fourth amendment erosion. law enforcement agencies could potentially use the system for the surveillance of individuals not involved in any suspicious activity whatsoever. moreover, federal law enforcement agencies with
3:11 pm
assistance from their state and local law enforcement partners have access to images of an overwhelming number of otherwise law-abiding citizens. state motor vehicle agencies possess high quality photographs of most citizens that can be used in facial recognition programs, and they could be combined with public surveillance. federal law enforcement agencies are also not limited to facial recognition technology systems they control or those controlled by other partners at the state and local level. federal law enforcement can use nongovernment facial recognition providers like vigilant solutions. law enforcement officers with a clearview ai account can upload a photo of an unknown individual. the system returns search results showing potential photos of the unknown individual as well as links to the sites where the photos were obtained. the use of the technology is largely unregulated and raises
3:12 pm
numerous concerns. the government could use facial recognition technology to monitor or target people exercising their first amendment rights, including freedom of association. facial recognition technology poses privacy concerns, and improper use can violate first and fourth amendment rights. consistent with these constitutional concerns, the gao issued a report that said not only had law enforcement agencies not assessed privacy and other risks, but some of them don't know which systems their agents are using. i think this is a real opportunity for us to gain some important information and knowledge today. i have enormous concerns that i will summarize. the technology here is problematic and inconsistent.
3:13 pm
it has not been perfected. images of law-abiding people have been acquired, and the security of the databases is an issue, as we've seen with recent breaches of databases. private companies are acquiring and scraping information and selling it to the feds; manipulation can take place. it expands the surveillance state we have been railing against. the gao raised important concerns. that's a brief summary of some of the things i'm concerned with. i think we can work to find common ground. if we are talking about nationalizing police forces, i don't think we will get there, but if we are finding some kind of meaningful regulation and oversight of the implementation of facial recognition technology, which is what the chair is alluding to, then i think we can find a lot of common ground here.
3:14 pm
with that, i would like to submit for the record a letter to the committee dated july 12, 2021 from matthew feeney of the cato institute. >> you want to finish the explanation? >> i also wish to submit an article from american military news from july of 2021 regarding the capitol police adoption of army surveillance gear to identify americans and recognize threats. i also wish to include several articles about the alaskan couple against whom facial recognition technology was erroneously used. without objection, so ordered. i thank the ranking member of the subcommittee for his presentation and his opening
3:15 pm
statement, and i do believe, as i indicated, we should be engaged in oversight and legislative response, and i would like to find common ground with our colleagues. i think this is important in terms of law enforcement and the utilization thereof but also for the constitutional principles in which we all believe. it's now my pleasure to recognize the distinguished chairman of the full committee, the gentleman from new york, chairman nadler, for his opening statement. >> thank you for convening this hearing. facial recognition technology is now able to capture images and analyze them in ways that would have been impossible only a decade ago. in some cases, the technology
3:16 pm
enables the real-time surveillance of individuals in a crowd. it has proliferated in a manner largely unchecked by congress. there is a tension we must address. this technology is now a commonplace fixture in our lives. on the other hand, most americans have little understanding of how the police use facial recognition technology to conduct surveillance of communities across the country, for better or for worse. thanks to the work of the government accountability office, we have our first glimpse into federal law enforcement's reliance on facial recognition technology. in its recent study, it found 20 of the 42 federal law enforcement agencies surveyed
3:17 pm
use some form of facial recognition technology. this report leaves many unanswered questions about how the technology is used, when it is used and what type of mechanisms are in place to guard against abuse or protect sensitive data. for example, of the 14 federal agencies, only one agency could provide complete information about what systems their employees are using in the field. we have received several disturbing reports of the misapplication of facial recognition technology by local police departments when attempting to match women and people of color. in some cases, facial recognition is deployed without proper consideration of due
3:18 pm
process rights. one witness today will testify to being misidentified under these circumstances and what effect it can have on an individual's life. he deserves better from the law enforcement agencies entrusted to use a technology we all know is less accurate when applied to citizens that look like him. facial recognition technology can be an important crime-fighting tool. like all such tools, we need to ensure that the adoption of this new technology does not further erode trust in law enforcement and the communities they serve and does not compound the unfairness of a criminal justice system that too often impacts people of color disproportionately. the subcommittee will ask the critical questions. this hearing will not answer every question about the adoption of facial recognition technology by law enforcement. my hope is that we will take another important step toward balancing the needs of law enforcement agencies and the priority of keeping
3:19 pm
americans safe from crime and the concerns about this technology expressed by communities across this country. our task is not an easy one. we want an america where every individual knows they and their family are safe from harm while their fundamental rights are protected. we want an america where law enforcement has the tools it needs to protect and serve, but with clear guidelines and rules that prevent those tools from being abused. understanding how to make that real in a moment where technology keeps changing and evolving will take a tremendous amount of work, but this is worth doing. i thank the chairwoman for calling this hearing. i look forward to hearing from our witnesses, especially
3:20 pm
bertram lee, who once worked in my office. it's good to see you and i look forward to your testimony. i thank all of the witnesses for taking the time to engage with us on this important topic. i look forward to the testimony and i yield back the balance of my time. >> i thank the chairman very much for his presentation. now it's my pleasure to recognize the distinguished ranking member, the gentleman from ohio, mr. jordan, for his opening statement. >> i look forward to today's hearing on facial recognition technology, and i hope we can approach this issue in a truly bipartisan manner. last congress, we worked closely with our colleagues on this issue. we had many vigorous debates; facial recognition technology was one issue where we shared common ground. we held three bipartisan hearings in the oversight committee on this technology. we learned many important facts about facial recognition technology. there are serious first
3:21 pm
amendment and fourth amendment concerns. there are also due process concerns about how law enforcement uses the technology. we learned that over 20 states' departments of motor vehicles have handed over their driver's license data to the federal government. no american citizen gave their consent to have their data and images turned over; no elected official voted to allow this to happen. every american should be troubled by this. i know i am, and i hope my colleagues on this committee are as well. the gao issued a new report about how federal law enforcement uses frt. it makes clear that federal law enforcement agencies have not even assessed the risks of using this technology. if they haven't assessed the risks, they are not taking them into consideration when using the technology. the gao surveyed federal law enforcement on their use of the
3:22 pm
technology, and 14 agencies that reported using the technology to support criminal investigations also reported using systems owned by nonfederal entities. only one agency had awareness of which nonfederal systems are used. not only are federal law enforcement agencies not assessing the risks of using facial recognition technology, most don't know which systems their employees are using. it's difficult for a federal agency to assess the risks of using a system if it does not know which system its employees are using. i look forward to learning more and listening to our witnesses. i yield back. >> i thank the gentleman for his opening statement. without objection, all other opening statements of our members will be included in the record. before i introduce today's witnesses, let me thank them for their participation.
3:23 pm
this is a virtual hearing that is no less crucial and important, and we thank you for participating in that manner through technology. the first witness, gretta goodwin, leads the gao's work in this area. her portfolio includes federal law enforcement oversight and training, civil liberties and civil rights, vulnerable populations, the federal prison system and the federal judiciary. director goodwin has a ph.d. and a master's degree in economics. glad to see there is a houston connection. professor barry friedman serves as the faculty director of the policing project at new york university school of law. he is a professor of law and an affiliated professor of politics.
3:24 pm
he has litigated and written about constitutional law, the federal courts, policing and criminal procedure for over 30 years and has produced numerous books and articles. he graduated with honors from the university of chicago and received his law degree from georgetown university law center. welcome. robert williams lives in farmington hills, michigan with his wife and two young daughters. he works as a logistics planner in the automotive industry. he was wrongfully arrested by the detroit police department in 2020 when facial recognition technology misidentified him. he has since received an apology, and we are grateful for the testimony you present
3:25 pm
today. bertram lee is a counsel for media and technology. he works to advance the interests of marginalized communities. his portfolio includes broadband access, surveillance technology and artificial intelligence. he previously worked as policy counsel at public knowledge and in the u.s. house committee on education and labor. he received his jd from howard university school of law. welcome. we also have a research fellow for the center for technology policy at the heritage foundation. her research focuses on big tech and tech policy. she was a fellow at the center for a new american security, served as a senior intelligence analyst for a u.s. naval command and helped create facebook's
3:26 pm
global security and counterterrorism program. she received her ma from king's college london. welcome. professor jennifer laurin serves as a professor at the university of texas school of law. she studies and writes about law and institutional design shaping the function of criminal justice institutions. she currently reports to the american bar association on their standards task force and is a former chair of the capital punishment assessment team. she received her undergraduate degree in politics and her jd from columbia law school. welcome to you. brett tolman is the founder of the tolman group and previously served as a shareholder at a law
3:27 pm
firm. previously, he also served as the u.s. attorney for the district of utah and was selected by the attorney general to serve as special advisor to the attorney general on national and international policy issues affecting united states attorneys and the department of justice. he also served as legal counsel to the united states senate judiciary committee. he received his ba and jd from brigham young university. welcome. dr. cedric alexander is a law enforcement expert with over 40 years of experience in public safety. he has appeared on national media networks to provide commentary on police and communities and has written numerous editorials and books on the subject. he previously served as deputy commissioner of a new york state division and as chief of police in dekalb
3:28 pm
county, georgia. he is a past national president of the national organization of black law enforcement executives, a powerful law enforcement organization. he was appointed to serve with president obama on the task force on 21st century policing. he received a ba from st. thomas university and a doctorate from wright state university. welcome. usually, i stand when i provide the oath, but we welcome our distinguished witnesses and thank them for their participation today. i will begin by swearing in our witnesses. if you would, please turn on your audio, make sure i can see your face, and raise your right hand while i administer the oath. please unmute and let me see your raised right hand.
3:29 pm
do you swear or affirm that the testimony you are about to give today is true and correct to the best of your knowledge, information and belief, so help you god? >> i do. >> thank you so very much. let the record show the witnesses have answered in the affirmative. thank you very much. each of your written statements will be introduced into the record in its entirety. i ask that you summarize your testimony in five minutes. there is a timer in the zoom view that should be visible on your screen. director goodwin, you may begin. welcome. you are recognized for five minutes. >> thank you. i am pleased to be here today to discuss federal law enforcement's use of facial recognition technology. use of this technology has
3:30 pm
expanded in recent years, and questions exist regarding the accuracy of the technology, the transparency in its usage and the protection of privacy and civil liberties where the technology is used. today, i will discuss the ownership and use of facial recognition technology by federal agencies that employ law enforcement officers, the types of activities they use the technology to support, and the tracking of the technology's use. we surveyed 42 federal agencies that employ law enforcement officers. 20 of them reported that they own or use a facial recognition technology system. 12 only used another entity's system, and five agencies owned a system and also used another entity's. agencies noted that some systems can include billions of photos.
3:31 pm
federal agencies reported using it for various activities such as criminal investigations, surveillance and managing business operations during the covid-19 pandemic. of the 20 agencies, 14 reported using it to support criminal investigations. that included investigations of violent crimes and identity fraud. looking closely at these 14 agencies, some of them told us they used the technology last summer to support criminal investigations related to civil unrest, riots or protests following the killing of mr. george floyd. in addition, a few of these agencies told us they used the technology on people involved in the u.s. capitol attack to
3:32 pm
generate leads for criminal investigations. of these 14 agencies, we found that 13 did not have complete, up-to-date information on what nonfederal systems are being used by their employees. agencies have not fully assessed the potential risks related to the privacy and accuracy of using these systems. to be clear, some agencies knew employees were using certain systems. agencies may have had formal agreements or contracts with private companies. however, many agencies did not answer our questions, and many agencies said they did not use the technology and later changed their answers. what these 13 agencies have in common is that they don't have a process to track what nonfederal systems employees are using. if the accuracy of facial recognition technology is reduced, a search
3:33 pm
will produce inaccurate results. if it's not sufficiently accurate, it can unnecessarily identify innocent people as investigative leads. it could also miss investigative leads that otherwise would have been reviewed. the technology could result in adverse consequences for individuals. we recommended the agencies implement a mechanism to track what nonfederal systems for facial recognition technology are used by employees, and assess the risks of using the systems, including accuracy-related risks. facial recognition technology capabilities are only going to get stronger. if law enforcement agencies do not know if and how their employees are using the technology, they cannot ensure that the
3:34 pm
appropriate protections are in place. this concludes my remarks. i am happy to answer any questions you may have. >> thank you for your testimony, and i want to recognize mr. barry friedman for five minutes. >> i want to thank you for giving me the opportunity to testify today. i am the founding director of the policing project at the new york university school of law. our mission relates directly to what you are doing today. we partner with all stakeholders, work with communities and work closely with law enforcement agencies. we have the goal of bringing democratic accountability, transparency and equity, and i
3:35 pm
want to stress two parts of that. on the issue of democratic accountability, one of my key points is that these technologies should not be used without proper democratic accountability. i commend the subcommittee for having this hearing to pursue something that's essential. we do work with all stakeholders. i was incredibly gratified to listen to all the members who spoke to the need and the ability to work in a bipartisan way on this issue. i believe there are strong, passionate feelings about these technologies, but there is a way forward, and the question is how to get there. the approach that we use at the policing project, particularly for emerging technologies, is one of cost-benefit analysis.
3:36 pm
you look at the benefits, and as many of you said, there are definitely benefits to this technology. many have indicated there are very serious costs and potential harms. there are racial harms from disparity, privacy harms, and the harm of giving too much power to government. there are concerns about the first amendment. some people look at cost-benefit analysis as weighing whether to do something or not. that's exactly the wrong approach. the value of cost-benefit analysis is that we can find a nuanced way to regulate that could maximize the benefits and minimize or mitigate the harms. i have spoken to many people over the last months and years about this issue. there are strong views on both sides, including people who think the technology should be banned. when i talk to law enforcement, or when i am talking to
3:37 pm
racial justice advocates or people in the technology companies themselves, even though they disagree about the use of the technology, there is a wide swath of agreement that dangerous and shoddy practices are not ok. i want to dive into the nuance of those practices. on issues of accuracy and bias, we heard a lot about that, and i think it's time, and i hope, that we stop saying the technology is 81% accurate. we should talk about false negatives and false positives. we don't know anything about the accuracy of this technology. that is because law enforcement uses it in a way
3:38 pm
different than the way it has been tested. law enforcement uses different probe images than those used in testing. law enforcement uses much larger databases, and the errors increase as the database gets larger. a lot of people say not to worry, there is a human in the loop, but i would appeal to your common sense. that can be a good thing or a bad thing. a human in the loop would be a good thing if an independent individual who had no idea what the algorithm suggested identified the same person. humans are problematic if they are just confirming what the algorithm told them.
3:39 pm
above all, we need testing and means of ensuring that if there is human interaction, that human works to increase the accuracy. i want to urge you to think seriously about regulating the vendors themselves. those are the folks who know how the technology works. they could work on what counts as an acceptable photo and the right settings to make sure we get accurate results. i want to thank you for listening to my testimony. >> thank you very much, professor friedman. i am now recognizing robert williams. we preside over law enforcement in this nation, and certainly we welcome your testimony. thank you. i recognize you for five minutes. >> ok. thank you for the opportunity to testify today. in january 2020, i got a call from my wife, and she told me that some detectives had called her and
3:40 pm
told her i needed to turn myself in. she said they would be calling me shortly because they didn't have my phone number. they called me and i answered, and i asked what it was about, and he told me he couldn't tell me. all he said was i needed to turn myself in. i assumed it was a crank call, so i hung up, called my wife and told her to disregard it. they thought maybe i was trying to hide something. i told her to call our local police department, and she did, and they said they had no warrants or anything and to disregard the call. i proceeded to drive home, and when i got home, there was a detroit police car parked on my street, and when i pulled into the driveway, they blocked me in
3:41 pm
as if i was going to make a run for it. i pulled up and he said, are you robert williams? i said, yes i am. he said, you're under arrest. i said, you can't just arrest me like that, what am i under arrest for? he said, don't worry about it, and proceeded to put the handcuffs on me. i told my wife and my kids, who were coming out of the house, don't worry about it, they are making a mistake, i will be right back. unfortunately, i was not right back. i had to spend 30 hours in jail. i kept asking why i was being arrested, and they said, we can't tell you that. they showed me the warrant, and i asked, how did you get it?
3:42 pm
they said the head detective got it. we were talking about things they had done and things they had gotten away with. i was wondering why i was there, and nobody ever came to tell me. the next day, when i went to court, i pleaded not guilty. they took a recess, and some detectives asked me some questions. they had me sign a paper saying that i would waive my right to counsel so that he could show me what was on the paper.
3:43 pm
he showed me a picture and asked when i was at the store the last time, and i said a couple of years ago. he says, is that you? i said, that's not me. and he said, i guess that's not you either. i said, i hope you don't think all black people look alike. he turns over another paper and said, i guess the computer got it wrong. i said, i guess so. am i free to go? he said, i'm sorry, but we are not the detectives on your case and we cannot let you go. they took me back to my cell. i received a bond and got out. my wife contacted the aclu, and we started getting information about the case, and we realized that it was a wrong facial recognition match.
3:44 pm
i've been fighting it ever since. i don't think it's right that my picture was used in some type of lineup when i have never been in trouble. i want to say thank you for the opportunity to share my testimony, and i hope that congress does something about this. i'm happy to take any questions you might have. >> a powerful statement, mr. williams, and you are tracking what i began with as i opened this hearing: that not only should we engage in oversight, but i think we should engage in the right kind of fix, and i am very apologetic to you and your family for having to go through this. our next witness is mr. bertram lee junior. you are recognized for five minutes. >> thank you for the opportunity
3:45 pm
to testify today. thank you for calling this hearing and shining a light on an undeniable threat: facial recognition technology poses a threat to black and brown people. we are at a turning point and must reject policies and practices rooted in discrimination, and move toward public safety that respects the humanity, dignity and human rights of all people. the leadership conference has spoken out against law enforcement use of facial recognition since 2016, highlighting its inherent bias. last month, the leadership conference, along with upturn and new america's open technology institute, released a statement signed
3:46 pm
by 40 advocacy groups stating that its use exacerbates the harm police do in communities and threatens individual and community privacy. it violates due process rights and otherwise infringes upon procedural justice, and it often relies on faceprints that have been obtained without consent. in addition to racial bias, the technology itself poses a disproportionate risk for black, asian and indigenous people. the possibility of misidentification poses a tremendous threat to the health, safety and well-being of communities. the reasons for these problems vary. in some cases, the cause of the bias was within the database where an image is searched, and in others it was due to the historical bias built into the algorithm itself. the outcome is the same: black people are disproportionately misidentified.
3:47 pm
it entrenches racial disparities in the criminal system. in a comparison of matches by country of origin, photos of people from east african countries had false positive rates 100 times higher than the baseline rate. researchers found facial analysis algorithms misclassified black women nearly 35% of the time while getting it right for white men. even if the accuracy were improved, the fundamental issue remains -- facial recognition technology dangerously expands the scope and power of law enforcement. combined with camera networks in different neighborhoods, algorithms could enable governments to track the public movement, habits and associations of all people at all times with the push of a button. according to a
3:48 pm
report from the georgetown center on privacy and technology, more than 133 million american adults are included in facial recognition networks across the country, and at least one in four state law enforcement departments can run facial recognition software through their systems. as the technology rapidly emerges in everyday police activity, safeguards to ensure the technology is used fairly and responsibly are virtually nonexistent. as congress reconsiders our nation's approach to policing and punishment, it must not accelerate law enforcement's ability to wield this powerful technology, especially when history provides little reason to think it would be used responsibly. we're safer when our communities are healthy and thriving. policies of mass criminalization and overpolicing are fueled by white supremacy. it's about social control, not public safety. the leadership conference asks congress to act now by imposing a moratorium on law enforcement use of facial recognition tools.
3:49 pm
families and communities will endure unspeakable harm and tragedy on our government's watch as long as we allow these policies to continue. time is of the essence, and the leadership conference looks forward to working with the subcommittee to address the serious concerns about this technology and work toward a new vision of justice that truly keeps communities safe. thank you. >> thank you for your testimony, and as well for that insightful constitutional analysis, which is important. i would like to recognize ms. kara frederick. >> thank you for the opportunity to testify today. in four years, a projected 6 billion people will have access
3:50 pm
to the internet, with 80 billion devices connected to the web. this creates profound digital surveillance opportunities, with facial recognition as a standard capability. some uses of facial recognition address legitimate safety concerns, like finding the capital gazette shooter and detecting individuals traveling without passports. but the potential for misuse is also high. the risks are manifold, like inaccurate algorithms. i want to focus on three risks in particular. the first is the circumscription of civil liberties. today, the bureau has access to over 640 million photos, in some cases from private companies. the second is the outsourcing of surveillance to unaccountable companies. they have a history of reckless privacy practices.
3:51 pm
government entities have long used private firms to identify patterns and useful information, and a renewed push to make use of outside expertise for domestic spying to counter domestic extremism gives cause for concern. such impulses to outsource surveillance on behalf of law enforcement lead to the third risk: the potential integration of facial recognition data with other personally identifiable information, creating the capability for mass surveillance. the technical capabilities are already here. multiple data sources can be aggregated to allow governments to look for patterns in citizen behavior. lower latency provides quick transmission of increased data flows. more compute power, along with tools that extract data, fit together in mutually reinforcing ways. this can engender a climate of
3:52 pm
fear, self-censorship and the chilling of free speech and the right to peaceably assemble in public spaces. authoritarian powers like china are on the cutting edge of using facial recognition, and the expansion of this power renders the slope slippery. once these powers expand, they almost never contract. the trend lines are foreboding. municipalities are using covid to extend surveillance. georgia became the first to use smart cameras. combined with near historically low levels of public trust, the unfettered deployment of these strategies will strain the health of the body politic without
3:53 pm
immediate and genuine safeguards. technology has long outpaced our efforts to govern it. congress needs a national data framework with appropriate standards of oversight. it needs to show how data is stored and shared by federal, state and local entities. there should be clear policies on data retention that categorize biometric data and limit interoperability to stymie mass surveillance. congress should ensure that u.s. identity management systems are secure and reliable, based on proper measures and standards. programmers should build in data privacy protections in the design phase, testing new methods of encryption or differential privacy. the debate over public safety and privacy trade-offs in our republic provides an opportunity for
3:54 pm
us to embed technology with privacy protection from the outset and shore up the system of checks and balances to address privacy infringements that occur. how we decide to use and safeguard these technologies either draws us closer to the republic we should be or to the authoritarian states we profess to abhor. i look forward to your questions. >> thank you for your testimony. professor laurin, you are recognized for five minutes. >> thank you very much for the opportunity to address the subcommittee on the topic of facial recognition technology. i will speak to a small sliver of this vast and complex topic that intersects with my own research on forensic science in
3:55 pm
criminal investigation and adjudication. the three bottom-line points are these. first, although facial recognition technology matches have not been held admissible in criminal cases, law enforcement use forms the basis of criminal convictions and incarceration at the federal and state level. second, while facial recognition has the capacity to be highly accurate under ideal conditions, there is great cause for concern about its accuracy when used in criminal investigations. third, the criminal legal system is not well designed to weed out erroneous uses. facial recognition technology leads to deprivations of liberty. courts do not currently admit facial recognition testimony or other evidence to establish that a person is the perpetrator of the crime. however, evidence of facial
3:56 pm
recognition matches has been relied on by courts in cases that are not governed by rules of admissibility, like bail hearings, and in sentencing to find facts that enhance a defendant's sentence. more critically, the overwhelming majority of criminal prosecutions begin and end with no presentation of evidence before a jury at all. i refer to the 98% of criminal convictions that are obtained through guilty pleas. in those cases, and given the low evidentiary threshold of probable cause, facial recognition matches can generate a criminal conviction without any courtroom testing of the evidence. moreover, even when facial recognition is used by law enforcement only to generate an initial lead, that match, given the stickiness of cognitive biases and the real potential for facial recognition matches to influence subsequent identifications, has the capacity to send investigators down an erroneous
3:57 pm
path that has enormous consequences for a wrongly accused individual like mr. williams. this would be of little concern if there were no reason to question the technology's accuracy as a means of identifying suspects, but as others have noted today, despite significant advances, significant concerns remain about facial recognition technology. as has already been discussed, the fact that there is significant variability among facial recognition vendors, paired with findings that vendors will rely on a variety of unknown sources, points to an urgent need for disclosure and perhaps restriction of what private systems are being used. aside from variability and inaccuracy across algorithms, law enforcement uses facial
3:58 pm
recognition to identify unknown suspects, and this raises special accuracy concerns because it is particularly likely that less than ideal images will be used. studies of facial recognition find low error rates only when using cooperative images; declines in accuracy are seen when using images not captured under ideal circumstances, precisely the types of images used in criminal investigations. finally, none of this might give rise to a need for intervention if the criminal legal system already weeded out inadequate uses of facial recognition. it does not. partly this is due to the fact that as long as facial recognition matches are used as investigative tools and not relied on as evidence, there is little opportunity for a defendant who was identified to challenge the practice in court. moreover, restrictive discovery, the rules governing the sharing of case information between the prosecution and defense, means that defendants are severely hobbled in litigating the use of facial recognition technology.
3:59 pm
defendants in federal criminal cases do not have access to the investigative file. they may not even learn that a facial recognition procedure was performed. moreover, the timing means little opportunity exists for the defense to vet the reliability of evidence before entering a plea. defense claims that they are constitutionally entitled to information about facial recognition under the holding of brady v. maryland have gained little or no traction in courts. in sum, facial recognition is an alluring forensic tool that has the potential to accurately open otherwise unavailable investigative avenues, but it also has the potential to inject a new source of error into criminal adjudication. troublingly, the federal criminal legal system is not well designed to smoke out questionable uses of frt through the ordinary operation of its processes.
4:00 pm
>> i recognize brett tolman for five minutes. >> thank you for the opportunity to testify today. my name is brett tolman, and i am the executive director of right on crime, a conservative organization dedicated to the promotion of criminal justice policies that promote public safety, individual liberty, and whole communities. i have previously served as the united states attorney for the district of utah, as an assistant united states attorney, and as chief counsel for crime and terrorism for the united states senate judiciary committee. many of the fundamental questions you want and deserve answers to are currently unanswerable. questions such as how many law enforcement entities use facial recognition, who is in their databases and why. to the extent that we have partial answers to some of these questions, the facts are daunting. recently, the
4:01 pm
government accountability office revealed that beyond those federal law enforcement agencies one might suspect of using facial recognition, like the federal bureau of investigation, there are a host of others with less apparent need who also use the technology. for example, the u.s. fish and wildlife service and the internal revenue service. nationally, one estimate placed the government market for facial recognition in this country at $136.9 million in 2018, with the expectation that it would nearly triple by 2025. billions of photos are posted by unsuspecting users. this is an unprecedented invasion of privacy that places enormous, undue control in the hands of the government and big tech, two entities not always known for their light touch or responsible use of power. to this perpetual web of surveillance, law enforcement can potentially add recordings from body-worn cameras or simply from the smartphone in an officer's pocket. in short, there are very few instances where law enforcement will not have the opportunity to subject a person of interest to facial recognition technology. inevitably, the first temptation will be to use facial recognition by default rather
4:02 pm
than necessity. instead of limiting the practice to serious public safety risks or only after due process has been afforded an individual, officers may use and have already used the practice for run-of-the-mill interactions and minor cases. our founding fathers deliberately and prudently enshrined in the bill of rights proscriptions on the wanton search of americans as a necessary bulwark for freedom. it is hard to square these notions and protections with the unfettered use of a technology that can instantaneously reveal an individual's identity as well as potentially their associations and prior movements. the unrestricted use of facial recognition bears little difference from allowing police to collect fingerprints or dna from anyone and everyone they come across, an abuse that we clearly do not, and should not, tolerate. furthermore, we cannot ignore the risk that facial recognition technology will be used to target certain americans. facial recognition can instantly strip away the anonymity of crowds. consider the chilling effect facial recognition could have if
4:03 pm
used to identify people at a political rally or government protest. we have little more than vague assurances that we can trust the government to use it without oversight. i was tasked with leading the effort to reauthorize the patriot act. we heard similar assurances years ago from those leading the department of justice and the fbi about fisa and those surveillance authorities not being turned against honest americans. we have seen how that worked out, as outlined in the recent, and disturbing, inspector general reports. none of this is to say that law enforcement should never have access to facial recognition technology. while china's unconscionable use of facial recognition technology to enhance and accelerate its undemocratic control of its citizenry is a warning of the costs of failure, i do not believe it is the inevitable consequence of any use of facial recognition technology by law enforcement. further, it is
4:04 pm
unrealistic to expect law enforcement officials to permanently deny themselves a tool that is increasingly prevalent in the commercial sector and which has such powerful capacity to improve public safety. it is easy to contrive a scenario in which we would very much want law enforcement to have this technology: searching for a known terrorist loose in one of our cities presenting an imminent risk to the public, or seeking to identify a mass-murder suspect, for example. however, acknowledging that there are credible uses for facial recognition technology and voicing support for law enforcement is not the same as writing a blank check for power and then looking the other way. if nothing else, we need a reset in which law enforcement agencies must gain affirmative permission from relevant democratically elected representatives to use this technology prior to its use; that way transparency, accountability, and proper use guidelines can be established. i look forward to answering your questions. >> we thank you for your testimony.
4:05 pm
it's my pleasure and privilege to recognize dr. cedric alexander for five minutes. >> i hail from the great state of florida, pensacola more specifically, so it is great to be here with you. i'm going to try not to be redundant, because maybe some of what i'm going to say has already been mentioned. in the time that has been allocated. let me start by saying this, on july 9, the portland, maine, press herald reported on two newly enacted laws that make maine the first us state to enact a broad ban on government use of facial recognition technology. in maine, state, county, and municipal governments will not be allowed to use any sort of frt.
4:06 pm
law enforcement may use the technology for investigating certain serious crimes, but state law enforcement agencies are barred from implementing their own frt systems. they may request frt searches from the fbi and the state bureau of motor vehicles in certain cases. a year earlier, in august 2020, the portland city council totally banned the use of facial recognition by the police. indeed, currently some 24 american cities ban police frt use, and many other organizations have called for a nationwide ban because of evidence that false positive identifications in the case of people of color exceed the rate for whites. so far, the federal government has enacted no laws on how law enforcement may or may not use facial recognition technology. the benefits of facial
4:07 pm
recognition systems for policing are quite evident. some have already been mentioned here. it is a fact that the technology can aid in the detection and prevention of crime. for example, facial recognition is effectively used when issuing identity documents and is usually combined with other long-accepted biometric technologies, such as fingerprints and iris scans. frt face matching is used at border checks to compare a traveler's face with the portrait on digitized biometric passports. frt, as we all know, was used to identify suspects in the january 6 violent breach of the u.s. capitol, and high-definition digital photography and videography, sometimes deployed from aerial drones, may be used increasingly to identify faces at mass events.
4:08 pm
currently, the fbi is the leading law enforcement agency using frt, and it has developed technologies and best practices to promote the intelligent use of the technology to reduce errors and protect constitutional rights. in addition, the facial identification scientific working group, operating under the national institute of standards and technology's organization of scientific area committees, is working to develop standards in frt. facial recognition technology has been useful in law enforcement and, i believe, will continue to develop technically and therefore become even more useful. blanket bans on frt in policing are unwarranted and deny to police agencies a tool that is an important aid to public safety.
4:09 pm
but make no mistake, there are urgent constitutional issues relating to privacy, protection from unreasonable searches, due process, and the presumption of innocence. especially concerning are false positive and false negative identification results. in addition, police do not always use frt correctly. for instance, on may 16, 2019, the washington post reported that some agencies used altered photos, forensic art sketches and even celebrity look-alikes for facial recognition searches. in one case, it was reported that new york police detectives believed a suspect looked like the actor woody harrelson, so they ran the actor's image through a search and arrested a man the system had suggested
4:10 pm
might be a match. forgive me for being blunt, but that was pretty egregious, quite frankly. using artificial technology to confer upon a highly subjective digital impression a halo of digital certainty is neither fact-based nor just, but under federal law, at least, it is not illegal. it is not illegal for the simple reason -- i'm going to stop here and will be more than glad to answer any questions that you may have, particularly from a very practical perspective, as a former chief in two major cities in this country and a former federal security director for the u.s. department of homeland security in dallas, texas, for 5.5 years. so i understand the importance of this technology, but i also understand how it can be easily abused if those using it are not properly
4:11 pm
trained and certified, and if best practices are not evident. thank you, madam chair. >> thank you to all the witnesses. we obviously have opened up, to use a parochial and often used nonlegal term, a can of worms that we need to address, and we will now be able to proceed with members' questions. i am grateful for the attendance of all members, both on the minority side and on the majority, who will now proceed under the five-minute rule of questions. i will now begin by recognizing myself for five minutes. director goodwin, you know of the 2019 gao report indicating that facial recognition algorithms vary in
4:12 pm
accuracy in a real way by race, ethnicity, or country of origin as well as gender. what are some of the concerns related to facial recognition system accuracy? >> thank you. a number of concerns have been raised in our report concerning accuracy, and i will focus on one. think about a person's face. it is distinct, and it typically does not change. if a person's face, if that information, is breached or hacked, that is totally different from your password being breached. you can change your password, but you cannot change your face. in the report, we noted this concern. misidentification based on a facial image is of great concern. that is in our report. >> thank you. mr. williams, you were wrongfully arrested in front of
4:13 pm
your wife and daughter, and detained for hours because of facial recognition. your story is stunning: the detective holding up a picture alongside your face, your comment about whether all black people look alike, when you were allegedly a suspect in an investigation. prosecutors dropped the charges and the detroit police chief apologized for sloppy detective work. tell us what impact your detention had on you and your family. you're a gentleman with employment and family. how did that impact you? >> thank you. the first thing, when i came home, on the drive home, my wife was saying how she had to figure out how to get in contact with the detention center because it kept hanging up on her and telling her they couldn't answer any questions because they didn't have any information, just that i
4:14 pm
was there, and that's all they could tell her. when i got home, i asked my wife what was going on, and my daughter had taken pictures of us and turned them around because she said she could not bear to look at me while i was in jail. i had to go talk to a five-year-old, at the time, about a mistake that had been made by the police, and it is hard to make her understand that sometimes they make mistakes. you tell her as a child that the police are your friends, and if you ever get in trouble, this is who you go to and this is who is going to help. we thought about taking her to a psychiatrist because when we replay the video of what happened, she cannot stand to
4:15 pm
watch it. she gets emotional. it is understandable, and i have to go through that. it's painful. thank you. >> thank you. professor friedman, my time is short and i have one or two other questions. what are the privacy implications and risks to communities if a private company has access to details of law enforcement investigations? you might add your thoughts about a legislative fix, about how we address this question. professor friedman? >> that was an incredibly moving story. i have kids, and i can only imagine. the problem, as i think mr. alexander pointed
4:16 pm
out, is that there is no regulation. we are going to have mistakes, all kinds of trouble. there are uses of the technology for law enforcement, but we need a full set of regulations. i set out a bunch of them in my testimony. i think you could do a lot of this by regulating the vendors directly. you could set a number of standards, including that the companies themselves have to regulate and identify what are effective images; the size of the databases; the thresholds to avoid false positives or false negatives; and that the technology is self-auditing so we know how it was used by law enforcement. there are a lot of things that can be done. >> thank you. dr. alexander, what are your suggestions on how law enforcement authorities can reduce, to use the phraseology that the detroit officials used, sloppy detective work
4:17 pm
without greatly reducing investigative efforts. dr. alexander? >> thank you, madam chair. having been a former chief, that would suggest to me that i have a larger systemic issue in the way investigations are conducted and in the people i've interviewed. here is the issue for me. the technology is one we use every day. we pick up our cell phones, we use them to access calls or text messages, whatever the case may happen to be. the biggest issue here for me is, and will continue to be, when you have victims such as mr. williams, people who are innocently going along their way, because of a failure of ability on our part in policing, the inability to train people appropriately. they have to be trained, there have to be certifications, there have to be best practices. there needs to be certification
4:18 pm
i believe at a state level and a federal level, and it has to be ongoing, because one thing that we know currently is that we don't train enough. when you're utilizing that type of technology, which can easily infringe on someone's fourth amendment rights, i think we have to be very careful. we also have to remember where we came from, even in dna technology. we have seen over the years, particularly in the past, dna labs shut down because of people's inability to carry out that professional function based on certification and accuracy. we don't want to find ourselves using this technology doing the same things that we have done in the past, because what it will continue to do is drive the wedge even further apart between good policing and communities
4:19 pm
across the country, and that is not a road we want to continue to go down. >> thank you. my time has expired. i know that there are many other questions that i have interest in. let me yield to the ranking member of the subcommittee for five minutes. >> thank you. ms. goodwin, thank you for being here today. in your most recent report, you found that 13 of the 14 agencies you surveyed that reported using nonfederal systems did not have complete, up-to-date information about what systems their employees were using. is that a correct interpretation? >> yes. >> even one of the agencies you worked with had to poll its employees to see what systems they were using. >> that is correct.
4:20 pm
one of the agencies initially told us that they did not have the technology. when they polled their employees, they found out it had been used in over 1000 searches. >> if agency leadership does not even know what systems their employees are using, how can they be sure they're not being misused and that the constitutional rights of american citizens are not being violated? >> this speaks to the heart of the recommendation that we made: that agencies have an awareness of what systems they are using, particularly the nonfederal systems their employees are using. given that awareness, they can then begin to ensure that appropriate protections are in place related to privacy and accuracy. >> ms. frederick, we know that agencies are using private services for searches and they are collecting those images.
4:21 pm
how are those images acquired? >> it depends on the private company. a lot of these companies claim to scrape the internet and social media sites, where people are uploading these photos after clicking yes to the terms of use of these private companies. people are not consenting to these images being used by law enforcement, but by using these contractors, agencies are using your regular photos of your grandmother, your family, your sisters, your daughters. it is pretty much a government-infused wild west out there.
4:22 pm
>> mr. tolman, these private systems, let's leave out federal systems or law enforcement systems for a moment, what steps are they taking to secure their systems? >> the reality is, they are more interested in the dollars that are behind this technology than they are in securing civil liberties. that will always be the case when private companies are contracting and utilizing a powerful technology. their interest is going to be driven by their bottom line. >> these agencies, like federal agencies that are using private services, is it just easier for them to contract out than to develop their own system, i assume. >> yes, the capability is hundreds of millions of photos versus what they would be able
4:23 pm
to gather themselves with limited staff and resources. >> how are we going to ensure that americans are not being targeted by federal agencies using this facial recognition technology? what are your policy prescriptions? >> there is going to have to be a complete shift, meaning the legislature is going to have to enact regulations that require any agency, whether they are contracting or not with big tech or other companies, to have policies that allow for the proper legal use of this. i prosecuted the kidnapper of elizabeth smart. i imagine during that case, when i was a prosecutor, we would have welcomed law enforcement's ability to use this to try to find her. that is a far cry from downloading from social
4:24 pm
media what you can and then targeting others without legal justification or without regulations or policies that have been approved by congress. >> mr. lee, i'm going to ask you the same question. what policy prescriptions do you advocate so that we can ensure americans are not being targeted by federal agencies or other law enforcement agencies? >> thank you for the question. we support a ban or moratorium because the communities where facial recognition technologies are being used are being overpoliced. facial recognition technology is disproportionately inaccurate for black and brown communities. the community has not been involved in these processes. congress has not engaged in legislation on these topics. communities have not been involved in highlighting what
4:25 pm
they think policing should be, specifically within the context of facial recognition technology. >> thank you. my time is up. >> thank you. i now recognize mr. nadler, the chairman of the committee. >> thank you. in your report, you found 15 agencies using systems owned by another federal entity, and 14 reported using systems owned by state, local, or tribal entities, including the fbi. can you describe how the fbi uses these systems to expand facial recognition search capacity? >> thank you for that question. i will briefly talk about how the fbi used the technology around the events of january 6. the fbi has a media tip line.
4:26 pm
therefore people -- they asked for people to submit images. they then inserted those into a frt system to get potential matches. we did a report a few years ago on fbi specific use of the technology and we had issued a number of recommendations that spoke to them complying with the privacy impact assessment and another system. the fbi ultimately -- they agreed with our recommendations and they implemented the recommendations. i think it was dr. alexander had mentioned that the fbi has done a number of things to ensure the accuracy of the frt systems they are using has improved.
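the tip-line workflow just described -- submitted images run against a gallery to produce candidate matches -- can be sketched in rough outline. this is an illustrative sketch only, not the fbi's actual system: the embedding function, the similarity threshold, and the gallery structure are all assumptions.

```python
import numpy as np

def face_embedding(image: np.ndarray) -> np.ndarray:
    # Stand-in for a trained face-recognition model that maps an
    # image to a unit-length feature vector (hypothetical).
    vec = np.resize(image.astype(float).ravel(), 128)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def candidate_matches(probe, gallery, threshold=0.9):
    # Compare a probe image against every gallery entry by cosine
    # similarity; anything above the threshold is returned as an
    # investigative lead, not an identification.
    p = face_embedding(probe)
    scored = [(person_id, float(np.dot(p, face_embedding(img))))
              for person_id, img in gallery.items()]
    return sorted([s for s in scored if s[1] >= threshold],
                  key=lambda s: -s[1])
```

even in a toy version like this, the output is a ranked list of candidates with similarity scores; the dispute in the hearing is over what happens next -- whether a top-ranked candidate is treated as a lead to be independently corroborated or, wrongly, as probable cause by itself.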
4:27 pm
>> also, the report described how agencies use nongovernment systems. what are the risks associated with that? >> as i mentioned earlier, the privacy impact assessment and the system of records notice, those are two standards that typically, if a government agency is using its own system, it has to follow. the concern we mentioned is that if an agency is relying on a nonfederal entity for facial recognition technology, that nonfederal entity may or may not have to comply with the recommendations and the guidelines laid out. that is the concern. that speaks to how the images are being used.
4:28 pm
the other speaks to the fact that if there is a change in the technology, that has to be made public and available. >> professor laurin, for defendants matched to a suspect photo by this technology, what opportunity is there to challenge the use of the facial recognition technology that led to the arrest? >> in many instances today, the opportunity to do that is quite limited. practically, it is nonexistent, to the extent that the practice has been for law enforcement agencies to use the facial recognition match as a lead that is corroborated through other evidence. the rest of the corroboration ends up not being independent corroboration.
4:29 pm
for example, in a case in florida, a match was made based on an image in a store, and a police officer who knew that the match had been made looked at the image himself and said these guys look alike. already knowing that the computer had assisted a match, he doubled down on it. which is not independent corroboration. the courts said the defendant didn't have the right to challenge the facial recognition match when the government isn't relying on it in court, but that misses the fact that the match could taint what is showing up in court. >> facial recognition systems have been found to be less effective in detecting people of color and women. as we heard from mr. williams, he experienced this flaw in real time when he was misidentified despite his photo not resembling the suspect. can facial recognition technology be incorporated fairly into criminal
4:30 pm
investigative and public safety strategies despite this flaw in the algorithms? >> first of all, it is a pleasure to see you again, especially under the circumstances. secondly, the leadership conference does not believe it can. currently, the issues of accuracy alone would hold that a ban or moratorium would be the most appropriate course. in addition, these technologies are being used in communities that are already being over policed and over surveilled. there is no way to get around the accuracy and over-surveillance issues that come with law enforcement use of facial recognition technology. >> mr. lee? >> chair, i want to echo the grave concern over technology that has these
4:31 pm
demographic differentials. i have said in the past that these systems should not be used unless that can be addressed. i join mr. tolman. there may be ways to deal with the demographic disparities if we understand what they are, but we don't have that testing, so we have learning to do. we have to think about what it means to have a human in the loop. many people offer that up as a way to ensure things don't go wrong, but as mr. williams's story demonstrates, humans don't always do that. it might be possible, but we have not made that happen yet. that is why we so desperately need regulation. >> before i yield back, i ask unanimous consent to place the following into the record: a statement from over 40 civil rights groups, a letter from the security industry association, and individual statements from data for black lives, the electronic frontier foundation, and the project on government oversight. >> without objection, so
4:32 pm
ordered. >> mr. chairman, you're recognized for five minutes. >> i would like to lead off by thanking our ranking member, mr. jordan, for allowing me to go in front of him. i have a noon speech and it will take me a while to get there, so i appreciate him switching spots with me. in addition to being a senior member of the judiciary committee for quite a few years, i also serve on the foreign affairs committee, where i am the ranking member and past chair of the asia and pacific subcommittee, and in that capacity republicans and democrats have been very closely watching, observing, and monitoring what the prc and chinese communist party are up to and
4:33 pm
the threat that they pose to americans and their own citizens. it is well known, and i think it was mentioned earlier and noted in the statements of a few witnesses, that the chinese communist party uses a vast civilian surveillance network to track nearly every movement of every single person, and there are 1.4 billion people who they are tracking. it's a pretty extensive effort that they have undertaken. for example, they force their citizens to download apps that monitor them. they track social media posts to look for what they consider to be negative content. they stop people at random to check their devices for apps deemed to be dangerous to the state. they use facial recognition
4:34 pm
technology similar to what we are discussing today to pick individuals they are looking for out of crowds. all of this is in order to build a so-called great firewall between china and the rest of the world, so they can keep their people under control and gather personal data on their citizens, which can be flagged to torment the people of china, individuals they believe are going to be unhelpful to the regime. i would like to ask ms. frederick and mr. tolman, would you care to comment on china's use of facial recognition and other technologies to monitor their citizens, and how the ccp is using facial recognition technology to oppress individuals and various groups,
4:35 pm
the uighur community comes to mind because they have been in the news -- not nearly enough of late when you consider that a million people are detained there. are there lessons that we should learn from this? i am not suggesting that our government has anything like this in mind, but i would like to hear from you if we could. >> thank you. i will begin if you don't mind. yes, we have a system in the united states that matters: sufficient rule of law protections, an open and engaged citizenry, and a free press that act as guardrails, for now, against the propagation of technology for political ends anywhere near like china's. it is important to look to them. they are the leading edge of using these technologies for internal control and stability.
4:36 pm
they have paved a path for ethnic targeting in the western region, where 1.5 million people are imprisoned. they use something that human rights watch has reverse engineered: the integrated joint operations platform. they take all of this data and they fuse it. it is license plate readers, devices that pick up sim card data, biometric iris scans, the digital exhaust that we are constantly emitting. they put it all together and they use this information, they use new technology, artificial intelligence -- this is one of the first places they started using it for political identification of minorities. what i see coming down the pike
4:37 pm
is the use of dna genotyping. china is already trying to do this. they are trying to use an individual's dna to predict what their face will look like. they are experimenting with voice recognition technologies, and the idea is to combine all of this digitally, to use it as basically an iron hand of digital control. they have the technology to do so. we have seen in hong kong the diminishing of freedom. how does a democracy die? in full view. china is using technology in order to enact it. russia did the same thing when it came to the protests in january. all these things are something to keep in mind. as the new york times said, and something i would leave you with, what china is trying to do is totalizing
4:38 pm
control. they're trying to link digital reality with physical reality, and the chasm that used to exist between them is closing. china wants them to be completely interconnected. we have to guard against that here in america. >> madam chair, my time is expired. perhaps mr. tolman could respond in writing so i don't hold the committee up at this time. >> thank you for yielding back. i now recognize the gentlelady from california for 5 minutes. >> thank you. let me thank the ranking member and our witnesses for being with us today. i wanted to know if any of the witnesses could speak to distinguishing between using facial recognition to investigate versus facial
4:39 pm
recognition for arrest? is it more productive to use it that way, versus all of the complications of using it for arrest? it is very well known, and has been said by numerous panelists and members, that facial recognition poses challenges for people of color. i am not sure, unless i missed it -- does it work better if the subject is white? how does it perform? mr. alexander? >> i will try to respond to the first part of your question, whether the technology should be utilized with the idea of making an arrest based on the identification itself, or whether it can be used as an investigative tool. when i think about it as an investigative tool, if a person's image happens to be a match or similar match and we
4:40 pm
are working with zero leads in the case, it may prove to be of some value. the thing we have to be very careful of, and i think someone mentioned it earlier and articulated it better than i can, is that we have to make sure that we don't cognitively approach that individual already with our implicit bias in mind, thinking that they are the exact person we are looking for. do you follow what i'm saying? we have to be very careful with that. i was here two years ago for another hearing, under the leadership of congressman cummings, having these conversations, and i recollect clearly there was a woman there, a researcher at m.i.t., who spoke eloquently around these issues and was able to provide
4:41 pm
insight in terms of the depth of this technology, and how it does not work to the advantage of people of color and women. there is an overwhelming amount of evidence, and i think there are those on this panel who can talk further about this, that in the case of white males there was the least likelihood of misidentification. certainly for those of color and women, it tends to be more problematic. >> you said least likely, meaning that it is more accurate with white men? >> yes, it is more accurate with white men. >> maybe someone else could respond to that. what about that, in terms of accuracy and also its use as an investigative tool? in the george floyd act, we say
4:42 pm
that there should not be federal resources for facial recognition. >> i was just going to say, about the question you had previously asked, that it was supposed to be used as an investigative tool. instead, they came and arrested me and did not even tell me anything. i don't even live in detroit. detroit police came to my house and carted me off. i don't have a lot to say about that. all i am saying is, the way it is being used, they are not doing any due diligence at all. like the chief said, it was shoddy investigative work.
4:43 pm
>> to me, you are an example of exactly what i was saying should not happen. they used it for your arrest. clearly that is problematic, but i was referring to using it to investigate, not in real time for an arrest. mr. friedman, do you have anything you would like to say? >> i think there are things that this body could do regulatorily: the kind of testing that reflects what is happening in law enforcement use, rather than a set of abstract possibilities, for both accuracy and bias. >> thank you.
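the demographic accuracy gap raised throughout this exchange is usually quantified, in nist-style evaluations, as a false match rate computed separately per demographic group. a minimal sketch of that bookkeeping follows; the group labels and trial counts are invented for illustration, not measurements of any real system.

```python
def false_match_rate(trials):
    # trials: list of (declared_match, same_person) pairs for one group.
    # FMR = fraction of different-person comparisons wrongly declared
    # a match by the system.
    impostors = [declared for declared, same in trials if not same]
    return sum(impostors) / len(impostors) if impostors else 0.0

# Invented illustrative counts: 100 different-person comparisons per
# group, with 1 vs. 10 wrongly declared matches.
trials_by_group = {
    "group_a": [(True, False)] * 1 + [(False, False)] * 99,
    "group_b": [(True, False)] * 10 + [(False, False)] * 90,
}

rates = {g: false_match_rate(t) for g, t in trials_by_group.items()}
disparity = rates["group_b"] / rates["group_a"]
```

with these made-up numbers, group_b's false match rate is ten times group_a's; nist's 2019 demographic-effects testing reported false positive differentials of roughly that order, and larger, for some algorithms and groups, which is what the witnesses mean by "demographic differentials."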
4:44 pm
>> thank you. mr. ranking member, i am moved to say it appears like unreasonable search and seizure under the 4th amendment. i will not take up your time. i am delighted to recognize mr. jordan for 5 minutes. >> thank you, madam chair. let me go to mr. tolman. how often do we get the wrong person? we heard mr. williams tell his compelling story of how they were wrong and he was arrested and detained for 30 hours. i know of one family, a story of a couple in alaska who were not in the capitol on january 6, but they were handcuffed at gunpoint during the interrogation. it was all based on facial recognition. how often do we get the wrong person? >> one time can be too many, but some estimate it is as high as 30%.
4:45 pm
i actually believe it is more than that. i have represented two individuals, and i am fairly certain -- we don't know about the ones who aren't publicized or coming forward, but i have a client who was accused of going inside the capitol. he was not in there. i believe it was facial recognition technology that put him in the law enforcement investigation. we had to work backwards to show that he was not there. i think it happens more often than we know, because it is not being reported. >> mr. lee, he said 30%, and we know that it has a disproportionate effect on african-americans. do you think it is a higher number? is it probable that 30% of the time this technology gets
4:46 pm
the wrong person? >> thank you for the question. we don't know. that is part of the problem. these technologies lack the requisite transparency and oversight. cases like mr. williams's are terrifying. what about the cases on the edge that we don't know about right now? we do need more transparency and oversight, specifically for facial recognition technology and its use by law enforcement. >> thank you. i will yield to our ranking member. i am concerned in a general sense about the overall surveillance state and individual americans' fundamental liberties and fundamental rights. when you add on top of that that it is wrong so often -- to me, this is why, if we can't come together in a bipartisan fashion, and we have been trying for a couple of years, if we can't do that and find a way to limit this, particularly when we have these agencies using the
4:47 pm
technology -- they don't even know what technology they are using and whether it is private or what have you -- there's a host of problems here. i appreciate the chairman having this hearing, and i'm going to yield the remaining two minutes to the ranking member. >> thank you. peachtree corners in georgia used ai-driven smart cameras to monitor social distancing during covid-19. with current technology, what prevents federal agencies or state and local governments from using facial recognition not just for identification, but to track the movement and behavior of law-abiding citizens? >> i hear that's what's coming down the pike. right now, there are few specific transparency measures and only lukewarm strictures on some of these uses. ms. goodwin talked well about the
4:48 pm
recommendations; those are great starting points. at this point, there are no hard constraints on the expansion of the remit for these cameras. "public safety" could encompass all sorts of things, and clearly certain municipalities are taking advantage of that to interpret public safety as they wish. unless federal entities are required to submit those reports, which don't yet exist in a rigorous fashion, i fear many official entities will be allowed to interpret the use of facial recognition and digital surveillance in any way they can. i will say quickly, from when i was at facebook, that the ethos of these private companies shipping these products is that they want to ship them as fast as they can and solve problems in later
4:49 pm
iterations. >> i need to ask one more question. i want to talk to professor laurin with regard to the question of investigation versus arrest. what has come to my mind is this notion of tainted photo lineups, tainted identification being used in an investigation as the predicate for making an arrest. did you want to elaborate on that? >> thank you for the opportunity. i will say a bit more: i do think it is possible in theory to draw a distinction between the use of the technology as an investigative tool versus forming the basis for an arrest -- or, put differently, a facial recognition match providing probable cause.
4:50 pm
from a regulatory standpoint, one needs to take care about what strictures must be put in place to create a meaningful distinction between those two things. for example, there are a number of agencies that say facial recognition itself cannot be the basis for probable cause, but there are very few if any strictures around what further investigation is required to build an independent case. are there, for example, blinding procedures so that individuals conducting a subsequent lineup don't know the facial recognition initially matched the individual? if not, that is not a particularly meaningful independent check. the last thing is that there is an important intersection here between facial recognition technology and what we already know about the fallibility of eyewitness identification. in over 75% of known dna exonerations, there were eyewitness identifications now
4:51 pm
known to be wrong. that is an investigative tool that is itself already quite fragile. when used as a corroborating piece of evidence alongside a fragile facial identification, that should give reason for concern as well, thinking about what strictures need to be in place to create a meaningful check on the investigative use. >> my time is long expired, thank you. >> thank you very much. it is an issue that warrants a lot of involvement, and we thank our members. i am delighted to recognize the gentlelady from florida for 5 minutes. >> good morning to all of you.
4:52 pm
thank you for this very important discussion. thank you to all of the witnesses for being here. it is good to have a fellow floridian on the panel. as a former police executive, i am always interested in new technologies that can help to keep the public safe, help to keep law enforcement safe, and improve the delivery of police service. we know that we have work to do in all of those areas. it was a decade ago that we were testing body cameras at the orlando police department. while we were not sure of the road ahead, i know the public was very excited about that possibility. certainly it is one of the recommendations that is coming to the forefront now. we should always seek technologies that will enable us to better do our jobs.
4:53 pm
dr. alexander, you said that technology has long outpaced the chance to govern it. that is such a powerful statement, because not only does it appear that the use of facial recognition outpaced our ability to govern it, we see that with other technologies as well. dr. alexander, you talked about best practices. until we can find ourselves at a point where we can keep up through regulation and other things, can you expound a little bit on some of those best practices? >> thank you very much. i think it is very important for us to recognize that technology does move along quickly. if we think about the development and implementation of technology over the last 10
4:54 pm
to 20 years, or even the last 5 years, it oftentimes outpaces us as human beings. we are not prepared oftentimes for the unintended consequences that come with the technologies we are all utilizing every day. we have to develop best practices. what are some of those best practices? i think we can only clearly define them once we make some determination about the real legitimacy of this technology. it can be innocently used to unlock our cell phones and conduct our personal business. when it comes to the constitutional rights of american citizens across this country, regardless of where they are or what neighborhoods they are in, it becomes more problematic. can we use it as an investigative tool? you know as well as i do, we might embark upon an
4:55 pm
investigation when we have zero leads. maybe we have a piece of this facial recognition that suggests: could that be cedric alexander? what we can't do is what happened to mr. williams. what we can do is look into the background of cedric alexander further, explore some other things we hadn't thought about before we even had a suspect or person of interest at all. the problem lies there. as i have already stated, sometimes we can identify an individual who may be a likely person of interest or suspect, but if our own implicit biases get in the way, they might overwhelm the entire investigation, and it leads us down this dark path where now
4:56 pm
we are doing injury and hurt to people. here is the thing i would mention to all of you in congress. as we go through this place in america right now, around police reform, one thing we do not need is a piece of technology that is going to continue to drive a separation between police and community. people want good police. we all want great technology. the idea around this technology is great; it's well-founded. but it has to become more accurate. it has to have the type of legitimacy, the trust, that is needed in order for the american people to feel like it is not going to infringe upon their rights and hurt people in any type of way. i sit on the national innocence
4:57 pm
project board. i understand the importance of dna technology and how it is used to free individuals who never should have been incarcerated, like mr. williams. we have to do a better job in advancing this technology, while understanding that sometimes the technology moves along faster than we do as human beings. we have a lot of catching up to do around many of the social challenges we have in our great nation. i hope in some way that addresses your question. >> it did. thank you so much again. thank you to all of our witnesses. with that, i yield back. >> thank you very much. we now recognize the next speaker for five minutes. >> thank you, madam chairman. it really warms my heart to hear witnesses from both sides having so much good input.
4:58 pm
it is my observation from the bush years that the department of justice was able to convince republicans, a majority of them, that we didn't need any reform right now, and really pushed back against some reforms we should have made. i noticed resistance like that. i know there was a lot of pressure on the obama administration, on the democrats that were in charge, not to do some reform that was needed. so when i hear the testimony and the questions, and see something so serious handled in such a great bipartisan way, it really does warm my heart.
4:59 pm
great comments and great questions. dr. alexander, i note you are careful to make sure that you note this technology can be helpful, and it gets me thinking about our fbi. you think about the technology on recording statements -- instead of the fbi having their agents record statements, audio or video, they write down their version of what the witness says. do you think the technology has gone too fast for the fbi, that maybe we can get them to start recording with audio, at least, maybe video? i am being sarcastic, but that is an area of concern for me. if you combine facial recognition mistakes with the fbi's practice of only writing down their version of what the witness says -- if you combine flaws like that, we are looking at serious problems at the federal level. locally, i am sure that the law
5:00 pm
enforcement agencies you oversaw recorded statements, so that would not be such a problem. but can you see regulations, dr. alexander, that would help us confine facial recognition just to the investigative stage? as a former judge, you know, sometimes law enforcement will come in and go, judge, this is an affidavit -- but if you did not swear to it, i do not want to hear it. that discrimination between real sworn evidence and unsworn evidence is not always made by judges. do you see us actually getting teeth into that? >> thank you for that question.
5:01 pm
let me say, i think the fbi has the ability, the training, and also the fidelity of a federal law enforcement agency that is well trusted in everything they have pursued, particularly around technology. they really stood out on technology if you go back 100 years, in terms of the work they did back then and the work they do today, forensically. they do have great capability of being able to access that type of technology and use it in a way that is standardized, one in which they really are the leaders in its utilization, because they use this technology to go after foreign actors who want to do harm to this country or who utilize these events domestically.
5:02 pm
>> my time is short, but i am very concerned about some of their abuses, when we see the fisa court that meets in secret and there do not seem to be checks. like i say, as a former judge, i am very offended that some judges have not been outraged at things like mr. williams's treatment. i think that is a problem, when the judiciary is not upset enough to actually take action and bring law enforcement into line. but let me ask mr. tolman, do you see a way to get specific regulations that would discern between investigation and actually going to court, getting a warrant, and picking up somebody like mr. williams? >> i think that you could
5:03 pm
implement things like not allowing facial recognition technology to launch an investigation -- they would have to have an independent basis to launch the investigation. then, if it went to trial, you could utilize the facial recognition technology as additional evidence, but you cannot have it the other way around, because then you have concerns that the evidence corroborating the facial recognition technology is influenced by the subjectivity of the investigators. >> thank you. my time is expired. thank you. this is an opportunity for us to move forward. >> i will recognize the gentlelady from georgia for 5 minutes. >> thank you so much. i really appreciate you today.
5:04 pm
and thank you to each and every one of you today. thank you so much for your timely resources, information, and expertise. i want to start with a quick question. dr. alexander, looking at how we might regulate technology, how can we build trust between law enforcement and the communities that they serve, especially communities of color? >> fundamentally, that is where a lot of the issue lies, because currently, in this environment we are in -- and this is not new, it has just been exacerbated over the last several years -- there's a great deal of mistrust between police and community. you are already dealing with that in the current context of where we are right now in this country, but as we continue to introduce new
5:05 pm
technology and ways of doing things that are questionable, such as facial recognition, if we do not do it right and we end up with continued victims like mr. williams, we continue to create that separation where legitimacy around policing continues to be questioned. this is the fight i find myself fighting every day in the roles i play now, really trying to deal with these relationships, but it certainly has its challenges associated with it. we have to get the technology right, because if we don't, it will not help the cause of building those relationships. if we do not have that collaboration between policing and community, i do not care where it is, we will continue to fight ourselves here, and we will continue to be at risk. >> thank you so much, and stay in the fight. professor, your testimony said the rules of discovery, the sharing of information between the prosecution and defense, are
5:06 pm
restrictive, so prosecutors do not have to mention which technologies they may have used to investigate, and the courts do not get much opportunity to advance the law by sorting out what are acceptable and unacceptable uses of this technology. could different discovery rules help with this kind of problem? >> thank you so much for the question. i think the short answer is yes. and just at the risk of going a little bit into the weeds, one thing to understand about criminal discovery in the u.s., just like criminal law generally, is that there is tremendous diversity of systems. we have at least 51 criminal justice systems in the u.s., given the states and the federal government. if you look across those systems, federal criminal discovery is among the most restrictive. rule 16 is limited in terms of
5:07 pm
what information the government is required to disclose to the defense. there are a number of states that have gone quite in the opposite direction, new york most recently, and my own state of texas has moved to an open file discovery system. under that system, the defense typically would have access to essentially the entirety of the nonprivileged investigative file and would know the steps taken by the investigators, regardless of whether the prosecution was going to present that evidence and regardless of whether it is favorable to the defense. in addition to the points mr. tolman was making about regulations you would want to have on how evidence can be used, and thinking in particular about the federal system, opening discovery more with regard to facial recognition technology is really an essential part of achieving transparency and really vetting the reliability of this.
5:08 pm
>> thank you. professor friedman, should congress regulate private and government-owned systems? i know that the report shows, as others have noted, that facial recognition systems owned by the government, as well as those owned by private companies and used by government actors, are not always regulated. does ownership play a role in how these systems are used? >> thank you for the question. i think it is essential to have regulation. i think you are hearing all of us say that there should be regulation. the likeliest way for congress to regulate is through the vendors, the manufacturers, those who are taking these systems into commerce. you want to regulate the use of those technologies by law
5:09 pm
enforcement agencies as well -- there has been discussion about trust between communities and law enforcement, but you will not see that until there's regulation for how the technologies are used, with strong guardrails. i could say more, but i see that your time is expired. >> thank you. i yield back the balance of my time, of which there is none. thank you. >> i now recognize mr. tiffany for 5 minutes. >> thank you very much. ms. frederick, i want to frame my question with the background of the biden administration. it has been reported that they are going to enlist the wireless phone carriers to vet people's text messages. i mention that because some of my constituents believe that we are a short leap from the chinese communist party's use of
5:10 pm
technology in their country, where they have created a social credit system. are my constituents right to be concerned about where we are at in the united states? >> well, i will say somebody who has worked in the intelligence community for almost a decade, i think that your constituents are right to be concerned about where the technology takes us. what i think is disconcerting is what you mentioned, sms content being parts, email, podcasts have been mentioned as having a portion of unfettered conversations, and these are bastions of our free speech. and i think it is of concern to see where authoritarian governments are sort of taking us. again, we do have a system of guardrails, but it is important, especially for conservatives and
5:11 pm
those expressing dissenting views, to be vigilant and see where this technology goes and guard against it. i think we are all advocating for certain guardrails in most cases. >> dr. alexander, you mentioned the state of maine, in particular portland, maine, and we also heard from mr. lee that certain communities have not been involved. isn't there a role for city councils, state governments, those governments at the local level to get involved with this, too? because they are responsible for their police forces in all instances, correct? maybe you are muted.
5:12 pm
>> yes, you are absolutely right, because there at the local level is where it all starts. local elected officials, mayors, city managers, county managers, whatever the case may be -- they certainly need a voice in this, because what you want to always make sure of is that your police agencies are operating at the highest level of training and certification. and we do not have that yet across the country; that is why you find some communities backing off of this technology until more concrete findings are done. >> when i think about local officials, i have many constituents that work in the twin cities, and we saw a real failure on the part of the mayor of minneapolis and the city council in addressing the riots that overwhelmed minneapolis last year. a lot of it is because they did not take control of their police department. there were a few bad cops and
5:13 pm
they needed to take care of them, and they did not do it. this is the same thing; they need to address the technology. and this is not just minneapolis, there are many cities and states across the country. they have the purview and obligation to oversee their police departments, and they should do that, including in regards to using this type of technology. the final thing before i yield: this is an example of the limits of technology. i think about simple things, like with children -- they get better reading retention when they read off of paper versus off an electronic format. and i think about the police the same way; you cannot replace boots on the ground and good investigative work with technology in all instances. >> that is correct. >> capitol police have been given
5:14 pm
army surveillance equipment to start monitoring americans. and we have reports from a group called the surveillance technology oversight project that says -- they think that represents an expansion of police power and surveillance. i have a few seconds, then i will ask a question to you. they point out the expansive nature of the surveillance state. how do you see this use of military surveillance -- giving the equipment to law enforcement? how does that expand the surveillance state? >> in this instance, it is of grave concern, because right now the capitol police are exempt from the freedom of information act. so not only would it be difficult under the rules of discovery for the defense counsel, you also would not be able to get transparency.
5:15 pm
it creates a very powerful surveillance state in an agency that probably is unaccustomed and does not have the history of being, you know, surveyors of united states citizens. so it is a recipe for disaster. >> thank you. your time has expired, mr. tiffany. >> i will yield back, but one final thing. i think we should put in the mix here what is happening with the text messages that they are talking about -- enlisting phone companies to intercept sms messages. i think that is a very chilling thing, and it should not be done by the biden administration, and we should throw that into the mix in dealing with this issue as we go forward. thank you, madame chairman. >> thank you. i know these issues reach beyond administrations, and these are important questions this committee is bringing forward. it is my pleasure to yield to the gentlelady from pennsylvania for 5 minutes.
5:16 pm
>> thank you. thank you for convening this hearing on innovative technology and its appropriate use and regulation under the law. and i also thank all of our testifiers and advocates today. mr. lee, companies claim that they can instantaneously identify people with unprecedented accuracy, yet mr. williams's story is not unique. such technology has resulted, as you know, in at least two false arrests in michigan and one in new jersey. organizations such as the aclu have expressed concern around the risk that clearview's technology may pose to undocumented immigrants, communities of color, survivors of domestic violence, and other vulnerable people and communities.
5:17 pm
can you share with us some of the dangers of this facial recognition and its misuse? >> absolutely. thank you for the question. there are a number of civil rights and civil liberties concerns with private companies' access to law enforcement, and also with law enforcement's ability to purchase private data. there is little to no transparency, as we said. i think a fundamental issue we need to talk about is that these technologies are not accurate, and i think that we are having a conversation as if they are. they have significant error rates across the spectrum, especially for marginalized communities. with little oversight, little guidance, no community input, and no meaningful ways to engage in checks and balances, any conversation around the use of facial recognition technology in the criminal process is something that the leadership conference and many organizations push back against.
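the point above that error rates differ "across the spectrum" is, concretely, a per-demographic measurement: a matcher's accuracy has to be computed separately for each group, because a single aggregate number can hide large disparities. a minimal audit sketch in python -- every group label and score below is invented for illustration, not any real system's numbers:

```python
# fully hypothetical audit data: (group, same_person?, matcher_score).
trials = [
    ("group_a", False, 0.91), ("group_a", False, 0.40), ("group_a", True, 0.95),
    ("group_b", False, 0.30), ("group_b", False, 0.35), ("group_b", True, 0.97),
]

def false_match_rate(trials, group, threshold=0.8):
    """share of different-person pairs the matcher wrongly accepts for a group."""
    impostors = [s for g, same, s in trials if g == group and not same]
    return sum(s >= threshold for s in impostors) / len(impostors)

print(false_match_rate(trials, "group_a"))  # 0.5 -- half the impostor pairs accepted
print(false_match_rate(trials, "group_b"))  # 0.0
```

the same matcher, at the same threshold, wrongly matches one group far more often than the other -- which is the disparity the witnesses describe.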
5:18 pm
we should be thinking about a way to envision law enforcement, and community engagement with law enforcement, that does not require constant surveillance and constant overpolicing. that is something to push forward for. >> thank you for that. transparency, recognition that this is not foolproof technology, and the error rate -- there is a lot more we need to learn. mr. williams, of course we are all moved by your story. your daughter watched as you were arrested for a crime you did not commit. how did your arrest and detainment affect your children, your family, and community? >> thank you for the question. it mainly affected my older daughter.
5:19 pm
the other daughter was only two, so she did not know what was going on. but she cried, too. she cried because her sister was crying. like i said earlier, i was trying to tell them -- i know we are short on time, but we thought about taking her to see a psychiatrist, because every time she sees the story on tv, it makes her very emotional and she starts crying, and we have to turn the tv off. and i am like, i am not going to jail every time you see it on tv, baby, they are just reporting the story. i wrote something down when somebody else asked the question. but the police actually did investigate it. and with all of those eyeballs on the pictures, nobody said these two guys do not look alike. i do not know how the human part goes into it, but do you have an explanation for
5:20 pm
that, did they give you an explanation? >> nobody told me anything. we basically had -- and i cannot say forced, but we asked over and over again for a proper, um, explanation, or a way to say it was some type of -- i do not know the word for it, but they just did not seem like they were sorry. they did not want to be apologetic about it. they were like, this happens. we made a mistake. and i am like, you know, that does not do anything for me, because all i did was come home from work and get arrested. and then there was other stuff -- like, i did not get dinner that night. they had already served food at the prison. so, i got to go sit in a cell
5:21 pm
with no food, no drinking water, and, you know, it was terrible for somebody who was not ready for it. >> i have to say, thank you for sharing that. i asked about your children, but we have to realize what impact it has had on you -- the long-term impact on you -- to come home after work and be arrested and detained so inhumanely and incorrectly. i see my time is done. >> i did not mean to cut you off. i do not know if you can hear, i have a small speech impediment at this moment, but last year i also suffered some strokes. they are looking into that to see what actually caused the strokes. i have no idea at this point,
5:22 pm
all they can say -- i had one doctor say it could be attributed to stress. whether he has a way to prove it, i do not know. but i did have strokes last year after. >> god bless you. i yield back. >> thank you. mr. williams's story should inspire all of us and others to find a bipartisan pathway forward. let me now yield five minutes to mr. massie. you are recognized for five minutes. >> thank you, madame chairwoman, for holding this very important hearing. before i jump into facial recognition, i want to talk about name recognition. imagine a system, a computerized system, that instead of basing the adjudication of your civil rights on the distance between your eyes, or the shape of your nose, or where your ears are,
5:23 pm
it's based on the spelling of your first and last name and your birthday, and whether you might share a name with somebody who has been convicted of a felony or could be imprisoned today. such a system exists, and there is no judge or jury involved. it is called the nics background check system. and the troubling thing about facial recognition is that there may be racial disparities in it, but we know that there are racial disparities in the instant background check system, the system you have to go through in order to purchase a firearm. my friend john lott, who worked at the doj, recently found evidence that this system, by a factor of 3 to 1, produces false denials for black males compared to white males, because they may share a similar name with somebody incarcerated within their ethnic group.
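the name-and-birthday matching described above can be sketched with entirely hypothetical records to show why it generates false denials: with no biometric or unique identifier, any two people who share a name and date of birth collide. the records and names below are invented for illustration:

```python
# entirely hypothetical prohibited-person records for a name/birthdate check.
PROHIBITED_RECORDS = [
    {"name": "john smith", "dob": "1980-03-14"},
    {"name": "maria garcia", "dob": "1975-11-02"},
]

def naive_check(name: str, dob: str) -> bool:
    """return True (deny) when name and birthdate collide with any record."""
    return any(r["name"] == name.lower() and r["dob"] == dob
               for r in PROHIBITED_RECORDS)

# a different john smith, born the same day, is falsely denied:
print(naive_check("John Smith", "1980-03-14"))  # True -> denied
print(naive_check("Jane Doe", "1990-01-01"))    # False -> allowed
```

because common names cluster within communities, collisions -- and therefore false denials -- do not fall evenly across groups, which is the disparity being described.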
5:24 pm
and so, ms. goodwin, i appreciate the good work that the gao does. i served on the oversight committee, and we covered this topic, facial recognition, there before. the gao released a report that said there were 100,000 denials based on instant background checks and only 12 federal prosecutions. so, on paper, about 100,000 people committed a crime by trying to buy a firearm. what we know from that is that most of them did not actually commit a crime -- there were a lot of false positives. i got the commitment of the fbi director, wray, to look into this issue of whether there are racial disparities built into the instant background check system. it's not very technologically advanced at all. he said he would work on it, but if you have an opportunity, you might look into it. is this something that you're
5:25 pm
-- your part of the gao would examine? >> actually, that was my report. i worked on that report, the prohibited purchases report, so we are happy to have a conversation about additional work we can do in that area. rep. massie: thank you for that report. they do collect race information on the form, and so if we can just take it another level -- it would not have to be as deep and thorough as the report that you did already, but it would be helpful, i think.
5:26 pm
i noticed that your testimony on this recent facial recognition report started with, "we surveyed 42 federal agencies with law enforcement employees," and i thought, whoa -- that to me is troubling, that we have 42 federal agencies that have law enforcement. maybe this is something the democrats and republicans can get together on; maybe we have too many police forces at the federal level. but moving on, i wanted to ask one of our professors here, maybe professor laurin, about brady material -- possible exculpatory evidence that's produced that the prosecution may not want to share with the defense. you mentioned the way to get around some of the typical safeguards is they say, oh, we just used it to identify the suspect, we are not using it for the conviction. can you talk about why that is problematic and whether defense attorneys can get brady material from facial recognition companies? >> thank you for the question.
5:27 pm
you know, i will start with the last piece, which is getting material from companies. in other words, there may be instances where -- assuming we are in a world we do not live in yet, where facial recognition is introduced in evidence in a criminal case -- the defense would certainly want to interrogate, right, what is inside the black box that generated this match. one of the complicating features here is the proprietary nature of many of these technologies. we have not seen this yet so much in this context, but we have seen it in the context of dna, with certain more novel analyses of low copy dna, where the defense has been denied the ability to access the source code, to access what is claimed by
5:28 pm
companies to be proprietary information protected by trade secrets, and where intellectual property law has been invoked to quash defense subpoenas. so one of the complicating features, one of the many limits on the ability of the defense to vet this in a criminal context, is this proprietary dynamic. layered on top of that are a number of complications that derive from brady doctrine and disagreement over the scope of brady -- over when defendants are entitled to this material up front versus after the fact. i'm happy to pause there. >> my time is expired, but i hope congress will weigh in on this, because it does seem to be a problem, not just within facial recognition, but across all of these domains. i yield back.
5:29 pm
>> i now recognize the gentleman who has dealt with some of these tech issues from a different perspective, the gentleman from rhode island. you're recognized for five minutes. >> thank you to the witnesses for your compelling testimony. i want to begin with mr. williams. i'm wondering -- first of all, thank you for being here to talk about these issues in terms of public policy. it is good to be reminded that this impacts people's lives, and your story is one example of that. did it seem like the police had even bothered to conduct an actual investigation, or did they appear to be solely relying on what the computer told them about your arrest? mr. williams: thank you for the question. so, they did provide information.
5:30 pm
i did not find out any of this until after, when we went to court, but they had a lineup of, i guess, six guys. and none of us looked alike. i guess somebody picked me and said, this is the guy right here. like, i look nothing like the other guys. i was actually about 10 years older than all the other guys in the picture. and they did not use my current license photo, they used one from six years ago. >> thank you. are there specific privacy concerns related to data
5:31 pm
breaches of facial recognition systems that are more concerning than breaches of other systems used by the federal government? >> yes, thank you for the question, there are. as i mentioned, your face is permanent and distinctive. if that information is breached, it is more problematic and concerning than if your password is breached, because you can change your password, but for the most part you cannot change your face. so there are major privacy concerns about what it looks like when that information has been breached. >> facial recognition systems have varying levels of accuracy, but we know that they are ineffective in identifying people accurately from communities of color and women. is there even any training that might help law enforcement overcome these inaccuracies, or are they so embedded in the system that we should ban facial recognition outright?
5:32 pm
>> thank you for the question. i think we need to think about a context of policing that does not involve surveilling or overly surveilling marginalized communities and overpolicing them. i would highlight that one of the things to think about is, if the technology is not working for the same communities it's trying to help, is this technology worth using in the first place? so that is something that we need to think about, particularly within the context of civil rights concerns. because, again, mr. williams was wrongfully arrested, wrongfully detained for a crime he did not commit, based off of a lineup and the use of facial recognition technology that was not authorized by the local government, not known by mr.
5:33 pm
williams himself, not understood under any circumstances by the people using it, and with varying levels of accuracy across the board that were not given to mr. williams at the time, or to law enforcement. >> i want to get in one more question. so, building on what was asked about earlier, we could require prosecutors to inform a defendant that facial recognition technology was used to identify them in any part of the process -- that is one reform. in addition, we could require that prosecutors disclose any other exculpatory evidence, statutorily, with respect to brady. there are things we could do to at least make the process more transparent and, hopefully, give the defendant an opportunity to challenge this technology. >> i agree, there are other items that could be added. to the extent that the system reports out confidence levels, that's
5:34 pm
something that seems necessary to disclose, but i also think this issue of being able to get at that black box and actually know what is driving the results, which then involves questions around intellectual property, is a really important issue to get at. >> i want to say, because the chair recognizes, that part of this is facilitated by the monopoly power of these large technology platforms -- the fact that they are unregulated, that they are collecting an enormous amount of surveillance data, and the danger of integrating all of this -- presents real challenges. so it also requires us to regulate how private companies can provide this technology to law enforcement, and raises larger questions about how we have to rein in big tech. i look forward to more details on that. thank you. >> your time has expired. thank you. we have other members yet to ask questions, so we will
5:35 pm
certainly move so you can be included. it's now time to recognize mr. fitzgerald for five minutes. rep. fitzgerald: thank you. this is fascinating stuff this morning. i just wanted to move in a direction away from law enforcement to the private sector, which some of the members touched on this morning. we see iphones that use facial recognition; anytime you are going through the airport now, you see the clear system, which is allowing access for individuals in airports on a regular basis. and some employers now are using this in relation to their employees, under the auspices of security.
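the private-sector uses just mentioned -- phone unlock, airport screening -- are largely 1:1 verification, as opposed to the 1:N identification that police database searches involve. a minimal sketch of the difference, with hypothetical embeddings and names invented for illustration (real systems use learned embeddings, not three-number vectors):

```python
import math

def similarity(a, b):
    """cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

DATABASE = {  # hypothetical enrolled embeddings
    "alice": [0.9, 0.1, 0.2],
    "bob":   [0.1, 0.9, 0.3],
}

def verify(probe, claimed_id, threshold=0.9):
    """1:1 -- compare the probe only against the claimed identity."""
    return similarity(probe, DATABASE[claimed_id]) >= threshold

def identify(probe):
    """1:N -- search the whole database; return the best candidate and score."""
    return max(((name, similarity(probe, emb)) for name, emb in DATABASE.items()),
               key=lambda t: t[1])

probe = [0.88, 0.12, 0.22]
print(verify(probe, "alice"))  # True: probe matches the claimed identity
print(identify(probe)[0])      # alice: best hit across the whole database
```

the policy-relevant difference: verification answers a narrow yes/no question the person usually opted into, while identification searches everyone enrolled, consenting or not.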
5:36 pm
so there are a lot of different things happening in the private sector. the question i have for ms. frederick and mr. tolman: can you address some of the privacy concerns of private sector use, particularly related to the difference between using facial recognition technology for verification purposes rather than just flat-out identification? ms. frederick: ok, i will start here. i think generally it is that outsourcing of surveillance that poses more potential for concrete violations of the fourth amendment. i'm not a lawyer, but when i see the surveillance state basically outsourcing this job to private companies that do not fall under the constitution, that is what starts to worry me as a citizen.
5:37 pm
as somebody who has been behind the powerful technologies employed at a commercial level, and also by the surveillance state overseas, to me it is that outsourcing to private companies that we need to think about when it comes to the constitution, because some reports indicate that it is expressly done to skirt constitutional protections. mr. tolman: that last point is where i wanted to go. oftentimes the use of third-party providers is a runaround of the protections. we have already seen one court indicate that the boston police department violated the fourth amendment in its use of facial recognition technology. that would be one pressure point to push you toward the utilization of a third-party provider -- you know, at one end, where you have somewhat lower standards
5:38 pm
when utilizing the evidence from them. and we do not know the nature of some of those partnerships -- that's the problem, the lack of transparency. we do not know what it is, sometimes even the questions we need to ask in order to find out how pervasive that relationship is. >> thank you. great answers. the other question i would have for both of you: there has been discussion about a full-out ban, and what we are seeing across the nation is that some municipalities, through ordinance, have been progressively addressing this issue as it emerges, either as part of something that is under their control -- many times it is, as part of a municipal court system -- and they are moving forward. city councils and even some
5:39 pm
county boards are moving forward. is it practical at this point? it does not seem like it is to me at this point. mr. tolman: i would quickly say that it is hard to imagine that this powerful tool is not to be utilized at all. i understand the calls for a ban, but that is what makes this difficult -- there are circumstances in which this powerful technology may be the only thing that may save a victim or save a town from some horrific crime or from a terrorist attack, for example. so getting this right, in how we authorize law enforcement to use it, is right now the most important job, i think, that this congress has. ms. frederick: i prefer to think of it more as a tactical pause -- take the time to get it right.
5:40 pm
so i think that there are measures with which we could do so. lawsuits are being fought in the courts; illinois has filed a lawsuit against the chicago police. there are other mechanisms, as well as municipalities, that can take the time to get it right. rep. fitzgerald: thank you. >> i thank the gentleman. i'm pleased to recognize the gentleman from california. you are recognized for five minutes. >> thank you. i'm super excited that you called this hearing. my office and i have been working on facial recognition for over two years. the reason it has taken so long is, as the chairwoman has aptly stated, that it is a can of worms. it can be very complicated. we are working on legislation that is essentially complete now. it basically requires a warrant in most cases for the use of facial recognition
5:41 pm
technology, it sets auditing standards and requires recording. it prohibits the technology for most first amendment protected activities, and a number of other provisions. we will circulate that legislation to the members of this committee for your review and comment. so, thank you for this hearing. my first question goes to professor laurin. as we know from the hearing, facial recognition technology implicates the first, fourth, and fifth amendments. but i want to talk about the equal protection clause, because there is a disparity in accuracy in terms of the color of your skin. would there also be an equal protection violation if we had governments deploying this knowing that people are being treated differently in terms of surveillance?
5:42 pm
mr. laurin: i think that in answering the question i want to distinguish between constitutional doctrine as it stands and maybe a more intuitive sense of what is rightfully understood to be constitutionally problematic. i think the challenge with using the equal protection clause, and the doctrine around it, to get at these issues of bias in facial recognition is that existing supreme court doctrine requires that discriminatory intent essentially be proved in order for state action to be deemed a violation of the equal protection clause. and that presents a hurdle when one is working with evidence of disparate impact, right? there is this evidentiary
5:43 pm
gap, often, in getting from the state realizing there is a disparity to the state wanting to discriminate on the basis of race or sex. that has been the difficulty in using the courts to get at these problems of disparity. that does not mean it would not be understandably appropriate for congress to say, we see a problem in these disparities and want to legislate to require that the disparities in themselves be a sufficient basis for limiting these technologies, etc. one needs to think about what the hook would be for that, in terms of section five, or the commerce clause. but the existing doctrine presents a challenge because of the requirement of intent. >> thank you. professor friedman, first of all, let me step back and say that i agree with what dr. alexander
5:44 pm
and mr. tolman said, that we cannot just ban this technology. if you look at human history, it is nearly impossible to ban technology, but we can regulate it. because there are so many different iterations and possible uses and situations for facial recognition technology, what do you think about imposing a warrant requirement that can take into account different factors before deciding whether, in any specific case, we would use facial recognition technology? >> thank you for the question. i want to make a comment about the ban versus strategic pause point. i think everybody is recognizing we have the cart in front of the horse; that is the problem, and regulation is the only shot at getting the horse back out in front. warrant requirements are useful in some circumstances, not
5:45 pm
others. it depends on the use case. for example, if the use case is identification, which is what most agencies are doing right now -- they take a photo and compare it to a database -- a warrant requirement makes sense, but only because there is a reasonable belief, because you have a photo of somebody suspected of a crime permitted under the statute. i would only allow this for very serious crimes. that's a use of a warrant that makes sense, but the warrant cannot tell you that a particular individual did the crime; it is just the technology. >> i yield back. >> i think we have present with us the gentleman from utah, mr. burgess. you are recognized for five minutes.
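the identification use case described above -- comparing a probe photo against a database -- returns a ranked list of candidates, not a culprit, which is why the warrant "cannot tell you that a particular individual did the crime." a sketch of that output, carrying the confidence scores witnesses earlier suggested should be disclosable (all names, scores, and thresholds below are hypothetical):

```python
def rank_candidates(scores, threshold=0.75, top_k=3):
    """return up to top_k database hits at or above threshold, best first.

    the result is an investigative lead list, not an identification:
    every entry keeps its confidence score so it can later be disclosed
    to and challenged by the defense.
    """
    hits = [(name, s) for name, s in scores.items() if s >= threshold]
    return sorted(hits, key=lambda t: t[1], reverse=True)[:top_k]

# hypothetical matcher scores for one probe photo against three entries
probe_scores = {"suspect_a": 0.92, "suspect_b": 0.81, "suspect_c": 0.60}
print(rank_candidates(probe_scores))  # [('suspect_a', 0.92), ('suspect_b', 0.81)]
```

even the top-ranked hit is only a resemblance score; any arrest built on it still needs independent corroboration.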
5:46 pm
rep. burgess: thank you, thank you for holding this important hearing today concerning facial recognition. it demands a bipartisan solution, and i look forward to working with colleagues to address these concerns. i want to thank mr. williams today; we agree on the unfairness of what you and your family had to endure. thank you for doing your best so that no other american has to experience what you are going through. and i want to give a special welcome to my fellow utah resident, mr. tolman, who has been a strong advocate of criminal justice reform. toward the end of the questioning, i'm finding that the upside is most of the questions have been answered, so i will yield my time back to the ranking member. but before i do, i want to highlight what was said previously, the bipartisan
5:47 pm
concern i'm seeing as we go through this topic. and i also want to point out, as we look at solving the increase in crime that we see everywhere, that we cannot rely on technology. it is a tool to support human relationships, and human relationships should be based on respect. we cannot continue to demean and disrespect the good men and women policing our communities and expect technology to make up for it. as i hope this committee will address the concerns of facial recognition technology, let's take the same passion to bring back the respect for our law enforcement officers, the men and women -- the same respect that i had for them, that i experienced as a child. we will have more confidence in them once we have that accomplished. i will yield back my remaining time to mr. biggs.
5:48 pm
rep. biggs: a couple of quick points. this has been a very enlightening hearing. we have not really addressed enough, in my opinion, the question of a reasonable expectation of privacy. that's been shattered by the data scraping that goes on by private companies. i hope that we can expand our investigation into that, because i do not think that question has been settled through the courts yet, either. if it has, i have not seen it, but we certainly need to get to that. i also want to comment on the legislation from mr. lieu. i look forward to reading it, and i appreciate your efforts on it, and i hope we can integrate the work done in the previous iteration of the ogr committee with chairman cummings.
5:49 pm
so i look forward to seeing that and hope you can get it to us soon, mr. lieu. and i want to ask mr. tolman this question, because it goes to the heart of what we are talking about -- i mean, as was said, this is a can of worms when you open it up. when we start bringing together other technologies -- we have discussed this with sms -- and we move from mere identification, which in itself is a problem, to potentially tracking movement, this notion of science-fiction-type pre-crime, stopping pre-crime -- what's your take, in a brief way, on the idea that we are on a spectrum that will keep moving toward trying to stop people from committing crimes before a crime is committed, by using this technology and others combined?
5:50 pm
>> i share the concern. when we have seen law enforcement given additional technology, their first instinct is to utilize it, because they have a sincere desire to work on behalf of our communities and the victims of crime. however, they are often the last ones to ask the question, should we utilize this technology? that has to be the discussion of members of the legislature that identify this. we are already seeing that with speech and, you know, social distancing under covid; areas where we have not been are now areas where we are focusing law enforcement technology and action. because of that, we are on a slippery slope, and i would share your concern. rep. biggs: i hate to cut you off, but i want to finish where we started with the chair -- with the testimony from mr. williams.
5:51 pm
so, let me find it. mr. williams, in your written testimony you said, "i do not want anyone to walk away thinking that if only the technology was made more accurate, the problems would be solved." i think the issues we have discussed are not just issues with the accuracy of the technology. i think we are talking about a total invasion of privacy, which is absolutely unacceptable, and which at some point will give too much power to the state. and we have seen it already in your case, mr. williams, and in others. i am looking forward to working with the chair, and also mr. lieu, and others who have a sincere interest in resolving this issue. thank you. i yield back. >> you are absolutely right, we have a sincere interest in resolving this issue, and if
5:52 pm
there are layers of issues, i think that we can respond as a committee in a bipartisan manner. and thinking of bipartisanship, i'm delighted to yield to the gentleman from california, a former sheriff, for five minutes. >> thank you for holding this important hearing, and i want to thank all of the witnesses today for their testimony. mr. williams, i want to say that hearing your story -- being in a cell for 30 hours and not knowing why or what you were accused of -- is alarming and tragic at many levels. as you think about it today: when you give your fingerprints, you give permission to have them taken. when you go for a scan, you say, here are my prints. when you give your dna, you give it with permission, knowing that
5:53 pm
somebody is going to do a dna scan on you. but when it comes to facial recognition, no consent is necessary. in fact, most people do not even know that their information is being collected by somebody, a private entity out there. it was mentioned that 133 million americans -- let me repeat that, 133 million americans -- have their facial recognition data in a database somewhere in this country. and i presume most, if not all, gave that without consent, or even knowing that the data was taken from them. so, professor laurin and professor friedman, i have a question -- we have discussed the constitution, our rights, but
5:54 pm
are there any regulations or laws that govern or limit in any way a private entity's ability to collect facial recognition data? as a taxpayer or citizen, do i have to accept the fact that a private firm has my facial recognition data to sell or use as they wish? professor friedman: if i may? >> go ahead. professor friedman: i want to appreciate the question at a couple of levels, because it is important. it is easy to fall into the language of constitutional rights, and those are critically important, but the interpretation of the constitution often does not touch the things we are talking about, in part because the technology is rushing ahead of the courts. you could not be more correct that there is an urgent need for legislation and regulation, as
5:55 pm
to private companies and law enforcement agencies. as to the private companies, i do not know of any congressional law that deals with this. some states have stepped in, like illinois. for the most part, there is not. and i think the need for legislation is essential. professor laurin: i can only echo what the professor said, he speaks my mind. i think there is a tendency, actually, to sort of overly prioritize the constitutional stuff and to miss the fact that actually the constitution is a floor, not a ceiling, in terms of protections. and often the constitutional protection is lower by design, because the courts say there are details for congress to figure out and it is really not our institutional
5:56 pm
place. >> let me repeat my question. as a citizen, will i have to accept the fact that right now a private third party has my facial recognition information and they can do whatever they want to do with it, sell the information to whomever they want to sell it to, without repercussions, without any regulation, without violating any kind of a law? is that the state of the country today? professor laurin: the best answer i can give you -- i do think that there are at least some data privacy regimes, statutory regimes, that would limit some activities. but in the main there is little regulation currently, and certainly the constitution has virtually nothing to say about private, nonstate action.
5:57 pm
and that is why congressional action is so important. >> so i am left here with a very unsettling feeling that i essentially have no remedy for somebody using my facial recognition information for whatever they want to do with it. >> you can create one. congress has the power. you can change that state of affairs, if you choose to. >> thank you, madam chair. i ran out of time. i yield back. >> the gentleman ran out of time. we're also running out of time. this is such an important issue and i am grateful for all of my members. and, as i yield for a moment, i listened to professor friedman over and over about regulation, and that is what we are having this hearing
5:58 pm
for, because we are working on legislation, and i look forward to working with mr. biggs, but also with our colleagues who are likewise working on legislation to address the question about regulation or a construct that needs to address this question. so, thank you all very much. it is my pleasure to yield to the vice chair of this subcommittee from missouri, congresswoman bush, for five minutes. rep. bush: thank you for convening the hearing, madam chair. we have spent significant time addressing the brazen and broad scope of power of our law enforcement agencies, from the fbi to prosecutorial agencies under the department of justice, and time and time again we are met with law enforcement practices that not only undermine
5:59 pm
our civil liberties, but our basic principles of human dignity and security. 28 members of congress were falsely matched in a mug shot database. these were members of congress, public officials. imagine what it is like for everyday people across the country. what we know of this technology is the darker your skin tone, the more likely you are to be falsely identified. i recall the response to the tear gas. it was during that time we realized we needed to wear bandanas because of the likely chance of facial recognition technology being used. the gao found that 14 agencies use
6:00 pm
facial recognition technology. yes or no, was it used by federal law-enforcement agencies during the protests over the torture and murder of george floyd? >> yes. >> what collateral consequences are there of protesters being identified by these technologies? >> we raised concerns about inaccuracies. the concern is people would be misidentified by the technology, and that could have adverse consequences. a person's face could be out there and used in a number of ways. our focus was to look at the lay of the land for how law-enforcement was using the technology. >> thank you. in 2016, it was reported police used facial recognition to
6:01 pm
surveil crowds during the freddie gray protests without a warrant. is there anything to protect those identities while engaging in first amendment activity? >> thank you for the question. this would be in our mind a violation of the first amendment, but also there are serious privacy and civil liberties concerns with the technology, which we saw not only with the george floyd protests but also with the lapd using ring cameras to surveil black lives matter protesters. we have serious concerns. there are serious civil rights and civil liberties concerns about the tracking of protesters, especially in these circumstances and especially when the technology is not well used or very accurate. >> mr. lee, how does the use of facial recognition technology -- how does that
6:02 pm
affect the opinion of law enforcement and high tech? >> there's a tenuous relationship between communities that are over-policed and over-surveilled by law enforcement. we have seen that time and time again. congress needs to keep that in mind when regulating facial recognition technologies. these are tools used by law enforcement to disproportionately target black and brown people across the nation. we need to keep that in mind, the civil liberties of those people in mind, as we think of what the next steps are. we have asked for a ban or moratorium. we think it chills protests and speech. it misidentifies, and it does not work. >> thank you so much, mr. lee. there are local solutions that are being [inaudible] [no audio]
6:03 pm
>> ms. bush? are we able to get ms. bush connected?
6:04 pm
i think we may have lost congresswoman bush. thank you very much, congresswoman, for those very insightful points that you have raised. i have been impressed by the direction of our questions, the depth of our questions, and certainly the importance of our witnesses. i have questions i'm going to utilize at this time, but i'm happy to yield. you have been rewarded with time. i am happy to yield to you before i make my final inquiries, that i hope will be important to the record of this hearing and proceeding. >> thank you, madam chair. i know you will be surprised to hear this. even though i have many more questions and comments about this, i appreciate those who
6:05 pm
have participated. but i'm going to limit my comments right there. again, thank the members, the witnesses, and yourself. thank you, madam chair. i yield back. >> thank you very much. and thank you to the witnesses. i am going to proceed with some questions that i did not proceed with earlier. as i do so, i want to make sure that i thank all of the witnesses again. you will probably hear me thank the witnesses more than once. i would like to submit into the record, then i will begin some questions that i think are very important. i want to submit into the record the gao report that i believe was entitled "facial recognition technology: federal law enforcement agencies should have better awareness of systems used by employees."
6:06 pm
i think that is shocking, and i think that is also indicative of what dr. alexander made mention of, in terms of the numbers of law enforcement and the utilization of this tool without training. i will, mr. lee, come to you for a broad question on civil rights. but let me ask dr. alexander: 18,000 police departments -- one of the reasons why we had a push in the george floyd justice in policing act to find regular order that will help police, not diminish police. how do you respond to the idea that -- and also, to my members, i believe we should have a follow-up hearing with some of our federal law enforcement representatives cited by this excellent report put forth.
6:07 pm
that may be in a classified manner. we will be presenting that in the future. dr. alexander, to note that it was determined through the federal system that employees were using this system without their hierarchy knowing -- what do you think that means for the 18,000 police departments, if this technology comes into their possession, and how does that relate to your point about police/community relationships, but also victims -- and i will call him a victim, that is mr. williams and his family -- and how he can be made whole? >> thank you, madam chair. yeah, there are 18,000 police departments in this country, and oftentimes, we are finding more and more, we've got 18,000 different ways things are being done.
6:08 pm
and how they are being responded to, how they are being disciplined, how they are being managed, how they are being trained, etc. when you have one organization -- and all it takes is just one, quite frankly -- the american public sees policing the same. does it make it right? no, but that's the reality of it. if one does something horrible, we all have to take that hit, too. to your question, it becomes important to recognize that training will have to be regulated at a federal level and there has to be some federal standard, the same way we do with fingerprints, with dna, with other forensic methods that have proven to be of value in the law-enforcement and public safety community.
6:09 pm
because the whole idea about this particular piece of technology is that it is very convoluted, and it is very complicated, and there are going to be cases that go before the courts that we have not even thought about yet, as it relates to this technology, still yet to be examined. i just think, congresswoman, that due diligence would be given to these local agencies across this country if they have an opportunity to be well trained, if there is standardization for utilization of this technology, if there is ongoing certification, and if practices are held to the highest standards of science. in addition to that, i think it becomes important as well that the public understands what
6:10 pm
facial recognition technology is. because we've talked about it a lot within the context of policing and within congress, but a lot of people in our communities across this country do not understand exactly what it means. and when they do hear about the cases, they hear about the cases that are often associated with negatives, like mr. williams'. >> you don't have to answer; i'm going to make this point based upon your response -- minimally, police departments need to be aware of when the technology is in their possession, and of the individuals given the authority to use it. i would like to place that on the record, because the title of the report said that our federal agencies should have better awareness of these systems being used by employees. that is shocking and amazing to me.
6:11 pm
i think that can be parlayed out into the field with the other 18,000 police departments. but thank you for your answer. i indicated my concern has been unreasonable search and seizure. i think this was raised by one of our members. i would like a succinct answer from both of you. i don't think we are without tools. we have the ability to look at federal criminal procedure as we do civil procedure. in any event, as a former u.s. attorney: in cases where an investigation results in prosecution, should prosecutors be required to inform the defendant that facial recognition technology was used to identify them in any part of the process? it is investigative, and should positive matches be required to be produced as that investigation goes forward?
6:12 pm
if it ultimately comes into court. i know one of our members mentioned the brady defense. how should we respond to that? >> thank you, chairwoman. my bottom line view is that the observation is absolutely right: with respect to federal criminal procedure, there are some very direct interventions that congress can make. while the scope of rule 16 frankly is always sort of a fraught discussion, i think there are important limited interventions that can be made in terms of requiring access to facial recognition -- the fact of the search, to alternative matches, to information on confidence levels. as i said, i think to the extent possible, it is essential the defense has access to the
6:13 pm
algorithms, to what is inside the black box, to be able to evaluate the reliability of the technology. and that can all be a matter of statutory change, without having to worry about the courts sorting through whether it encompasses that. >> the professor is absolutely correct. i agree with everything she just articulated. i would add to that: unlike a lineup, where a prosecutor is required to hand over alternative individuals who might have been placed in the lineup, and the rationale for the lineup that was utilized, facial recognition technology requires the prosecutor -- and should require the prosecutor -- to turn over more, if the intention is to give the defense counsel all that it needs to assess the
6:14 pm
reliability and admissibility of such evidence. we are in a new ballgame with this technology. it will require more to be given from a prosecutor to a defense attorney under the strictures of constitutional protection. >> one additional note: it is essential that timing be addressed. because if defense counsel is only able to access this on the eve of trial, it does no good in terms of evaluating whether a guilty plea is appropriate, no good in challenging the admissibility of evidence. early discovery is important as well. >> early discovery and early knowledge, is that my understanding? thank you so very much. let me quickly -- professor friedman, i want to clarify: when you said regulation, what framework were you speaking of? >> i think what is important here is comprehensive regulation that does two general things. one is to regulate the technology, and the
6:15 pm
other is, there's a lot we need to learn. one of the things congress can do immediately is to actually test this technology under operational conditions, how law-enforcement is using it. because there are very serious concerns about accuracy and bias, particularly racial bias. we get numbers reported, but those numbers are no reflection of how law-enforcement is actually using the technology, and what little we know from this suggests those error rates may be quite high. there's a list of things that i think congress should study. there is a list of immediate safeguards we could put in place -- for example, best practice protocols about what it means to have a human in the loop to verify what the algorithm does. >> thank you very much. i'm going to finish with mr.
6:16 pm
lee. you have been keen on the providers -- the tech entities that control this technology. from your perspective, what would be a regulation in that arena? >> i work for the heritage foundation. we are about limited government. we do not really support regulation, or robust regulation, when it comes to private entities. my solution would be that we have to encourage these companies, the leaders in these companies, to institute those privacy-by-design mechanisms: make sure in the design phase of creating an algorithm you are doing it with privacy protections. we should have methods of encryption, which are getting more powerful and better; decentralized models of storage; machine learning.
6:17 pm
if the u.s. congress can sort of impress upon these companies that they need to build in data privacy protections, i think that is a good starting point, short of actual regulation. >> in the spirit of my ranking member -- you know this is a bipartisan hearing -- we wanted to make sure we welcome your thoughts as well. those will be very helpful. let me have mr. williams, then mr. lee, you have the last word. mr. williams, i think you have heard from all of us, we are appalled at what happened to you. we are stunned. but we know that there are good intentions; i will be introducing an article where
6:18 pm
this technology was used. we also heard from dr. alexander that the police out there are working to do positive things. you fell into a very bad trap. what do you want to leave us with -- not just as a victim yourself, but as a victim of this technology, of something that should not have happened? i started out by apologizing to you and your family, and we will be eagerly watching your case, possibly pursued by the aclu, if i am correct. what do you want us to know as you end your testimony here? you are on mute. >> thank you. first of all, i want to say thank you for inviting me to come and tell my story. and i hope that going forward, we can figure out a way -- like they said -- to regulate it. it is already being used, so we need to get it fixed so that it works properly.
6:19 pm
because in the current state, it is wrong. they get it wrong even when they get it right. i don't know if that makes sense. like, you don't really have a [indiscernible] nobody ever signed up to say, hey, they used my picture from my id. like i said earlier, they didn't even use my current driver's license, they used one previous, from when i might've looked younger at that point. i don't know. i guess all your information is in the system, so it is available to use. i was asked a few times about my
6:20 pm
daughter and how she took it. i do not know what the lasting effect is. at this point, she is upset every time we look at it or see it. i don't know. i'm just -- i hate to say i'm thankful it was a felony larceny. but what if the crime was capital murder, or something like that? and then they came to my house and arrested me? i don't know if i would've gotten a bond to get out and be free to even talk to you at this point. because the court system is so backed up, they probably wouldn't even have
6:21 pm
gotten to me yet and i would still be locked up for something i did not do. right? i mean, i kind of cringe to think about it like that, because i did not get to say anything. i didn't have any rights read to me or anything. they showed up asking my name, and i asked for a warrant. they didn't bring one. i kind of resisted arrest, but i was in my driveway. i was like, what are you all doing? he was like, are you mr. williams? you are under arrest. i was like, for what? he was like, i can't tell you. you cannot tell me what i am under arrest for, but i am under arrest. he took me immediately down to
6:22 pm
the detention center, and i got fingerprinted and mug-shotted right then and there. i figured if my fingerprints came back clear, i would be out. but that didn't mean anything. my fingerprints went in, so what? now they have a mug shot to add to it. it was so backwards, if that makes sense. i just feel like i am a regular person. i thought i knew the law. and i was wrong. hopefully we get some change. >> mr. williams, i would just simply say you are seeking justice. and you deserve it, you really do. mr. lee, you represent the family of those who have been fighting these issues or dealing with the criminal justice system. having listened to mr. williams,
6:23 pm
i know your position on the moratorium, but just, if you would, end us on the enormity of the problem if we were to have facial recognition used randomly in this massive entity of 18,000 police departments. what can happen to a person who is lined up like mr. williams? i am shocked that you couldn't get a reason why you were being arrested. i thought you were entitled to that. but they used facial recognition. it didn't look like him when they put the picture up to him. so how does that undermine civil rights and civil liberties in this country, if used as it was used with mr. williams?
6:24 pm
>> madam chair, yes, i think there is a broader context of policing we must keep in mind that is especially instructive in talking about the case of mr. williams, but also why many in the leadership conference and in this coalition support a ban or moratorium. we can't separate the tools from the context of the american policing system. we have evidence of that system over decades, centuries, of how black bodies and disproportionately black people are treated within the criminal legal system. we can't separate those tools. if you add the history of that criminal legal system to a tool known, in the best of conditions, to be biased against black people and
6:25 pm
so many members of the community we need to protect because of their protected status, and the history of harm that has been done to their bodies throughout centuries, then we have to really have a real conversation about whether this technology actually works the way that we think it does. and it doesn't. that has been proven time and time again, whether we are talking about the research or the aclu's use of facial recognition technology that falsely matched members of congress, who are public figures. communities that are over-policed and over-surveilled -- it is least likely to work for those communities. and the communities are not involved in the process. there's no public hearing about widespread use. congress has not been asked, and has not legislated on these topics.
6:26 pm
this is compounded by the opacity of policing practices in general and of law enforcement agencies. we still do not know how law enforcement is using this technology, or the full extent of it writ large. community engagement is not a checkbox. it is a continuous conversation. and no one knows. there are secret hearings, classified briefings, but the public does not know the full extent of facial recognition technology, and broadly the use of it by law enforcement writ large. we will continue to consistently call for a moratorium or ban on these technologies, and we would be more than happy to work with this committee and congress to imagine a solution and imagine a world of policing that puts the civil rights and civil liberties of marginalized communities first, not last, and not using these same
6:27 pm
communities as basically test cases when these technologies do not work. >> ok. thank you so very much. thank you so very much. thank you to each of the witnesses for their profound testimony. and thank you to my members, the members of this committee, and ranking member, thank you for joining me on, i believe, a hearing that will be very constructive going forward, as evidenced by the questions and responses and the expert witnesses we had today. this concludes today's hearing. i would like to submit into the record "the problem of bias in facial recognition," dated may 1, 2020, and the washington post article how facial recognition helped
6:28 pm
capture members of the capitol mob, dated april 2, 2021, and the washington post article responding to the study. i want those articles submitted into the record. i thank our witnesses for their attendance and participation. we thank all those who have helped prepare the hearing. without objection, all members will have five legislative days to submit additional written questions for the witnesses or additional materials for the record.