
tv   Hearing on Police Use of Facial Recognition Technology  CSPAN  July 13, 2021 10:02am-1:31pm EDT

10:02 am
don't require membership and i don't have it and i just use it for my eliquis, which is a brand name drug approved by medicare. i've been taking psychotropic drugs for almost 30 years and i find there is a vast difference over the years between the generics and the brand names. the active ingredients in brand names are the same but the generics are different. host: we want to take the time to say thank you. thanks so much for coming on to talk about your study. guest: thank you so much for having me. host: now we take you live to the house judiciary committee with that hearing
10:03 am
underway. [captions copyright national cable satellite corp. 2021] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org] >> you may unmute yourself any time to seek recognition. i will now recognize myself for an opening statement. as i said, good morning and welcome to a very important hearing, one that i hope will generate not only an understanding but, as i will ask a number of witnesses, legislative fixes that may be relevant and important in this rise of technology against a document that has been a living document for centuries, and that is the constitution of the united states of america. like many other leaps forward in technology, facial recognition offers our society promise and peril. proponents extol the potential benefits of policing, greater certainty in law enforcement, safer borders and, ultimately, a more efficient criminal justice system. at the same time, facial
10:04 am
recognition systems have clear potential for misuse, having been found to be inaccurate in identifying certain groups of people. the technology and how it has been used in prosecutions raises constitutional concerns that call into question the basic tenets of due process underpinning all criminal prosecutions. the consequences indicated by these darker applications require closer examination. congress has found itself at a disadvantage in understanding this powerful tool because the information on how law enforcement agencies have adopted facial recognition technology remains underreported or nonexistent. today, we are bringing facial recognition technology out of the shadows and into the light.
10:05 am
this dialogue must start with the simple question of whether the technology is sufficiently accurate to justify its use by the police and whether the risks of depriving someone of their liberty unjustly are too great. to add untested facial recognition technology to our policing would only serve to exacerbate the issues still plaguing our criminal justice system. large-scale adoption of this technology would inject further inequity into a system at a time when we should be moving to make the criminal justice system more equitable for americans, and we are working very hard to pass the george floyd justice in policing act. it was passed out of this committee many months ago. there are many unknowns but we can be certain that most if not
10:06 am
all facial recognition systems are less accurate for people of color and women and, for the most part, we can be confident the darker the skin tone, the higher the error rate. error rates have been up to 34% higher for darker skinned women than lighter skinned men. it is not just sometimes wrong, it can be wrong up to a third of the time. additionally, a 2019 study by the national institute of standards and technology found empirical evidence that most facial recognition algorithms exhibit demographic differentials that can worsen their accuracy based on a person's age, gender or race. according to the study, systems vary widely depending on the algorithms and types of search. america is a multicultural, multiethnic nation. this is particularly important as
10:07 am
it relates to these findings. asian and african-american people were up to 100 times more likely to be misidentified than white men. native americans had the highest false positive rate of all ethnicities, and women were more likely to be falsely identified than men. the elderly and children were more likely to be misidentified than those in other age groups, and the technology plays a strong role in trying to find missing persons. middle-aged white men generally benefited from the highest accuracy rate. in spite of these flaws, facial recognition technology is being rolled out across the country and has proven valuable to law enforcement in many instances. from the large to the small, facial recognition systems are being quietly incorporated into american policing. you have small departments in the united states that may not
10:08 am
have the training and are using these technologies. we have heard the success stories, like it being used to identify the shooter in the tragic capital gazette massacre and to help identify victims of sex trafficking by matching missing persons photos. facial recognition technology has been used to identify several of the insurrectionists of january 6. while we must applaud the use of new technology for the apprehension of domestic terrorists, this must be weighed against the fact that many of these systems, privately or government owned, operate with minimal transparency, limited oversight and faulty information security. with 18,000 different police departments across america of varying capability, it is noted that they are using technology
10:09 am
with minimal transparency and limited oversight and questionable security. individuals are testing this private software without the knowledge or approval of their agencies. many communities find facial recognition systems in their neighborhoods. adoption must involve community consideration, technological evaluation and sober legal analysis. these trends have developed and the government has been largely absent. what we don't know outweighs what we do know, and that needs to change. little thought has been put into the widespread adoption of a technology that can materially offer a benefit to law enforcement. this hearing is not meant to prejudge facial recognition technology or its value to law enforcement, but rather it is an important first step to proactively engage with this technology and adopt and adapt
10:10 am
our laws accordingly. as i indicated, it is important that we also work on productive and constructive legislation where it is needed. i look forward to engaging with the witnesses about the benefits and pitfalls inherent in facial recognition technology and forging a way forward toward transparency and accountability in its use. it is now my pleasure to recognize the ranking member of the subcommittee, the gentleman from arizona, for his opening statement. >> i thank you for yielding time to me and i thank you for holding this very important hearing on facial recognition technology and its use by law enforcement. there is so much of what you just said that i can associate myself with. there are some things that i think we can work together on, on a bipartisan basis with,
10:11 am
hopefully, all of the members of the united states congress. facial recognition technology is a powerful tool when used properly. when used improperly and relied upon too heavily, it raises serious concerns and has some real-world negative consequences. today, we will hear from a witness who was detained by the detroit police department. facial recognition technology is not perfect and honestly it's far from it. in order to be reliable, images must be captured in the most favorable conditions. facial recognition algorithms perform poorly on the faces of women, people of color, the elderly and children. i am also concerned about the potential for first and fourth amendment erosion.
10:12 am
law enforcement agencies could potentially use the system for the surveillance of individuals not involved in any suspicious activity whatsoever. moreover, federal law enforcement agencies, with assistance from their state and local law enforcement partners, have access to images of an overwhelming number of otherwise law-abiding citizens. state motor vehicle agencies possess high quality photographs of most citizens that can be used in facial recognition programs, and these could be combined with public surveillance. federal law enforcement agencies are also not limited to facial recognition technology systems they control or those controlled by other partners at the state and local level. federal law enforcement can use nongovernment facial recognition providers like vigilant solutions. law enforcement officers with a clearview ai account can upload
10:13 am
a photo of an unknown individual. the system returns search results and shows the potential photos of the unknown individual as well as links to the sites where the photos were obtained. the use of the technology is largely unregulated and raises numerous concerns. the government could use facial recognition technology to monitor or target people exercising their first amendment rights, including freedom of association. facial recognition technology poses privacy concerns, and improper use can violate first and fourth amendment rights. with these constitutional concerns with this technology, the gao issued a report that said not only had the law enforcement agencies not assessed privacy and other risks, but some of them don't know which systems their agents are using. i think this is a real
10:14 am
opportunity for us to gain some important information and knowledge today. i have enormous concerns that i will summarize. the technology here is problematic and inconsistent. images of law-abiding people have been acquired, and the security of the databases is an issue, as we've seen with recent breaches of databases. private companies are acquiring and scraping this information and selling it to the feds, and manipulation can take place. it expands the surveillance state we have been railing against. the gao raised important concerns. that's a brief summary of some of the things that i'm concerned with.
10:15 am
i think we can work to find common ground. if we are talking about nationalizing police forces, i don't think we will get there, but if we are finding some kind of meaningful regulation and oversight of the implementation of facial recognition technology, which is what the chair is alluding to, then i think we can find a lot of common ground here. with that, i would like to submit for the record a letter to the committee dated july 12, 2021 from matthew feeney of the cato institute. >> you want to finish the explanation? >> i also wish to submit an article from american military news from july of 2021 regarding the capitol police adoption of army surveillance gear to identify americans and recognize threats. i also wish to include several
10:16 am
articles about the alaskan couple who were misidentified through facial recognition technology that was erroneous. >> without objection, so ordered. i thank the ranking member of the subcommittee for his presentation and his opening statement, and i do believe, as i indicated, we should be engaged in oversight and legislative response, and i would like to find common ground with our colleagues. i think this is important in terms of law enforcement and the utilization of this technology, but also for the constitutional principles in which we all believe. it's now my pleasure to recognize the distinguished chairman of the full committee, the
10:17 am
gentleman from new york, chairman nadler, for his opening statement. >> thank you for convening this hearing. facial recognition technology is now able to capture images and analyze them in ways that would have been impossible only a decade ago. in some cases, the technology enables the real-time surveillance of individuals in a crowd. it is proliferating in a manner largely unchecked by congress. there is a tension we must address. this technology is now a commonplace fixture in our lives. on the other hand, most americans have little understanding of how the police use facial recognition technology to conduct surveillance of communities across the country, for better or
10:18 am
for worse. thanks to the work of the government accountability office, we have our first glimpse into federal law enforcement's use of facial recognition technology. in its recent study, they found 20 of the 42 federal law enforcement agencies surveyed use some form of facial recognition technology. this report leaves many unanswered questions about how the technology is used, when it is used and what type of mechanisms are in place to guard against abuse or protect sensitive data. for example, of the 14 federal agencies, only one agency could provide complete information about what systems their employees are using in the field. we have received several disturbing reports of the misapplication of facial recognition technology by local police departments
10:19 am
when attempting to match women and people of color. in some cases, facial recognition is deployed without proper consideration of due process rights. one witness today will testify to the misidentification he suffered and what effect it can have on an individual's life. he deserves better from the law enforcement agencies entrusted to use a technology we all know is less accurate when applied to citizens that look like him. facial recognition technology can be an important crime-fighting tool. like all such tools, we need to ensure that the adoption of this new technology does not further erode trust in law enforcement and the communities they serve and does not compound the unfairness of a criminal justice system that too often impacts people of color disproportionately. the subcommittee will ask the critical question --
10:20 am
the hearing will not answer every question about the adoption of facial recognition technology by law enforcement. my hope is that we will take another important step toward balancing the needs of law enforcement agencies and the priority of keeping americans safe from crime with the concerns about this technology expressed by communities across this country. our task is not an easy one. we want an america where every individual knows they and their family are safe from harm while their fundamental rights are protected. we want an america where law enforcement has the tools it needs to protect and serve but with clear guidelines and rules that prevent those tools from being abused. understanding how to make that real in a moment where technology keeps changing and evolving will take a tremendous amount of work, but this is worth doing. i thank the chairwoman for
10:21 am
calling this hearing. i look forward to hearing from our witnesses, especially bertram lee, who once worked in my office. it's good to see you and i look forward to your testimony. i thank all of the witnesses for taking the time to engage with us on this important topic. i look forward to the testimony and i yield back the balance of my time. >> i thank the chairman very much for his presentation. now it's my pleasure to recognize the distinguished ranking member, the gentleman from ohio, mr. jordan, for his opening statement. >> i look forward to today's hearing on facial recognition technology and i hope we can approach this issue in a truly bipartisan manner. last congress, we worked closely with our colleagues on this issue. we had many vigorous
10:22 am
debates, but facial recognition technology was one issue where we shared common ground. we held three bipartisan hearings in the oversight committee on this technology. we learned many important facts about facial recognition technology. there are serious first amendment and fourth amendment concerns. there are also due process concerns about how law enforcement uses the technology. we learned that over 20 states' departments of motor vehicles have handed over their drivers license data. no american citizen gave their consent to have their data and images turned over, and no elected official voted to allow this to happen. every american should be troubled by this. i know i am and i hope my colleagues on this committee are as well. the gao issued a new report about how federal law enforcement uses frt. this makes clear the federal law enforcement agencies have not even assessed the risks when
10:23 am
using this technology. if they haven't assessed the risks, they are not taking them into consideration when using the technology. the gao surveyed federal law enforcement on their use of the technology, and 14 agencies that reported using the technology to support criminal investigations also reported using systems owned by nonfederal entities. only one agency had awareness of which nonfederal systems are used. not only are federal law enforcement agencies not assessing the risks of using facial recognition technology, most don't know which systems their employees are using. it's difficult for a federal agency to assess the risks of using a system if it does not know which system is being used. i look forward to learning more and listening to our witnesses. i yield back. >> i thank the gentleman for his opening statement. without
10:24 am
objection, all other opening statements of our members will be included in the record. before i introduce today's witnesses, let me thank them for their participation. this is a virtual hearing that is no less crucial and important, and we thank you for participating in that manner through technology. the first witness, gretta goodwin, leads the gao's work in this area. her portfolio includes federal law enforcement oversight and training, civil liberties and civil rights, vulnerable populations, the federal prison system and the federal judiciary. director goodwin has a phd and masters degree in economics.
10:25 am
glad to see there is a houston connection. professor barry friedman serves as a faculty director for the policing project at new york university school of law. he is a professor of law and an affiliated professor of politics. he has litigated and written about constitutional law, the federal courts and policing for over 30 years and has produced numerous books and articles. he graduated with honors from the university of chicago and received his law degree from georgetown university law center. welcome. robert williams lives in farmington hills, michigan with his wife and two young daughters. he works as a logistics planner in the automotive industry. he was wrongfully arrested by the detroit police department in
10:26 am
2020 when facial recognition technology misidentified him. he has since received an apology, and we are grateful for the testimony you present today. bertram lee is a counsel for media and technology. he works to advance the interests of marginalized communities. his portfolio includes broadband access, surveillance technology and artificial intelligence. he previously worked as policy counsel at public knowledge and at the u.s. house committee on education and labor. he received his jd from howard university school of law. welcome. we also have a research fellow for the center of technology policy at the heritage
10:27 am
foundation. her research focuses on big tech and policy. she was a fellow at the center for a new american security, served as a senior intelligence analyst for a u.s. naval command and helped create facebook's global security and counterterrorism program. she received her ma from king's college london. welcome. professor jennifer laurin serves as a professor at the university of texas school of law. she studies and writes about law and institutional design shaping the function of criminal justice institutions. she currently reports to the american bar association on their standards task force and is a former chair of the capital punishment assessment team.
10:28 am
she received her undergraduate degree in politics and her jd from columbia law school. welcome to you. brett tolman is the founder of the tolman group and previously served as a shareholder at a law firm. previously, he also served as a u.s. attorney for the district of utah and was selected by the attorney general to serve as special advisor to the attorney general on national and international policy issues affecting united states attorneys and the department of justice. he also served as legal counsel to the united states senate judiciary committee. he received his ba and jd from brigham young university. welcome. dr. cedric alexander is a law enforcement expert with over 40 years of experience in public safety. he has appeared on national
10:29 am
media networks to provide comments on police-community matters and has written numerous editorials and books on the subject. he previously served as deputy commissioner of the new york state division of criminal justice services and chief of police in dekalb county, georgia, and is a past national president of the national organization of black law enforcement executives, a powerful law enforcement organization. he was appointed by president obama to serve on the task force on 21st century policing. he received a ba from st. thomas university and a doctorate from wright state university. welcome. usually, i stand when i provide the oath, but we welcome our distinguished witnesses and thank them for their participation today. i will begin by swearing in our witnesses. if you would, please turn on your audio and make sure i can see your face and your raised right hand while i administer the oath.
10:30 am
please unmute and let me see your raised right hand. do you swear or affirm that the testimony you are about to give today will be true and correct to the best of your knowledge, information and belief, so help you god? >> i do. >> thank you so very much. let the record show the witnesses have answered in the affirmative. thank you very much. each of your written statements will be introduced into the record in its entirety. i ask that you summarize your testimony in five minutes. there is a timer in the zoom view that should be visible on your screen. director goodwin, you may begin. welcome. you are recognized for five minutes. >> thank you.
10:31 am
i am pleased to be here today to discuss federal law enforcement's use of facial recognition technology. this technology has expanded in recent years, and questions exist regarding the accuracy of the technology, the transparency in its usage and the protection of privacy and civil liberties when the technology is used. today, i will discuss the ownership and use of facial recognition technology by federal agencies, the types of activities the agencies use the technology to support, and the tracking of the technology's use. we surveyed 42 federal agencies that employ law enforcement officers.
10:32 am
20 of them reported that they own or use a facial recognition technology system. 12 used only another entity's system, and five agencies both owned a system and used another entity's. agencies noted that some systems can include billions of photos. federal agencies reported various activities, such as criminal investigations, surveillance and managing business operations during the covid-19 pandemic. of the 20 agencies, 14 reported using it to support criminal investigations. that included investigations of violent crimes and identity fraud. looking closely at these 14 agencies, some of them told us
10:33 am
they used the technology last summer to support criminal investigations related to civil unrest, riots or protests following the killing of mr. george floyd. in addition, a few of these agencies told us they used the technology on people involved in the u.s. capitol attack to generate leads for criminal investigations. of these 14 agencies, we found that 13 did not have complete, up-to-date information on what nonfederal systems were being used by their employees. these agencies had not fully assessed the potential risks related to the privacy and accuracy of using these systems. to be clear, some agencies knew employees were using certain systems. agencies may have had formal agreements or contracts with private companies. however, many agencies did not
10:34 am
answer our questions, and many agencies said they did not use the technology and later changed their answers. what these 13 agencies have in common is that they don't have a process to track what nonfederal systems their employees are using. if the accuracy of facial recognition technology is reduced dramatically, the search will produce inaccurate results. if it's not sufficiently accurate, it can erroneously identify people. it could also miss investigative leads that otherwise would have been reviewed. the technology could result in adverse consequences for individuals. we recommended the agencies implement a mechanism to track what nonfederal systems for facial recognition technology are used by employees
10:35 am
and assess the risks of using the systems, including accuracy-related risks. facial recognition technology capabilities are only going to get stronger. if law enforcement agencies do not know if and how their employees are using the technology, they cannot ensure that the appropriate protections are in place. this concludes my remarks. i am happy to answer any questions you may have. >> thank you for your testimony. i now recognize mr. barry friedman for five minutes. >> i want to thank you for giving me the opportunity to testify today. i am the founding director of the policing project at the new
10:36 am
york university school of law. our mission relates directly to what you're doing today. we partner with all stakeholders, work with communities and work closely with law enforcement agencies. we have the goal of bringing democratic accountability, transparency and equity to policing, and i want to stress two parts of that. on the issue of democratic accountability, one of my key points is that these technologies should not be used without proper democratic accountability. i commend the subcommittee for having this hearing to pursue something that's essential. we do work with all stakeholders. i was incredibly gratified to listen to all the members who spoke to the need and the ability to work in a bipartisan way on this issue. i believe there are strong,
10:37 am
passionate feelings about these technologies, but there is a way forward, and the question is how to get there. the approach that we use at the policing project, particularly for emerging technologies, is one that rests on cost-benefit analysis. you look at the benefits, and as many of you said, there are definite benefits to policing from this technology. many of you have also indicated there are very serious costs and potential harms. there are racial harms from disparity, privacy harms, and the harm of giving too much power to government. there are concerns about the first amendment. some people look at cost-benefit analysis as weighing whether to do something or not. that's exactly the wrong approach. the value of the cost-benefit analysis is that we can find a nuanced way to regulate that
10:38 am
could maximize the benefits and minimize or mitigate the harms. i have spoken to many people over the last months and years about this issue. there are strong views on both sides, including people who think the technology should be banned. when i talk to law enforcement, or when i am talking to racial justice advocates or people in the technology companies themselves, even though they disagree about the use of the technology, there is a wide swath of agreement that dangerous and shoddy practices are not ok. i want to dive into the nuance of those practices. on issues of accuracy and bias, we heard a lot about that, and i think it's time, and i hope we stop saying the technology is
10:39 am
81% accurate. we should talk about false negatives and false positives. we don't really know anything about the accuracy of this technology, because law enforcement uses the technology in a way different than it has been tested. law enforcement uses different probe images than those used in testing. law enforcement uses much larger databases, and the errors increase as the database gets larger. a lot of people say not to worry, there is a human in the loop, but i would appeal to your common sense. that can be a good thing or a bad thing. humans in the loop would be a good thing if an independent individual who had no idea what the algorithm suggested identified the same person. humans are problematic because
10:40 am
they are just confirming what has been told to them. we need testing and means of ensuring that if there is a human interaction, that human works to increase the accuracy. i want to urge you to think seriously about regulating the vendors themselves. those are the folks who know how the technology works. they could work on what photos are acceptable and what the right settings are to make sure we get accurate results. thank you for listening to my testimony. >> thank you very much, professor friedman. i now recognize robert williams. we are the presiders over law enforcement in this nation and certainly, we welcome your testimony. thank you. i recognize you for five minutes.
10:41 am
>> ok. thank you for the opportunity to testify today. in january 2020, i got a call from my wife, and she told me that some detectives had called her and told her to tell me to turn myself in. she said they would be calling me shortly because they didn't have my phone number. they called me and i answered, and i asked them what it was about, and he told me he couldn't tell me. all he said was i needed to turn myself in. i assumed it was a crank call until i hung up, and i called my wife and told her to disregard it. they thought maybe i was trying to hide something.
10:42 am
i told her to call our local police department, and she did, and they said they had no warrants or anything and to disregard the call. i proceeded to drive home, and when i got home, there was a detroit police car parked on my street, and when i pulled into the driveway, they blocked me in as if i was going to make a run for it. i pulled up and he said, are you robert williams? i said, yes i am. he said, you're under arrest. i said, you can't just arrest me like that, what am i under arrest for? he said, don't worry about it, and proceeded to put the handcuffs on me. i told my wife and my kids, who were coming out of the house, don't worry about it, they are making a mistake, i will be right back. unfortunately, i was not right back. i had to spend
10:43 am
30 hours in jail. i kept asking why i was being arrested, and they said, we can't tell you that. they showed me the warrant and i asked how they got it. they said the head detective got it. in the cell, the others were talking about things they had done and things they got away with. i was wondering why i was there, and nobody ever came to tell me. the next day, when i went to court, i pleaded not guilty. they took a recess and some
10:44 am
detectives asked me some questions. they had me sign a paper saying that i would waive my right to counsel so that they could show me what was on the paper. he showed me a picture and asked when i was at the store the last time, and i said a couple of years ago. he said, is that you? i said, that's not me. and he said, i guess that's not you either. i said, i hope you don't think all black people look alike. he turned over another paper and said, i guess the computer got it wrong. i said, i guess so, and am i free to go? he said, i'm sorry, but we are not the detectives on your case, so we cannot let you go. they took me back to my cell. i received a bond and got out.
10:45 am
my wife contacted the aclu, and we started getting information about the case, and we realized that it was a wrongful facial recognition match. i've been fighting it ever since. i don't think it's right that my picture was used in some type of lineup when i have never been in trouble. i want to say thank you for the opportunity to share my testimony, and i hope that congress does something about this. i'm happy to take any questions you might have. >> a powerful statement, mr. williams, and you're tracking what i began with as i opened this hearing, that not only should we engage in oversight but i think
10:46 am
we should engage in the right kind of fix, and i am very apologetic to you and your family for having to go through this. our next witness is mr. bertram lee junior. you are recognized for five minutes. >> thank you for the opportunity to testify today. thank you for calling this hearing and shining a light on an undeniable threat: the threat this technology poses to black and brown people. we are at a turning point and must reject policies and practices rooted in discrimination, and move toward public safety that respects the humanity, dignity and human rights of all people. the leadership conference has spoken out against law enforcement use of facial recognition since 2016, highlighting the technology's inherent bias.
10:47 am
last month, the leadership conference, along with upturn and new america's open technology institute, released a statement signed by 40 advocacy organizations explaining that its use exacerbates the harm police do in communities and threatens individual and community privacy. it violates due process rights and otherwise infringes on procedural justice, and it often relies on face prints that have been obtained without consent. in addition to racial bias, the technology itself poses a disproportionate risk for black, asian and indigenous people. the possibility of misidentification poses a tremendous threat to the health, safety and well-being of communities. the reasons for these problems
10:48 am
vary. in some cases, the cause of the bias was within the database against which an image is searched; in others, it was due to the historical bias built into the algorithm itself. the outcome is the same: black people are disproportionately misidentified, and the technology entrenches racial disparities in the criminal system. in a comparison of matches by country of origin, photos of people from east african countries had false positive rates 100 times higher than the baseline rate. researchers found facial analysis algorithms misclassified black women nearly 35% of the time while getting it right for white men. even if the accuracy were improved, the fundamental issue remains: facial recognition technology dangerously expands the scope and power of law enforcement across different neighborhoods.
10:49 am
algorithms could enable governments to track the public movements, habits and associations of all people at all times with the push of a button. according to a report from the georgetown center on privacy and technology, more than 133 million american adults are included in facial recognition databases across the country, and at least one in four police departments can run facial recognition software through their systems. as the technology rapidly emerges in everyday police activity, safeguards to ensure the technology is used fairly and responsibly are virtually nonexistent. as congress considers our nation's approach to policing and punishment, it must not accelerate law enforcement's ability to wield this powerful technology, especially when history provides little reason to think it would be used responsibly. we are safer
10:50 am
when our communities are healthy and thriving. policies of mass criminalization and overpolicing are fueled by racism. it's about social control, not public safety. the leadership conference asks congress to act now by placing a moratorium on law enforcement use of facial recognition tools. families and communities will endure unspeakable harm and tragedy on our government's watch as long as we allow these policies to continue. time is of the essence, and the leadership conference looks forward to working with the subcommittee to address serious concerns about the technology and work toward a new vision of justice that keeps communities safe. thank you. >> thank you for your testimony, and as well for that insightful constitutional analysis, which is important.
10:51 am
i would like to recognize ms. kara frederick. >> thank you for the opportunity to testify today. in four years, a projected 6 billion people will have access to the internet, with 80 billion devices connected to the web. this creates profound digital surveillance opportunities, with facial recognition as a standard capability. some uses of facial recognition address legitimate safety concerns, such as finding the capital gazette shooter and detecting individuals traveling without passports. but the potential for misuse is also high, and the risks are manifold. i want to focus on three risks in particular. the first is the circumscription of civil liberties. today, the bureau has access to over 640 million photos, in some
10:52 am
cases from private companies. the second risk is outsourcing this service to unaccountable companies that have a history of reckless privacy practices. government agencies have long used private firms to identify patterns and useful information, and a renewed push to make use of outside expertise for domestic spying to counter domestic extremism gives cause for concern. such impulses to outsource these capabilities require scrutiny, which leads to the third risk: the potential integration of facial recognition data with other personally identifiable information, creating the capability of mass surveillance. it can allow governments to look
10:53 am
through patterns of citizen behavior. lower latency provides quicker transmission of increased data flows, and more compute power, along with algorithms that extract data, fit together in mutually reinforcing ways. this can engender a climate of fear, self-censorship, and the chilling of free speech and the right to peaceably assemble in public spaces. authoritarian powers like china are on the leading edge of using facial recognition this way, and the expansion of this power renders the slope slippery. once these powers expand, they almost never contract. the trendlines are foreboding.
10:54 am
municipalities are using covid to extend surveillance. georgia became the first to use smart cameras. combined with near historically low levels of public trust, the unfettered deployment of these technologies will strain the health of the body politic without immediate and genuine safeguards. technology has long outpaced our efforts to govern it. congress needs a national data framework with meaningful oversight of how federal, state and local entities use these tools. there should be clear policies on data retention that categorize biometric data and limit interoperability to stymie mass surveillance. congress should ensure that u.s. identity management systems are secure and reliable, based on proper measures and standards.
10:55 am
programmers should build in data privacy protections in the design phase, testing new methods of encryption, differential privacy, or federated models of machine learning for these systems. the debate over public safety and privacy trade-offs in our republic provides an opportunity for us to imbue technology with privacy protections from the outset and shore up the system of checks and balances to address infringements that occur. how we decide to use and safeguard these technologies draws us closer either to the state we should be or to authoritarian practices. i look forward to taking questions. >> thank you for your testimony. we are now able to recognize professor lauren for five minutes. >> thank you very much for the
10:56 am
opportunity to address the subcommittee on the topic of facial recognition technology. i will speak to a small sliver of this vast and complex topic, the part that intersects my own research on forensic science in criminal investigation and adjudication. the three bottom-line points are these. first, although facial recognition technology matches have not been held admissible, law enforcement's use of them forms the basis of criminal convictions and incarceration at the federal and state level. second, while facial recognition has the capacity to be highly accurate under ideal conditions, there is the greatest cause for concern about accuracy in the conditions under which
10:57 am
law enforcement actually uses it. third, facial recognition technology leads to deprivations of liberty even though courts do not currently admit facial recognition matches as testimony or other evidence to establish that a person is the perpetrator of a crime. some matches have been relied on in proceedings that are not governed by rules of admissibility, like bail hearings. more critically, the overwhelming majority of criminal convictions begin with no presentation of evidence before a jury at all: 98% of criminal convictions are obtained through guilty pleas. in those cases, and given the low evidentiary threshold of probable cause, facial recognition matches can generate a criminal conviction without any courtroom testing of the evidence. even when facial recognition is
10:58 am
used by law enforcement only to generate an initial lead, that match has the potential, given the stickiness of cognitive biases and the real potential for facial recognition matches to produce misidentifications, to send investigators down an erroneous path, with enormous consequences for a wrongly accused person. as others have noted today, despite significant advances, significant concerns remain about facial recognition technology. the fact that there is significant variability among facial recognition vendors, and
10:59 am
that law enforcement relies on a variety of unknown sources, points to a need for disclosure of what particular systems are being used. aside from variability in accuracy across algorithms, law enforcement use raises special accuracy concerns because it is particularly likely that less than ideal images will be used. studies of facial recognition find low error rates only when using cooperative images; declines in accuracy are seen when using images not captured under ideal conditions, the very types of images used in criminal investigations. none of this might give rise to a need for intervention if the criminal legal system were well designed to vet the adequacy of facial recognition use. it is not, due to the fact that as long as facial recognition
11:00 am
matches are used as investigative tools and not relied on as evidence, there is little opportunity for a defendant who was identified to challenge the practice in court. restrictive discovery, which limits the sharing of case information between the prosecution and defense, means that defendants are severely hobbled in litigating the use of facial recognition technology. defendants do not have access to the investigative file. they may not even learn that a facial recognition procedure was performed. the timing means little opportunity exists for the defense to vet the reliability of the evidence before entering a plea. defense claims that they are constitutionally entitled to information about frt procedures under the holding of brady v. maryland have gained little or no traction in courts. in sum, frt is an alluring forensic tool that has the potential to accurately open otherwise unavailable
11:01 am
investigative avenues, but it also has the potential to inject a new source of error into criminal adjudication. troublingly, the federal criminal legal system is not well designed to smoke out questionable uses of frt through the ordinary operation of processing criminal cases, making consideration of other means of regulation and oversight especially necessary. >> i now recognize brett tolman for five minutes. >> thank you for the opportunity to testify today. my name is brett tolman, and i am the executive director of right on crime, a conservative organization dedicated to the promotion of criminal justice policies that promote public safety, individual liberty, and whole communities. i have
11:02 am
previously served as the united states attorney for the district of utah, as an assistant united states attorney, and as chief counsel for crime and terrorism for the united states senate judiciary committee. for the past decade i have also worked in private practice as the founder of the tolman group, focusing on government reform, criminal defense, and internal corporate investigations, and previously as a shareholder and chair of the white collar, corporate compliance and government investigations section of the law firm of ray quinney & nebeker, pc. i am encouraged by the subcommittee's decision to hold a hearing today on the use of facial recognition technology by law enforcement entities. it is an area of increasing concern that is ripe for abuse and could stand to benefit from congressional attention. the first issue to address is a concerning lack of transparency and basic information relating to law enforcement's use of facial recognition technology. many of the fundamental questions that you likely want and deserve answers to are currently unanswerable. questions such as: how many law enforcement entities use facial recognition? how often do they use it? who is in their databases and why? to the extent that we have partial answers to some of these questions, the facts are daunting. recently, the government accountability office revealed that beyond those federal law enforcement agencies one might suspect of using facial recognition, like the federal bureau of investigation, there are a host of others with less
11:03 am
apparent need who also use the technology, for example, the u.s. fish and wildlife service and the internal revenue service. nationally, one estimate placed the government market for facial recognition in this country at $136.9 million in 2018, with the expectation that it would nearly triple by 2025. another estimate suggested that as of 2016, at least 1 in 4 police departments had the option to run facial recognition searches, a number that has surely grown since then. also discomforting are the sources of the photos supporting this facial recognition technology. in addition to drawing on pictures secured through the criminal justice process, the government utilizes millions of photos of law-abiding individuals collected for driver's licenses and passports. private technology companies that contract with law enforcement have harvested billions of photos posted by unsuspecting users on platforms such as facebook, youtube, and even venmo. this collection is an unprecedented invasion of privacy that places enormous, undue control in the hands of the government and big tech, two entities not always known for their light touch or responsible use of power. mass surveillance compounds the issues surrounding mass collection. walking out the door in the morning can be an exercise in skipping from one camera to another, as everywhere from the convenience store on the corner to the traffic camera at the light in front of it records our actions. to this perpetual passive surveillance, law
11:04 am
enforcement can potentially add recordings from body-worn cameras or simply from the smartphone in an officer's pocket. in short, there are very few instances where law enforcement will not have the opportunity to subject a person of interest to facial recognition technology. inevitably, the first temptation will be to use facial recognition by default rather than necessity. instead of limiting the practice to serious public safety risks or only after due process has been afforded an individual, officers may use and have already used the practice for run-of-the-mill interactions and minor cases. our founding fathers deliberately and prudently enshrined in the bill of rights proscriptions on the wanton search of americans as a necessary bulwark for freedom. it is hard to square these notions and protections with the unfettered use of a technology that can instantaneously reveal an individual's identity as well as potentially their associations and prior movements. the unrestricted use of facial recognition differs little from allowing police to collect fingerprints or dna from anyone and everyone they come across, an abuse that we clearly do not, and should not, tolerate. the implications of such access
11:05 am
to personal information become especially troubling when dealing with real-time interactions between law enforcement and members of the public. we americans have long prided ourselves on our ability to refuse the government unless it has legitimate cause to interfere with our liberty. our police, at least in the absence of reasonable suspicion of wrongdoing or additional judicial process, are supposed to rely on the consent of the citizenry in their interactions. facial recognition has the power to stand this principle on its head and turn questions from law enforcement about personal information into mere rhetorical ones. in addition to identifying individuals, facial recognition technology makes it much easier for law enforcement to track them. it can essentially automate surveillance across the network of cameras mentioned above, following individuals with minimal effort or oversight. it could also mean historical tracking. once somebody lands on law enforcement's radar, there is little stopping officers from running their image through available videos and photos to reconstruct past days, their associations or affiliations, and other aspects of their life or routine. furthermore, we cannot ignore the risk that facial recognition technology will be used to target certain americans. facial recognition can instantaneously strip away the anonymity of crowds, and ultimately threaten our
11:06 am
constitutional rights of assembly and association. consider the chilling effect facial recognition could have if used to identify individuals at a political rally or government protest. this technology could be used to identify everyone who simply visits a gun store or any other place an individual is exercising a personal freedom or choice that a particular politician or other individual heading an agency considers undesirable. this practice grossly lacks the necessary accountability. right now, we have little more than vague assurances that we should trust the government to safely use the incredible power of facial recognition technology without such information or oversight. not long ago, i was tasked with leading the effort in the senate to reauthorize the patriot act. we heard similar assurances years ago, by those leading the department of justice and the fbi about fisa and those surveillance authorities not being turned against honest americans. we have seen how that worked out as outlined in the recent, and disturbing, inspector general reports. none of this is to say that law enforcement should never have access to facial recognition technology. while china's unconscionable use of facial recognition technology to enhance and accelerate
11:07 am
its undemocratic control of its citizenry is a warning of the costs of failure, i do not believe it is the inevitable consequence of any use of facial recognition technology by law enforcement. further, it is unrealistic to expect law enforcement officials to permanently deny themselves a tool that is increasingly prevalent in the commercial sector and which has such powerful capacity to improve public safety. it is easy to contrive a scenario in which we would very much want law enforcement to have facial recognition technology available: searching for a known terrorist loose in one of our cities presenting an imminent risk to the public, or seeking to identify a mass-murder suspect, for example. and while i harbor a conservative's healthy skepticism of government, i respect members of law enforcement and will always seek to support them and their mission to keep us all safe. however, acknowledging there are credible uses for facial recognition technology and voicing support for law enforcement is not the same as writing a blank check for power and then looking the other way. in the past, members of this subcommittee and others in congress have proposed a moratorium on the use of facial recognition technology by law enforcement. significant restraints might be
11:08 am
necessary for privacy protections to catch up to the rapidly advancing capabilities of this technology. if nothing else, we need a reset in which law enforcement agencies must gain affirmative permission from relevant democratically elected representatives to use this technology prior to its use; that way transparency, accountability, and proper use guidelines can be established. as it stands, it is difficult to see how the regular, widespread use of this technology can meet the high standards of our constitution and its protection of civil liberties or the norms inherent in a democratic society. if one considers this technology as creating the ability to find a needle in a haystack, it seems entirely reasonable to demand that the police first know which needle they are looking for, rather than search every haystack on the off chance it contains a needle. absent an emergency or other exceptional circumstance, some manner of reasonable suspicion should likely be a prerequisite to a facial recognition search. likewise, we should be exceedingly cautious about the deployment of real-time facial recognition, which has more disturbing fourth amendment implications. finally, we need transparency on when, how, and why every law enforcement agency is using facial recognition and which databases they are drawing from, and we should have particular reticence when it comes to the collection of images from unwitting, law-abiding individuals. this is a pressing issue pertaining to all americans' fundamental rights. as someone who has spent a great deal of
11:09 am
time working on legislation in this arena and someone who fundamentally believes that smaller government is better government, i am encouraged that the subcommittee is concentrating its focus on these concerns today. i expect that its resolution will require many conversations, careful balancing of tradeoffs, and potentially difficult decisions. i look forward to contributing however i can to that effort today. >> thank you for your testimony. i now recognize dr. cedric alexander for five minutes.
11:10 am
>> on july 9, the portland, maine, press herald reported on two newly enacted laws that make maine the first us state to enact a broad ban on government use of facial recognition technology. in maine, state, county, and municipal governments will not be allowed to use any sort of frt. law enforcement may use the technology for investigating certain serious crimes, but state law enforcement agencies are barred from implementing their own frt systems. they may request frt searches from the fbi and the state bureau of motor vehicles in certain cases. a year earlier, in august 2020, the portland city council totally banned the use of facial recognition by the police. indeed, currently some 24 american cities ban police frt use, and black lives matter has called for a nationwide ban because of evidence that false positive identifications in the case of people of color exceed the rate for whites. so far, the federal
11:11 am
government has enacted no laws on how law enforcement may or may not use frt. the benefits of facial recognition systems for policing are quite evident. it is a fact that the technology can aid in the detection and prevention of crime. for example: facial recognition is effectively used when issuing identity documents and is usually combined with other long-accepted biometric technologies, such as fingerprints and iris scans. frt face matching is used at border checks to compare the portrait on digitized biometric passports with the holder's face. in the united states, at least 26 states (perhaps more) permit law enforcement to run searches against their databases of driver's license and id photos. (the fbi has access to the driver's license photos of 18 states.) frt was used to identify suspects in the january 6 violent breach of the u.s. capitol, and high-definition digital photography and videography (sometimes deployed from aerial drones) may be used increasingly to identify faces in mass events. facial recognition closed-circuit tv systems may be used in such public security missions as finding missing children and disoriented adults, identifying and tracking criminals and criminal suspects, and generally supporting and accelerating investigations. currently, the fbi is the nation's leading law enforcement agency using frt, and it has developed technologies and best practices to promote the intelligent use of the technology to reduce errors and protect constitutional rights. in addition, the facial identification scientific working group (fiswg), operating under the national institute of standards
11:12 am
and technology (nist) organization of scientific area committees, is working to develop standards in frt. facial recognition technology has been useful in law enforcement and, i believe, will continue to develop technically and therefore become even more useful. blanket bans on frt in policing are unwarranted and deny to police agencies a tool that is an important aid to public safety. i understand the importance of this technology, but i also understand how it can be easily abused if those using it are not properly trained and certified and if best practices are not evident. thank you, madam chair. >> thank you to all the witnesses. we obviously have opened up, to use a parochial and often used nonlegal term, a can of worms that we need to address, and we will now proceed with members' questions. i am grateful for the attendance of all members, both on the minority side and on the
11:13 am
majority, who will now proceed under the five-minute rule of questions. i will begin by recognizing myself for five minutes. director goodwin, you conducted the study that addressed accuracy. what are some of the concerns related to facial recognition system accuracy? >> thank you. a number of concerns were raised in our report concerning accuracy, and i will focus on one. think about a person's face. it is distinct, and it typically does not change. if that information is breached or hacked, that is totally different from your password being breached. you can change your password,
11:14 am
but you cannot change your face. in the report, we noted this concern. misidentification based on a facial image is of great concern. that is in our report. >> thank you. mr. williams, you were wrongfully arrested in front of your wife and child because of facial recognition. your story is stunning: the detective holding up a picture alongside your face, your comment, do we all look alike. you were allegedly a suspect in an investigation. prosecutors dropped the charges, and the detroit police chief apologized for sloppy detective work. tell us what impact your detention had on you and your family. you are a gentleman with employment and a family. how did that impact you? >> thank you.
11:15 am
the first thing, on the drive home, my wife was saying how she had to figure out how to get in contact with the detention center because they kept hanging up on her and telling her they couldn't answer any questions, just that i was there, and that's all they could tell her. when i got home, i asked my wife what was going on, and my daughter had taken pictures of us and turned them around because she said she could not bear to look at me while i was in jail. i had to talk to a five-year-old, at the time, about a mistake that had been made by the police, and it is hard for a child to understand that sometimes they make mistakes. you tell her as a child that the
11:16 am
police are your friends, and if you ever get in trouble, this is who you go to and this is who is going to help. we thought about taking her to a psychiatrist because when we replay the video of what happened, she cannot stand to watch it. she gets emotional. it is understandable, and we have to go through that. it's painful. thank you. >> thank you. professor friedman, what are the privacy implications and risks to communities if a private company has access to details of law enforcement investigations? you might add your thoughts about a legislative fix for how we
11:17 am
address this question. >> that was an incredibly moving story. i have kids, and i can only imagine. the problem, as i think mr. alexander pointed out, is that with the use of this technology we are going to have mistakes, all kinds of trouble. there are uses of the technology for law enforcement, but we need a full set of regulations. i set out a bunch of them in my testimony. i think you could do a lot of this by regulating the vendors directly. you could set a number of standards, including that the companies themselves have to regulate and identify what are effective images, the size of the databases, and the thresholds to avoid false positives or false negatives, and that the technology is self-auditing so we know how it was
11:18 am
used by law enforcement. a lot of things can be done. >> thank you. dr. alexander, what are your thoughts on how law enforcement authorities can reduce what the detroit officials called sloppy detective work? dr. alexander? >> thank you, madam chair. having been a former chief, that would suggest to me that i have a larger systemic issue in the way investigations are conducted. here is the issue for me. the technology is one we use every day. we pick up our cell phones, we use them to access calls or text messages. the biggest issue here for me is and will continue to be when you have victims such as mr.
11:19 am
williams, people who are innocently going along their way, because of a failure on our part in policing, the inability to train people appropriately. they have to be trained. there have to be certifications. there have to be best practices. there needs to be certification, i believe, at a state level and a federal level, and it has to be ongoing, because one thing that we know currently is that we don't train enough. when you're utilizing that type of technology, which can easily infringe on someone's fourth amendment rights, i think we have to be very careful. we also have to remember where we came from, even in dna technology. there, we have seen over the years, particularly in the past, dna labs shut down because of people's inability to carry out that professional function based on certification
11:20 am
and accuracy. we don't want to find ourselves using this technology and doing the same things that we have done in the past, because what it will continue to do is drive the proverbial wedge even further apart between policing and communities, and that is not a road we want to continue to go down. >> thank you. my time has expired. i know that there are many other questions that i have interest in. let me yield to the ranking member of the subcommittee for five minutes. >> thank you. ms. goodwin, thank you for being here today. you found that 13 of the 14 agencies you surveyed that reported using nonfederal systems did not have complete,
11:21 am
up-to-date information about what systems their employees were using. is that a correct interpretation? >> yes. >> even one of the agencies you worked with had to poll its employees to see what systems they were using. >> that is correct. one of the agencies initially told us that they did not have the technology. when they polled their employees, they found out it had been used in over 1,000 searches. >> if agency leadership does not even know what systems their employees are using, how can they be sure they are not being misused and that the constitutional rights of american citizens are not being violated? >> this speaks to the heart of the recommendation that we made: that agencies have an awareness of what systems they are using, particularly the nonfederal systems their employees are
11:22 am
using. then they can begin to ensure that appropriate protections are in place related to privacy and accuracy. >> ms. frederick, we know that agencies are using private search services and that those companies are collecting images. how are those images acquired? >> it depends on the private company. a lot of these companies claim to scrape the internet, social media sites like facebook. they are scraping the internet where people are uploading these photos. people are not consenting for these images to be used by law
11:23 am
enforcement, but by using these contractors, they are using your regular photos from your grandmother, your family, your sisters, your daughters. it is pretty much a government-infused wild west out there. >> mr. tolman, these private systems, let's leave out law-enforcement systems, what steps are they taking to secure their systems? >> the reality is, they are more interested in the dollars that are behind this technology than they are in securing civil liberties. that will always be the case when private companies are contracting and utilizing a powerful technology. their interest is going to be driven by their bottom line. >> these agencies, like federal
11:24 am
agencies that are using private services, is it just easier for them to contract out than to develop their own system, i assume? >> yes, the capability is hundreds of millions of photos versus what they would be able to gather themselves with limited staff and resources. >> how are we going to ensure that americans are not being targeted by federal agencies using this facial recognition technology? what are your policy prescriptions? >> the legislature is going to have to enact regulations that require any agency, whether they are contracting or not with big tech or other companies, to have policies that allow for the proper legal use of this.
11:25 am
i prosecuted the kidnapper of elizabeth smart. i imagine during that case, when i was a prosecutor, we would have welcomed law enforcement's ability to use something like that to try to find her. that is a far cry from downloading all you can from social media and then targeting others without the legal justification, or without regulations or policies that have been approved by congress. >> mr. lee, i'm going to ask you the same question. what policy prescriptions do you advocate so that we can ensure americans are not being targeted by federal agencies or other law enforcement agencies? >> at the leadership conference, we support a ban or moratorium, because these communities where facial
11:26 am
recognition technologies are being used are being over-policed. facial recognition technology is disproportionately inaccurate for black and brown communities. the community has not been involved in these processes. congress has not engaged in legislation on these topics. communities have not been involved in highlighting what they think policing should be, specifically within the context of facial recognition technology. >> thank you. >> i now recognize the chairman of the committee. >> thank you. on your report, you found agencies using facial recognition systems owned by another federal entity, and 14 reported using systems owned by state, local, and tribal entities, including the fbi. can you describe how the fbi uses these systems to expand facial
11:27 am
recognition search capacity? >> i will briefly talk about how the fbi used the technology around the events of january 6. the fbi has a media tip line. they asked the public to submit images. they then inserted those into an frt system to get potential matches. we did a report a few years ago on the fbi's specific use of the technology, and we had issued a number of recommendations that spoke to them complying with privacy impact assessments and another system. the fbi ultimately agreed
11:28 am
with our recommendations and implemented them. i think it was dr. alexander who mentioned that the fbi has done a number of things to ensure that the accuracy of the frt systems they are using has improved. >> also, the report described how agencies use nongovernment systems. what are the risks associated with that? >> the privacy impact assessments, those are standards that a government agency typically has to follow if they are using their own system. the concern that we mentioned is that if they are using a nonfederal entity for facial
11:29 am
recognition technology, that nonfederal entity may or may not have to comply with the recommendations and the guidelines laid out. that is the concern. one, the privacy impact assessment, speaks to how the images are being used. the other says that if there is a change in the technology, that has to be made public and available. >> professor laurin, when the technology is used to match a suspect photo, what opportunity does the defendant have to challenge the use of the facial recognition technology that led to their arrest? >> in many instances today, the opportunity to do that is quite limited. practically nonexistent, to the extent that the practice has
11:30 am
been for law enforcement agencies to use the facial recognition match as a lead that is corroborated through other evidence. the risk is that that ends up not being independent corroboration. in florida, a match was made based on an image in a store, and a police officer who knew that the match had been made looked at the image himself and said these guys look alike, already knowing that the computer had made a match, and doubled down on it. the courts said they don't have the right to challenge the facial recognition match when the government isn't relying on it in court, but that misses the fact that the match could taint what is showing up in court. >> facial recognition systems
11:31 am
have been found to be less effective in identifying people of color and women. as we heard from mr. williams, he experienced this flaw in real time when he was misidentified despite his photo not resembling the suspect. can facial recognition technology be incorporated fairly into policing strategies despite this flaw in the algorithms? >> it is a pleasure to see you again, especially under the circumstances. the leadership conference does not believe it can. currently, the issues of accuracy alone would hold that a ban or moratorium would be the most appropriate course. in addition, these technologies are being used in communities that are already being over-policed or over-surveilled. there is no way to get around
11:32 am
the issues that come with law enforcement use of facial recognition technology. >> mr. lee? >> i want to echo the grave concern over a technology that has these differentials. i have said that they should not be used unless that can be addressed. i join mr. tolman. there may be ways to deal with the demographic disparities if we understand what they are, but we don't have that testing, so we have learning to do. we have to think about what it means to have a human in the loop. as mr. williams's story demonstrates, they don't do that. that is why we so desperately need regulation. >> i ask unanimous consent to
11:33 am
place the following into the record: a letter from the security industry association, and individual statements from data for black lives and the electronic frontier foundation. >> without objection. >> thank you. i would like to lead off by thanking our ranking member, mr. jordan, for allowing me to go ahead of him. i have a speech and it will take me a while to get there, so i appreciate him switching spots with me. in addition to being a senior member for quite a few years, i also serve on the foreign affairs committee, and i am the ranking member and past chair of the asia and pacific subcommittee, and in that
11:34 am
capacity, republicans and democrats have been very closely watching, observing, and monitoring what the prc and chinese communist party are up to and the threat that they pose to america and its citizens. as was noted in the statements of a few witnesses, the chinese communist party uses a vast civilian surveillance network to track nearly every movement of every single person, and there are 1.4 billion people who they are tracking. it's a pretty extensive effort that they have undertaken. for example, they force their citizens to download apps that monitor them. they track social media posts to look for what they consider to
11:35 am
be negative content. they stop people at random to check their devices for apps deemed to be dangerous to the state. they use facial recognition technology similar to what we are discussing today to pick individuals they are looking for out of crowds. all of this is in order to build a so-called great firewall between china and the rest of the world, so they can keep their people under control and gather personal data on their citizens, which can be flagged to torment the people of china, individuals they believe are going to be unhelpful to the regime. i would like to ask ms. frederick and mr. tolman, would you care to comment on china's
11:36 am
use of facial recognition and other technologies to monitor their citizens, and how the ccp is using facial recognition technology to oppress individuals and various groups? the uighur community comes to mind, because they have been in the news, though not nearly enough of late when you consider that a million people are detained there. are there lessons that we should learn from this? i am not suggesting that our government has anything like this in mind, but i would like to hear from you if we could. >> thank you. i will begin if you don't mind. yes, we have a system in the united states that matters: rule of law protections, an open and
11:37 am
engaged citizenry, a free press, that act as guardrails, for now, against the propagation of technology for political ends anywhere near like china's. it is important to look at them. they are the bleeding edge of using these technologies for internal control and stability. they have paved a path for ethnic targeting in the western region, where 1.5 million people are imprisoned. they use something that human rights watch has reverse engineered. they take all of this data and they fuse it: license plate readers, devices that pick up sim cards, biometric iris scans, the
11:38 am
digital exhaust that we are constantly emitting. they put it all together and they use this information, they use new technology, artificial intelligence; this is one of the first places they started using it for political identification of minorities. what i see coming down the pike is the use of dna genotyping. they are trying to use an individual's dna to predict what their face will look like. they are experimenting with voice recognition technologies, and the idea is to combine all of this and use it for basically an iron hand of digital control. they have the technology to do so. we have seen in hong kong the diminishing of freedom. china is using technology in order to enact it.
11:39 am
russia is doing the same thing when it comes to protests, as in january. all these things are something to keep in mind. what china is trying to do is totalizing control. they're trying to link digital reality with physical reality, and the chasm that used to exist between them is closing. china wants them to be completely interconnected. we have to guard against that here in america. >> perhaps mr. tolman could respond in writing so i don't hold the committee up at this time. >> thank you for yielding back. i now recognize the gentlelady from california for five minutes. >> thank you. let me thank the ranking member
11:40 am
and our witnesses for being with us today. i wanted to know if any of the witnesses could distinguish between using facial recognition to investigate versus using facial recognition for arrest. is it more productive to use it that way, versus all of the complications of using it for arrest? it is very well known, and has been said by numerous panelists and members, that facial recognition poses challenges for people of color. i am not sure, unless i missed it, does it work better if the subject is white? how does it perform? mr. alexander? >> i will try to respond to the first part of your question, whether the technology should be utilized with the idea of making
11:41 am
an arrest based on the identification itself. can it be used as an investigative tool? when i think about it as an investigative tool, if that person's image happens to be a match or a similar match and we are working with zero leads in the case, it may prove to be of some value. the thing we have to be very careful of, and i think someone mentioned it earlier and articulated it better than i can, is that we have to make sure that we don't cognitively approach that individual already with our implicit bias in mind that they are the exact person we are looking for. do you follow what i'm saying? we have to be very careful with that. ideally, i was here two years ago under the leadership of
11:42 am
congressman cummings at another hearing, having these conversations, and i recollect clearly there was a woman there, a researcher at m.i.t., who spoke eloquently around these issues and was able to provide insight in terms of the depth of this technology and how it does not work to the advantage of people of color and women. there is an overwhelming amount of evidence, and i think there are those on this panel who can talk further about this, that in the case of white males there was the least likelihood of misidentification. certainly for people of color and women, it tends to be more problematic. >> you said least likely, meaning that it is more accurate with white men? >> yes, it is more accurate with
11:43 am
white men. >> maybe someone else could respond to that. what about that in terms of accuracy, and also its use as an investigative tool? in the george floyd act, we say that there should not be federal resources used for facial recognition. >> i was just going to say, about the question you had previously asked, that it was supposed to be used as an investigative tool. at that point, they came and arrested me and did not even tell me anything. i don't even live in detroit. the detroit police came to my house and carted me off.
11:44 am
i don't have a lot to say about that. all i am saying is, the way it is being used, they are not doing any due diligence at all. like the chief said, it was shoddy investigative work. >> to me, you are an example of what i was not referring to. they used it for your arrest. it's clear that is problematic, but i was referring to using it to investigate, not in real time for an arrest. mr. friedman, do you have anything you would like to say? >> i think there are things that this body could do regulatorily: the kind
11:45 am
of testing that reflects what is happening in law enforcement use, rather than a set of abstract possibilities, for accuracy and bias. >> thank you. >> thank you. mr. ranking member, i am moved to say it appears like unreasonable search and seizure under the fourth amendment. i am delighted to recognize mr. jordan for five minutes. >> let me go to mr. tolman. how often do we get the wrong person? we heard mr. williams tell his compelling story of how they were wrong and he was arrested and detained for 30 hours. i know of one family in alaska; they were
11:46 am
not in washington on january 6, but they were handcuffed during an interrogation. it was all based on facial recognition. how often do we get the wrong person? >> some estimate as high as 30%. i believe it is more than that, and i am fairly certain we don't know about the ones who aren't coming forward. i have represented two individuals, including a client who was accused of going inside the capitol building. he was not in there; i believe it was facial recognition technology that put him in the law-enforcement investigation. we had to work backwards to show that he was not. i think it is more often than we
11:47 am
know, because it is not being reported. >> mr. lee, he said 30%. we know that it has a disproportionate effect on african-americans. is it probable that 30% of the time this technology gets the wrong person? >> we don't know. that is part of the problem. these technologies lack the requisite transparency and oversight. cases like mr. williams's are terrifying. what about the cases on the edge that we don't know about right now? we do need more transparency and oversight, specifically with facial recognition technology and its use by law enforcement. >> i am concerned in a general sense about the overall surveillance state and individual americans' fundamental liberties and
11:48 am
fundamental rights. when you add on top of that that it is wrong so often, to me this is why, if we can't come together in a bipartisan fashion, and we have been trying for a couple of years, if we can't do that and find a way to limit this, particularly when we have these agencies using technology without even knowing what technology they are using, whether it is private or what have you, there's a host of problems here. i appreciate the chairman having this hearing, and i'm going to yield the remaining two minutes to the ranking member. >> georgia used ai-driven smart cameras to monitor social distancing. with current technology, what prevents federal agencies or state and local governments from using facial recognition not just for identification, but to track the movement and behavior of
11:49 am
law-abiding citizens? >> i hear that's what's coming down the pike. right now, there are only specific transparency measures and lukewarm strictures on some of these uses. the recommendations are a great starting point. at this point, there are no hard constraints on the expansion of the remit for these cameras. it could encompass all sorts of things, and clearly certain municipalities are taking advantage of that to interpret public safety as they wish. until entities are required to submit those reports, which don't yet exist in a rigorous fashion, i fear many official entities will be allowed to interpret the use of facial recognition,
11:50 am
digital surveillance in any way they can. i will say quickly, when i was at facebook, the ethos of these private companies shipping these products is that they want to ship them as fast as they can. >> i need to ask one more question. professor laurin, with regard to the question on investigation versus arrest, what has come to my mind is this notion of tainted photo lineups, tainted identification being used in investigation as a predicate for making an arrest. did you want to elaborate on that? >> thank you for the opportunity.
11:51 am
i will say a bit more. i do think it is possible in theory to think about a distinction in the use of the technology as an investigative tool versus forming the basis for an arrest, or, put differently, a facial recognition match providing probable cause. from a regulatory standpoint, one needs to take care about what strictures need to be put in place to create a meaningful distinction between those two things. there are a number of agencies that say facial recognition itself cannot be the basis for probable cause, but there are very few if any strictures around what further investigation is required to build an independent case. are there, for example, blinding procedures so that individuals who are conducting a subsequent lineup don't know that the facial recognition initially matched
11:52 am
the individual? if not, that is not a meaningful independent check. there is an important intersection here between facial recognition technology and what we already know about the fallibility of eyewitness identification: in over 75% of known dna exonerations, there were eyewitness identifications that are now known to have been wrong. that is an investigative tool that is itself already quite fragile. when used as a corroborating piece of evidence with a fragile facial identification, that should give reason for concern as well, when thinking about what strictures need to be in place to create a meaningful check on the investigative use. >> my time is long expired, thank you. >> it is an issue that warrants
11:53 am
a lot of involvement, and we thank our members. i am delighted to recognize the gentlelady from florida, who has five minutes. >> good morning to all of you. thank you for this very important discussion, and thank you to all of the witnesses for being here. as a former police executive, i am always interested in new technologies that can help to keep the public safe, help to keep law enforcement safe, and improve the delivery of police services. we know that we have work to do in all of those areas. it was a decade ago that we were testing body cameras at the
11:54 am
orlando police department. while we were not sure of the road ahead, i know the public was very excited about that possibility. it is one of the recommendations that is coming to the forefront now. we should always seek technologies that will enable us to better do our jobs. dr. alexander, you said that technology has long outpaced our chance to govern it. that is such a powerful statement, because not only does it appear that the use of facial recognition has outpaced our ability to govern it, we see that with other technologies as well. dr. alexander, you talked about best practices. until we can find ourselves at a place where we can keep up through regulation and other things, can you expound a little bit on some of those best practices? >> thank you very much.
11:55 am
i think it is very important for us to recognize that technology does move along quickly. if we think about the development and implementation of technology over the last 10 to 20 years, or even the last five years, it oftentimes outpaces us as human beings. we are not prepared oftentimes for the unintended consequences that come with the technologies we are all utilizing every day. we have to develop best practices. what are some of those best practices? i think we can only clearly define them once we make some determination about the real legitimacy of this technology. it can be innocently used with our cell phones, to open them up
11:56 am
and conduct our personal business. when it comes to the constitutional rights of american citizens across this country, regardless of where they are, that becomes more problematic. can we use it as an investigative tool? you know as well as i do, we might embark upon an investigation when we have zero leads. maybe we have a piece of this facial recognition that suggests, could that be cedric alexander? what we cannot do is what happened to mr. williams. let's look into the background of cedric alexander further. let's explore some other things that we didn't think about when we didn't even have a suspect or person of interest at all. the problem lies there: sometimes we can identify an
11:57 am
individual that may be a likely person of interest or suspect, but if our own implicit biases get in the way, it might overwhelm the entire investigation, and it leads us down this dark path where now we are doing injury and hurt to people. here is the thing i would mention to all of you in congress. as we go through this moment in america right now around police reform, one thing we do not need is a piece of technology that is going to continue to drive a separation between police and community. people want good police. we all want great technology. the idea around this technology is that it's great, it's well-founded, but it has to become more accurate. it has to have the type of
11:58 am
legitimacy and trust that is needed in order for the american people to feel like it is not going to infringe upon their rights and hurt people in any type of way. i understand the importance of dna technology and how it is used to free individuals who never should have been incarcerated, like mr. williams. we have to do a better job in advancing this technology, but understand that sometimes the technology moves along faster than we do as human beings. we have a lot of catching up to do around many of the social challenges we have in our great nation. i hope in some way that addresses your question. >> it did. with that, i yield back.
11:59 am
>> thank you very much. >> thank you, madam chairman. it really warms my heart to hear witnesses from both sides having so much good input. it is my observation that during the bush years the department of justice was able to convince republicans, a majority of them, that we didn't need any reform right then, and really pushed back. we know of resistance like that. i know of obama-era democrats that were in charge being convinced not to do some reform that was needed. when i hear the testimony and
12:00 pm
the questions, and hear this discussed in such a great bipartisan way, something so serious, it really does warm my heart. great comments and great questions. dr. alexander, i note you are careful to make sure you note this technology can be helpful, and it gets me thinking about our fbi. do you think the technology has gone too fast for the fbi, that maybe we can get them to start recording statements with audio at least, maybe video, instead of having their agents not record statements, audio or video, and just write down their version of what the witness said? i am being sarcastic, but that is an area of concern for
12:01 pm
me. if you combine facial recognition mistakes with the fbi's practice of only writing down their version of what the witness says, you combine flaws like that, we are looking at serious problems at the federal level. locally, i am sure that the law enforcement you oversaw recorded statements, so that would not be such a problem. but can you see regulations, dr. alexander, that would help us be able to confine facial recognition just to the investigative stage? as a former judge, you know, i know sometimes law enforcement will come in and go, judge, this is an affidavit, but if you did not swear to it, i do
12:02 pm
not want to hear it. but that discrimination between real sworn evidence and unsworn evidence is not always made by judges. do you see us actually getting teeth into that? dr. alexander: thank you for that question. let me say, i think the fbi has the capacity, has the ability, the training, and also the fidelity of a federal law enforcement agency that is well trusted. everything they have pursued, particularly around technology, they have really stood up; if you go back 100 years, in terms of the work they did back then and the work they do today, forensically. they do have great capability of being able to access that type of technology and use it in a way that is standardized, one in which they
12:03 pm
really are the leaders in its utilization, because they use this technology to go after foreign actors who want to do harm to this country, or who want to utilize these tools domestically. >> my time is short, but i am very concerned about some of their abuses. and when we see the fisa court that meets in secret, there does not seem to be checks. like i say, as a former judge i am very offended that some judges have not been outraged, like at mr. williams's treatment and others'. i think that is a problem, when the judiciary is not upset enough to actually take action and bring law enforcement into line. but let me ask mr. tolman, do you see a way to get specific regulations that would discern
12:04 pm
between investigation and actually going to court, getting a warrant, and picking up somebody like mr. williams? mr. tolman: i think that you could implement things like not allowing the technology alone to launch an investigation; they would have to have an independent basis to launch the investigation, and then if it went to trial you could utilize the facial recognition technology as additional evidence. but you cannot have it the other way around; then you have concerns that the evidence corroborating the facial recognition technology is influenced by the subjectivity of the investigators. >> thank you. my time has expired. thank you. >> this is an opportunity for us
12:05 pm
to move forward. i will recognize the gentlelady from georgia for five minutes. >> thank you so much. i really appreciate you today, and thank you to each and every one of you for your timely resources, information, and expertise. i want to start with a quick question. dr. alexander, as we look at how we might regulate facial recognition, how can we build trust between law enforcement and the communities that they serve, especially communities of color? dr. alexander: fundamentally, that is where a lot of the issues lie, because currently, in this environment we are in, and this is not new, it has just been exacerbated over the last several years, there's a great deal of mistrust between police and community. you are already
12:06 pm
dealing with that in the current context of where we are right now in this country. as we continue to introduce new technology and ways of doing things that are questionable, such as facial recognition, if we do not do it right, we end up with continued victims like mr. williams, and we continue to create that separation where legitimacy around policing continues to be questioned. this is the fight i find myself fighting every day in the roles i play now, really trying to deal with these relationships, and it certainly has its challenges associated with it. but we have to get the technology right, because if we don't, it will not help the cause of building those relationships. if we do not have that collaboration between policing and community, i do not care where it is, we will continue to
12:07 pm
fight ourselves here, and we will continue to be at risk. >> thank you so much, and stay in the fight. professor, your testimony said the rules of discovery, the sharing of information between the prosecution and defense, are restrictive, so prosecutors do not have to mention which technologies they may have used to investigate, and the courts do not get much opportunity to advance the law by sorting out what are acceptable and unacceptable uses of this technology. could different discovery rules help with this kind of problem? >> thank you so much for the question. i think the short answer is yes. and just at the risk of going a little bit into the weeds, one thing to understand about criminal discovery in the u.s., just like criminal law generally, is that there is tremendous diversity of systems.
12:08 pm
we have at least 51 criminal justice systems in the u.s., given the states and the federal government. if you look across those systems, federal criminal discovery is among the most restricted. rule 16 is limited in terms of what information the government is required to disclose to the defense. there are a number of jurisdictions that have gone quite in the opposite direction, new york most recently, but my own state of texas has moved to an open-file discovery system, and under that system the defense typically would have access to essentially the entirety of the nonprivileged investigative file and would know the steps taken by the investigators, regardless of whether the prosecution was going to present that evidence, and regardless of whether it is going to be favorable to the defense. in addition to the points mr.
12:09 pm
tolman was making about regulations you would want to have on how evidence can be used, in particular thinking about the federal system, opening discovery more with regard to facial recognition technology is really an essential part of achieving transparency and of vetting the reliability of this technology. >> thank you. professor friedman, should congress regulate private and government-owned systems? i know that the report shows, as others have noted, that facial recognition systems owned by the government, as well as those owned by private companies and used by government actors, are not always regulated. does ownership play a role in how these systems are used? >> thank you for the question. i think it is essential to have
12:10 pm
regulation. i think you are hearing all of us say that there should be regulation. the likeliest way for congress to regulate is through the vendors, the manufacturers, those who are taking them into commerce. you want to regulate the use of those technologies by law enforcement agencies, although there has been discussion about trust between communities and law enforcement. but you will not see that until there's regulation for how the technologies are used, with strong guardrails. i could say more but i see that your time is expired. >> thank you. i yield back the balance of my time, which there is none. >> i now recognize mr. tiffany for five minutes. >> thank you very much. ms. frederick, i want to frame my question with the background of the biden administration.
12:11 pm
it has been reported to be coming out that they are going to enlist the wireless phone carriers to vet people's text messages. i mentioned that because some of my constituents believe that we are a short leap to the communist chinese party's use of technology in their country, where they have created a social credit system. are my constituents right to be concerned about where we are at in the united states? >> well, i will say, as somebody who has worked in the intelligence community for almost a decade, i think that your constituents are right to be concerned about where the technology takes us. what i think is disconcerting is what you mentioned, sms content being parsed, podcasts
12:12 pm
have been mentioned as harboring a portion of unfettered conversations, and these are bastions of our free speech. and i think it is of concern to see where authoritarian governments are sort of taking us. again, we do have a system of guardrails, but it is important, especially for conservatives and those expressing contrarian views, to be vigilant and see where this technology goes and guard against it. i think we are all advocates for guardrails here, in most respects. >> dr. alexander, you mentioned the state of maine, in particular portland, maine, and we also heard from mr. lee that certain communities have not been involved. isn't there a role for city councils, state governments, those governments at the local level to get involved with this, too? because they are responsible for their police forces in all
12:13 pm
instances, correct? dr. alexander: -- >> maybe you are muted. dr. alexander: yes, you are absolutely right, because the local level is where it all starts. and local elected officials, mayors, city managers, county managers, whatever the case may be, they certainly need a voice in this, because what you want to always make sure of is that your police agencies are operating at the highest level of training and certification. and we do not have that yet across the country. that is why you find some communities backing off of this technology until more concrete findings are done. rep. tiffany: i think about local officials, i have many constituents that work in the
12:14 pm
twin cities, and we saw a real failure on the part of the mayor of minneapolis and the city council in addressing the riots that overwhelmed minneapolis last year. a lot of it is because they did not take control of their police department. there were some bad cops and they needed to take care of them, and they did not do it. this is the same thing, they need to address the technology. and this is just minneapolis; there are many cities and states across the country. they have their purview and obligation to oversee their police departments, and they should do that, including in regards to using this type of technology. the final thing before i yield. this is an example of the limits of technology. i think about simple things like with children, they get better reading retention when they read off of paper versus off an electronic format. and i think about the police the
12:15 pm
same way, you cannot replace boots on the ground and good investigative work with technology in all instances. dr. alexander: that is correct. >> my question for you, mr. tolman: u.s. capitol police have been given army surveillance equipment to start monitoring americans. and we have reports from a group called the surveillance technology oversight project that says they think that represents an expansion of police surveillance. i have a few seconds, then i will ask a question to you. they point out the expansive nature of the surveillance state. how do you see this role of actually using military surveillance, giving the equipment to law enforcement? how does that expand a surveillance state? mr. tolman: in this instance it
12:16 pm
is of grave concern, because right now the capitol police are exempt from the freedom of information act, so not only would it be difficult under the rules of discovery for the defense counsel, you also would not be able to get transparency. it creates a very powerful surveillance state in an agency that probably is unaccustomed and does not have the history of being, you know, surveyors of u.s. citizens. so it is a recipe for disaster. >> thank you. your time has expired, mr. tiffany. >> i will yield back, but one final thing. i think we should put in the mix here what is happening with the text messages that they are talking about, enlisting phone companies to intercept sms messages. i think that is a very chilling thing and it should not be done by the biden administration, and we should throw that into the mix in dealing with this issue as we go forward.
12:17 pm
thank you, madame chairman. >> thank you. i know these issues reach beyond administrations and these are important questions this committee is bringing forward. it is my pleasure to yield to the gentlelady from pennsylvania for five minutes. >> thank you. thank you for convening this hearing on innovative technology and its appropriate use and regulation under the law. and i also thank all of our testifiers and advocates today. mr. lee, companies claim that they can instantaneously identify people with unprecedented accuracy, yet mr. williams's story is not unique. such technology has resulted, as you know, in at least two false arrests in michigan and one in new jersey. some organizations have
12:18 pm
expressed concern around the risk that clearview's technology may pose to undocumented immigrants, communities of color, survivors of domestic violence and other vulnerable people in communities. can you share with us some of the dangers of this facial recognition and its misuse? mr. lee: absolutely. thank you for the question. there are a number of civil liberties concerns with private access to law enforcement, also with law enforcement's ability to purchase private data. there is little to no transparency, as we said. i think that a fundamental issue we need to talk about is that these technologies are not accurate. and i think that we are having a conversation as if they are. but they have significant error rates across the spectrum, especially for marginalized communities. with little oversight, with little guidance, with no
12:19 pm
community input, and no meaningful ways to engage in checks and balances, the use of facial recognition technology in the criminal legal process is something that the leadership conference and many organizations are pushing back against. we should be thinking about a way to envision law enforcement, and community engagement with law enforcement, that does not require surveillance nor constant overpolicing. that is something to push forward for. rep. dean: thank you for that. transparency, recognition that this is not foolproof technology, and the error rate. there's more to learn. mr. williams, of course we are all moved by your story. your daughter watched as you were arrested for a crime you did not commit. how did your arrest and detainment
12:20 pm
affect your family and community? mr. williams: thank you for the question. so, the -- it mainly affected my older daughter. the other daughter was only two, so she did not know what was going on. but she cried, too. she cried because her sister was crying. like i said earlier, i was trying to tell them -- i know we are short on time, but we thought about taking her to see a psychiatrist because every time she sees the story on tv, it makes her very emotional and she starts crying and we have to turn the tv off. and i am like, i am not going to jail every time you see it on tv, baby, they are just reporting the story.
12:21 pm
i wrote them down, somebody else asked the question. but the police actually did investigate it. and with all of those eyeballs on the pictures, nobody said these two guys do not look alike. i do not know how the human part goes into it but -- >> do you have an explanation for that, did they give you an explanation? >> nobody told me anything. and i cannot say forced, but we asked over and over again for a proper, um, explanation or a way to say it was some type of -- i do not know the word for it, but they just did not seem like they were sorry. they didn't want to be apologetic about it. they were like, this happens. we made a mistake.
12:22 pm
and i am like, you know, that doesn't do anything for me because all i did was come home from work and get arrested. and then there was other stuff like, i did not get dinner that night. they had already served food at the prison. so, i got to go sit in a cell with no food, no drinking water, and, you know, it was terrible for somebody. i was not ready for it. rep. dean: thank you for sharing that. i asked about your children, but we have to realize what impact it has had on you, the long-term impact on you to come home after work and be arrested and detained so inhumanely and incorrectly. i see my time is done. mr. williams: i do not know if
12:23 pm
you can hear, i have a small speech impediment at this moment, but last year i also suffered some strokes. they are looking into that to see what actually caused the strokes. i have no idea at this point, all they can say -- i have one doctor say it could be attributed to stress. if he has a way to prove it, i do not know. but i did have strokes last year after. rep. dean: god bless you. i yield back. >> thank you. mr. williams's story should inspire all of us and others to find a bipartisan pathway forward. let me now yield five minutes to mr. massie. rep. massie: thank you for
12:24 pm
holding this very important hearing. before i jump into facial recognition, i want to talk about name recognition. imagine a system, a computerized system, that instead of basing the adjudication of your civil rights on the distance between your eyes, or the shape of your nose, or where your ears are, it's based on the spelling of your first and last name and your birthday, and whether you might share a name with somebody who has been convicted of a felony or could be imprisoned today. such a system exists, and there is no judge or jury involved. it is called the nics background check system. and the troubling thing about facial recognition is that there may be racial disparities in it, but we know that there are racial disparities in the instant background check system, the system you have to go through in order to purchase a firearm. my friend john lott, who
12:25 pm
recently worked at the doj, has found evidence that this system, by a factor of 3 to 1, produces false denials for black males compared to white males, because they may share a similar name to somebody incarcerated within their ethnic group. and so, ms. goodwin, i appreciate the good work that the gao does. i served on the oversight committee and we covered this topic there before, facial recognition. the gao released a report that said there were 100,000 denials based on instant background checks and only 12 federal prosecutions. so about 100,000 people supposedly committed a crime by trying to buy a firearm. what we know from that is they actually did not commit a crime, but there were a lot of false positives. i got the commitment of the fbi
12:26 pm
director to look into this issue about whether there are racial disparities built into the instant background check system. it's not very technologically advanced at all. he said he would work on it, but if you have an opportunity, you might look into it. is this something that your part of the gao would examine? >> actually, that was my report. i worked on that report, the prohibited purchases report, so we are happy to have a conversation about additional work we can do in that area. rep. massie: thank you for that report. they do collect race information on the form, and so if we can just take it to another level -- it would have to be as deep and thorough as the report that you did already, but it would be helpful, i think. and i noticed that the beginning
12:27 pm
of this recent report on facial recognition started with, or your testimony, "we surveyed 42 federal agencies with law enforcement employees," and i thought, whoa -- that to me is troubling, that we have 42 federal agencies that have law enforcement. maybe this is something the democrats and republicans can get together on, maybe we have too many police forces at the federal level. but moving on, i wanted to ask one of our professors here, maybe professor laurin, about brady material, possible exculpatory evidence that's produced that the prosecution may not want to share with the defense. you mentioned the way to get around some of the typical safeguards is they say, oh, we
12:28 pm
just used it to identify the suspect, we are not using it for the conviction. can you talk about why that is problematic and whether defense attorneys can get that material from facial recognition companies? >> thank you for the question. you know, in terms of -- i will start with the last piece, which is getting material from companies. in other words, there may be instances where assuming we are in a world we do not live in yet, where facial recognition is introduced in evidence in a criminal case. in those instances, the defense would certainly want to interrogate, right, what is inside the black box that generated this match. one of the complicating features here is the proprietary nature of many of these technologies. and so we have not seen this yet
12:29 pm
so much in this context, but we have seen it in the context of dna with certain more novel analyses of low copy dna, where the defense has been denied the ability to access the source code, access what is claimed by companies to be proprietary information protected by trade secrets, and that intellectual property law has been invoked to quash defense subpoenas. so one of the complicating features, one of the many limits on the ability of the defense to vet this in a criminal context is this proprietary dynamic. layered on top of that are a number of complications that derive from brady doctrine and sort of disagreement over the scope of brady and deciding when defendants are entitled to
12:30 pm
this material versus after the fact. i'm happy to pause there. >> my time is expired, but i hope that they weigh in on this, because it does seem to be a problem, not just within facial recognition, but across all of the domains. i yield back. >> i now recognize the gentleman who has dealt with some of these tech issues from a different perspective, the gentleman from rhode island. you're recognized for five minutes. >> thank you to the witnesses for your compelling testimony. i want to begin with mr. williams. i'm wondering -- first of all, thank you for being here to talk about these issues in terms of public policy. it is good to be reminded that this impacts people's lives and
12:31 pm
your story is one example of that. did it seem like the police had even bothered to conduct an actual investigation, or did they appear to be solely relying on what the computer told them about your arrest? mr. williams: thank you for the question. so, they did provide information. i did not find out any of this until after, when we went to court, but they had a lineup of, i guess, six guys. and none of us looked alike. i guess that somebody picked me and said, this is the guy right here. like, i look nothing like the other guys. i was actually about 10 years
12:32 pm
older than all the other guys in the picture. and they did not use my current license photo, they used one from six years ago. >> thank you. are there specific privacy concerns related to data breaches of facial recognition systems that are more concerning than breaches of other systems used by the federal government? >> yes, thank you for the question. yes there are. as i mentioned, your face is permanent, distinctive. if that information is breached, it is more problematic and concerning than if your password is breached, because you can change your password, but for the most part you cannot change your face. so there are major privacy concerns about what it looks like when the technology, when that information has been breached. >> facial recognition systems
12:33 pm
have varying levels of accuracy, but we know that they are ineffective in accurately identifying people from communities of color and women. is there even any training that might help law enforcement overcome these inaccuracies, or are they so embedded in the system that we should ban facial recognition outright? >> thank you for the question. i think we need to think about a context of policing that does not involve surveilling or overly surveilling marginalized communities and overpolicing them. i would highlight that one of the things to think about is, if the technology is not working for the same communities it's trying to help, is this technology worth using in the first place? so that is something that we need to think about,
12:34 pm
particularly within the context of civil rights concerns. because, again, mr. williams was wrongfully arrested, wrongfully detained for a crime he did not commit, based off of a lineup and the use of facial recognition technology that was not authorized by the local government, not known by mr. williams himself, not understood under any circumstances by the people using it, and had varying levels of accuracy across the board, not given to mr. williams at the time, or to law enforcement. >> i want to get one more question in. so building on what was asked about earlier, if we could require prosecutors to inform a defendant that facial recognition technology was used to identify them in any part of the process, that is one reform. in addition, we could require that the prosecutors statutorily disclose any other exculpatory
12:35 pm
material. there are things we could do to at least make the process more transparent and hopefully give the defendant an opportunity to challenge this technology. i agree -- >> i agree, there are other items that could be added. to the extent that these systems report out confidence levels, that's something that seems necessary to disclose, but i also think this issue of being able to get at that black box and actually know what is driving the results, which then involves questions around intellectual property, is really an important issue to get at. >> i want to say, because as the chair recognized, part of this is facilitated by the monopoly power of these large technology platforms, the fact they are unregulated, they are collecting an enormous amount of surveillance data, and the danger of integrating all of this presents real challenges, so it also
12:36 pm
requires us to regulate how private companies can provide this technology to law enforcement, and larger questions about how to rein in big tech. i look forward to more details on that. thank you. >> your time has expired. thank you, we have other members yet to ask questions, so we will certainly move so you can be included. it's now time to recognize mr. fitzgerald for five minutes. rep. fitzgerald: thank you. this is fascinating stuff this morning. i just wanted to move in a direction away from law enforcement to the private sector, which some of the members touched on this morning. as we see iphones that use facial recognition, you know,
12:37 pm
anytime you are going through the airport now you see the clear system, which is allowing access for individuals in airports on a regular basis. and some employers now are using this in relationship to their employees, under the auspices of security. so there are a lot of different things happening in the private sector. the question i have for ms. frederick and mr. tolman, can you address some of the privacy concerns of private sector use, particularly related to the difference between using facial recognition technology for verification purposes rather than just flat out identification? ms. frederick: ok, i will start here. i think generally that it is
12:38 pm
that outsourcing of surveillance that poses more potential for concrete violations of the fourth amendment. i'm not a lawyer, but when i see the surveillance state basically outsourcing this job to private companies that do not fall under the constitution, that is what starts to worry me as a citizen, somebody who has been behind the powerful technologies employed at a commercial level, also by the surveillance state overseas. to me i think it is that outsourcing to private companies that we need to think about when it comes to the constitution, because some reports indicate that it is expressly made to skirt constitutional protections. mr. tolman: that last point is where i wanted to go. oftentimes the use of third-party providers is a runaround of
12:39 pm
the protections. we've already seen one court indicate that the boston police department violated the fourth amendment in its use of facial recognition technology. that would be one pressure point to push you towards utilization of a third-party provider, you know, on one end, where you have somewhat lower standards when utilizing the evidence from them. and we do not know the nature of some of those partnerships -- that's the problem, the lack of transparency; we do not know what it is, sometimes even the questions we need to ask in order to find out how pervasive that relationship is. >> thank you. great answers. the other question i would have for both of you would be, there has been discussion about a ban -- and what we are seeing across the nation is some new
12:40 pm
municipalities, through ordinance, have been progressively addressing this issue as it emerges, either as part of something that is under their control -- many times it is, as with a municipal court system. and they are moving forward. city councils and even some county boards are moving forward. is it practical at this point? it does not seem like it is to me at this point. mr. tolman: i would quickly say that it is hard to imagine that this powerful tool is not to be utilized at all. i understand the calls for a ban, but that is what makes this difficult; there are circumstances in which this powerful technology may be the only thing that may save a victim or save a town from some horrific crime or from a terrorist attack, for example,
12:41 pm
so getting this right in how we authorize law enforcement to use it is right now the most important job, i think, that this congress has. ms. frederick: i prefer to think of it more as a tactical pause, take the time to get it right. so i think that there are measures with which we could do so. lawsuits are being fought in the courts. in illinois, a lawsuit has been filed against the chicago police. there are other mechanisms, as well as municipalities, that can take the time to get it right. rep. fitzgerald: thank you. >> i thank the gentleman. i'm pleased to recognize the gentleman from california. you are recognized for five minutes. >> thank you. i'm super excited that you called this hearing.
12:42 pm
my office and i have been working on facial recognition for over two years. the reason it has taken so long is, as the chairwoman has aptly stated, that it is a can of worms. it can be very complicated. we are working on legislation that is essentially complete now. it basically requires a warrant in most cases for the use of facial recognition technology, it sets auditing standards and requires reporting. it prohibits the technology for most protected activities, and a number of other provisions. we will circulate that legislation to the members of this committee for your review and comment. so, thank you for this hearing. my first question goes to professor laurin. as we know from the hearing, the technology implicates the fourth and fourteenth amendments. but i want to talk about the equal protection clause, because
12:43 pm
there is a disparity in accuracy in terms of the color of your skin. would it also be an equal protection violation if we had governments deploy this knowing that people are being treated differently in terms of surveillance? mr. laurin: i think that in answering the question i want to sort of distinguish between constitutional doctrine, as it stands, and maybe a more intuitive sense of what is rightfully understood to be constitutionally problematic. i think the challenge with using the equal protection clause and doctrine around it, to get at these issues of bias in facial recognition, is existing supreme court doctrine requires that essentially discriminatory intent be proved in order for
12:44 pm
state action to be deemed a violation of the equal protection clause. and that presents a hurdle when one is working with evidence of disparate impact, right? there is this evidentiary gap, often, in getting from "there is a disparity" to "the state wants to discriminate on the basis of race or sex." that has been the difficulty in using the courts to get at these problems of disparity. that does not mean it would not be understandably appropriate for congress to say, we see a problem in these disparities and we want to legislate to require that the disparities in themselves be sufficient basis for limiting these technologies, etc. but one needs to think about what the hook would be for that, in terms of section
12:45 pm
five, or the commerce clause. but the existing doctrine presents a challenge because of the requirement of intent. >> thank you. professor friedman, first of all let me step back and say that i agree with what dr. alexander and mr. tolman said, that we cannot just ban this technology. you look at human history, it is nearly impossible to ban technology, but we can regulate it. because of so many different iterations and possible uses and situations for facial recognition technology, what do you think about imposing a warrant requirement that can take into account different factors before deciding whether in any specific case we would use facial recognition technology? >> thank you for the question. i want to make a point about the ban
12:46 pm
versus strategic pause. i think everybody is recognizing that we have the cart in front of the horse; that is the problem, and regulation is the only shot at getting the horse back out in front. warrant requirements are useful in some circumstances, not others. it depends on the use case. for example, if the use case is identification, which is what most agencies are doing right now -- they take a photo and compare it to a database -- a warrant requirement makes sense, but only if there is a reasonable belief: you have a photo of somebody suspected of a crime permitted under the statute. i would only allow this for very serious crimes. that's a use of a warrant that makes sense, but the warrant cannot tell you that a particular individual did the crime, it is just the
12:47 pm
technology. >> i yield back. >> it's -- i take we have present with us the gentleman from utah, mr. burgess, he was recognized for five minutes. rep. burgess: thank you, thank you for holding this important hearing today concerning facial recognition. it demands a bipartisan solution and i look order to working with colleagues to address their concerns. i want to thank mr. williams today, we agree on the unfairness of what you and your family entered. thank you for doing your best so that no other american has to experience what you are going through. and i want to give a special welcome to my fellow utah resident, mr. tolman, who has
12:48 pm
been a strong advocate of criminal justice reform. and towards the end of the questioning, i'm finding that the downside is most of the questions have been answered, so i will yield my time back to the ranking member. but before i do, i want to highlight what was said previously, the bipartisan concern i'm seeing as we go through the topic. and i also want to point out, as we look at solving the increase in crime that we see everywhere, that we cannot rely on technology. it is a tool to support human relationships, and human relationships should be based on respect. we cannot continue to demean and disrespect the good men and women policing our communities and expect technology to make up for it. i hope this committee will address the concerns of facial recognition technology, let's
12:49 pm
take the same passion to bring back the respect for our law enforcement officers, the men and women, the same respect for them that i experienced as a child. we will have more confidence in them once we have that accomplished. i will yield back my remaining time to mr. biggs. rep. biggs: a couple of quick points. this has been a very enlightening hearing. we have not really addressed enough, in my opinion, the question of a reasonable expectation of privacy. that's been shattered by the data scraping that goes on by private companies. i hope that we can expand our investigation into that, because i do not think that the question has been settled through the courts yet, either. if it has, i have not seen it,
12:50 pm
but we certainly need to get to that. i also want to comment on the legislation from mr. lieu. i look forward to reading it and i appreciate your efforts on it, and i hope we can integrate the work done in the previous iteration of the ogr committee with chairman cummings. so i look forward to seeing that and hope you can get it to us soon, mr. lieu. and i want to ask mr. tolman this question because it goes to the heart of what we are talking about -- i mean, as said, this is a can of worms when you open it up. when we start bringing together other technologies -- we have discussed this with sms -- and we go from mere identification, which in itself is a problem, to potentially tracking movement, to this notion of science-fiction-type pre-crime,
12:51 pm
stopping pre-crime -- what's your take on that, in a brief way, that we are on a spectrum that will keep moving toward trying to stop people from committing crimes before a crime is committed by using this technology and others combined? >> i share the concern. when we have seen law enforcement given additional technology, their first instinct is to utilize it, because they have a sincere desire to work on behalf of our communities and the victims of crime. however, they are often the last ones to ask the question, should we utilize this technology? that has to be the discussion of the members of the legislature who identify this. we are already seeing that -- speech, you know, social distancing under covid -- areas where we have not been are now
12:52 pm
areas where we are focusing law enforcement technology and action. because of that we are on a slippery slope, and i would share your concern. rep. biggs: i hate to cut you off, but i want to finish where we started, with the chair and the testimony from mr. williams. so, let me find it. mr. williams, in your written testimony you said, "i do not want anyone to walk away thinking that if only the technology was made more accurate the problems would be solved." i think the issues we have discussed are not just issues with the accuracy of the technology; i think we are talking about a total invasion of privacy, which is absolutely unacceptable, and which at some point will give too much power to the state. and we have seen it already in your case, mr. williams, and in
12:53 pm
others. i am looking forward to working with the chair, and also mr. lieu, and others who have a sincere interest in resolving this issue. thank you. i yield back. >> you are absolutely right, we have a sincere interest in resolving this issue, and if there are layers of issues, i think that we can respond to them as a committee in a bipartisan manner. and thinking of bipartisanship, i'm delighted to yield to the gentleman from california, a former sheriff, for five minutes. >> thank you for holding this important hearing, and i want to thank all of the witnesses today for their testimony. mr. williams, i want to say that hearing your story, being in a cell for 30 hours and not knowing why or what you were accused of, is alarming and
12:54 pm
tragic at many levels. as you think about it -- today, when you give your fingerprints, you give permission to have them taken. when you go for a scan, you say, here are my prints. when you give your dna, you give it with permission, knowing that somebody is going to do a dna scan on you. but when it comes to facial recognition, no consent is necessary. in fact, most people do not even know that their information is being collected by somebody, a private entity out there. it was mentioned that 133 million americans -- let me repeat that, 133 million americans -- have their facial recognition data in a database somewhere in this country. and i
12:55 pm
presume most, if not all, give that without consent or even knowing that that data was taken from them. so, professor laurin and professor friedman, i have a question -- we discussed the constitution, our rights, but are there any regulations or laws that limit in any way a private entity's ability to collect facial recognition data? as a taxpayer or citizen, do i have to accept the fact that a private firm has my facial recognition data to sell or use as they wish? professor friedman: if i may? >> go ahead. professor friedman: i want to appreciate the question at a couple of levels, because it is important. it is easy to fall into the
12:56 pm
language of constitutional rights, and those are critically important, but the interpretation of the constitution often does not touch the things we are talking about, in part because the technology is rushing ahead of the courts. you could not be more correct that there is an urgent need for legislation and regulation, as to both private companies and law enforcement agencies. as to the private companies, i do not know of any congressional law that deals with this. for the most part, there is not. and i think the need for legislation is essential. professor laurin: i can only echo what the professor said; he speaks my mind. i think there is a tendency, actually, to sort of overly prioritize the constitutional stuff and to miss the fact that
12:57 pm
actually the constitution is a floor, not a ceiling, in terms of protections. and often the constitutional protection is lower by design, because the courts say there are details for congress to figure out and it is really not our institutional place. >> let me repeat my question. as a citizen, will i have to accept the fact that right now a private third party has my facial recognition information and they can do whatever they want to do with it, sell the information to whomever they want to sell it to, without repercussions, without any regulation, without violating any kind of law? is that the state of the country today? professor laurin: that is an answer i can give you: i do
12:58 pm
think that there are at least some data privacy regimes, statutory regimes, that will limit some of this. but in the main there is little regulation currently, and certainly the constitution has virtually nothing to say about private, nonstate action. and that is why congressional action is so important. >> so i am left here with a very unsettling feeling that i essentially have no remedy for somebody using my facial recognition data for whatever they want to do with it. >> you can create one. congress has the power. you can change that state of affairs, if you choose to. >> thank you, madam chair. i ran out of time. i yield back. >> the individual ran out of
12:59 pm
time. we're also running out of time. this is such an important issue and i am grateful for all of my members. and, as i yield for a moment, i listened to professor friedman over and over about regulation, and that is what we are having this hearing for, because we are working on legislation, and i look forward to working with mr. biggs, but also with our colleagues who are likewise working on legislation to address the question about regulation or a construct that needs to address this question. so, thank you all very much. it is my pleasure to yield to the vice chair of this subcommittee from missouri, congresswoman bush, for five minutes. >> thank you for convening this hearing. we as a committee have spent
1:00 pm
significant time addressing the broad scope of power of the agencies, from the fbi to various prosecutorial entities under the department of justice, and time and time again, we are met with concrete examples of law enforcement directives that undermine our civil liberties. the aclu conducted a study matching members of congress against a criminal database, and 48 congress members matched. imagine what it's like for everyday people across this country. what we know about this technology is that the darker your skin tone, the more likely you are to be falsely identified. following the murder of michael brown junior, i recall wearing bandanas as a response to the teargas.
1:01 pm
it was during that time that we realized we needed to wear bandanas because of the likelihood of facial recognition technology being used against protest movements and surveilling each and every one of us involved in the uprising. 14 agencies reportedly used facial recognition technology to support criminal investigations. yes or no, was it used by federal law enforcement agencies following the torture and murder of george floyd? >> yes. >> what are the collateral consequences of protesters being identified by these technologies? >> we raised the concerns about inaccuracies -- the fact that people were being misidentified by the technology, and that could have adverse consequences.
1:02 pm
a person's face could be out there and used in a number of ways. we looked at the lay of the land for how law enforcement was using the technology. >> thank you. in 2016, it was reported police used facial recognition to surveil crowds during the freddie gray protests without a warrant. is there anything to protect those identities while engaging in first amendment activity? >> there are civil liberties concerns with the technology that we saw not only with the george floyd protests but also with the lapd using ring cameras to surveil black lives matter protesters. we have serious concerns. there are serious civil rights
1:03 pm
and civil liberties concerns about the tracking of protesters, especially in these circumstances and especially when the technology is not well used or very accurate. >> mr. lee, how does the use of facial recognition technology affect the relationship between communities and law enforcement? >> there's a tenuous relationship between law enforcement and communities that are over-policed and over-surveilled. we have seen that time and time again. congress needs to keep that in mind when regulating facial recognition technologies. these are tools used by law enforcement to disproportionately target black and brown people across the nation. we need to keep that in mind -- the civil liberties of those people in mind -- as we think of what the next steps are.
1:04 pm
we think it chills protests and speech. it does not work and misidentifies people. >> thank you so much, mr. lee. there are local solutions that are being [inaudible] [no audio] >> are we able to get ms. bush
1:05 pm
connected? let me -- i think we may have lost congresswoman bush. thank you very much, congresswoman, for those very insightful points that you have raised. i have been impressed by the direction of our questions, the depth of our questions, and certainly the importance of our witnesses. i have questions i'm going to utilize at this time, but i'm happy to yield. you have been rewarded with time. i am happy to yield to you before i make my final
1:06 pm
inquiries, which i hope will be important to the record of this hearing and proceeding. >> thank you, madam chair. i know you will be surprised to hear this -- you know i have many more questions and comments about this. i appreciate those who have participated. but i'm going to limit my comments right there. again, i thank the members, the witnesses, and yourself. thank you, madam chair. i yield back. >> thank you very much. and thank you to the witnesses. i am going to proceed with some questions that i did not proceed with earlier. as i do so, i want to make sure that i thank all of the witnesses again. you will probably hear me thank the witnesses more than once. i would like to submit into the record -- then i will begin
1:07 pm
some questions that i think are very important. i want to submit into the record the gao report that i believe was entitled "facial recognition technology: federal law enforcement agencies should have better awareness of systems used by employees." i think that is shocking, and i think that is also indicative of what dr. alexander made mention of, in terms of the numbers of law enforcement and the utilization of this tool without training. i will, mr. lee, come to you for a broad question on civil rights. but let me ask dr. alexander: 18,000 police departments -- one of the reasons why -- to find regular order that will help
1:08 pm
police, not diminish police -- how do you respond to that idea? and also, to my members, i believe we should have a follow-up hearing with some of the federal law enforcement representatives cited by this excellent report, perhaps in a classified manner. we will be presenting that in the future. dr. alexander, noting that it was determined that federal employees were using this technology without their hierarchy knowing, what do you think that means for the 18,000 police departments, if this technology comes into their possession, and how does that relate to your plan about police/community relationships,
1:09 pm
and mr. williams and his family, the victim, that he can be made whole. >> thank you, madam chair. yeah, there are 18,000 police departments in this country, and oftentimes, we are finding more and more, we've got 18,000 different ways things are being done -- how they are being responded to, how they are being disciplined, how they are being managed, how they are being trained, etc. when you have one organization, and all it takes is just one, quite frankly, the american public sees policing the same. does that make it right? no, but that's the reality of it. if i do something horrible, we'll all have to take that hit, too. to your question, it becomes
1:10 pm
important to recognize that training will have to be regulated at a federal level and there has to be some federal standard, the same way we do with fingerprints, with dna, with other forensic methods that have proven to be of value in the law enforcement and public safety community. because the whole idea about this particular piece of technology is that it is very convoluted, and it is very complicated, and there are going to be cases that go before the courts that we have not even thought about yet, as it relates to this technology, still yet to be examined. i just think, congresswoman, that due diligence would be given to these local agencies across this country, if they have an opportunity to be well trained, if there is
1:11 pm
standardization for utilization of this technology, if there is ongoing certification, and if practices are held to the highest standards of science. in addition to that, i think it becomes important as well that the public understands what facial recognition technology is. because we've talked about it a lot within the context of policing and within congress, but a lot of people in our communities across this country do not understand exactly what it means. and when they do hear about the cases, they hear about the cases that are often associated with negatives, like mr. williams'. >> you don't have to answer -- i'm going to make this point based upon your response. minimally, police departments need to be aware of when the technology is in
1:12 pm
their possession, and of the individuals given the authority to use it. i would like to place that on the record, because the title of the report said that our federal agencies should have better awareness of these systems being used by employees. that is shocking and amazing to me. i think that can be laid out into the field with the other 18,000 police departments. but thank you for your answer. i indicated my concern has been unreasonable search and seizure. i think this was raised by one of our members. i would like a succinct answer from both of you. i don't think we are without tools. we have the ability to look at the federal rules of criminal procedure, but as a former u.s. attorney, in
1:13 pm
cases where an investigation results in prosecution, should prosecutors be required to inform the defendant that facial recognition technology was used to identify them in any part of the process? should it be disclosed, and should positive matches be required to be produced as the investigation goes forward, if it ultimately comes into court? i know one of our members mentioned the brady doctrine. how should we respond to that? >> thank you, chairwoman. the observation is absolutely right that, with respect to federal criminal procedure, there are some very direct interventions that congress can make. while the scope of rule 16 frankly is always sort of a fraught discussion, i think there are important limited
1:14 pm
interventions that can be made in terms of requiring access to facial recognition -- the fact of the search, to alternative matches, to information on confidence levels. as i said, i think to the extent possible, it is essential the defense has access to the algorithms, to what is inside the black box, to be able to evaluate the reliability of the technology. and that can all be a matter of statutory change, without having to worry about the courts sorting through whether it encompasses that. >> the professor is absolutely correct. i agree with everything she just said. i would add to that, unlike a lineup, where a prosecutor is required to hand over
1:15 pm
the alternative individuals who might have been placed in the lineup, and the rationale for the lineup that was utilized, facial recognition technology should require the prosecutor to give defense counsel everything it needs to assess the reliability and admissibility of such evidence. we are in a new ballgame with this technology. it will require more to be given to a defense attorney by a prosecutor under the strictures of constitutional protection. >> it is essential that timing be addressed. because if defense counsel is only able to access this on the eve of trial, it does no good in terms of evaluating whether a guilty plea is appropriate, or in challenging the admissibility of evidence. early discovery is important as well. >> early discovery and early knowledge, is that my understanding?
1:16 pm
thank you so very much. let me quickly -- professor friedman, i want to clarify: when you said regulation, what framework are you speaking of? >> i think what is important here is comprehensive regulation that does two general things. one is to regulate the technology, and the other is -- there's a lot we need to learn. there are a lot of things congress can do immediately to actually test this technology under operational conditions, as law enforcement is using it. because there are very serious concerns about accuracy and bias, particularly racial bias. we get numbers reported, but those numbers are no reflection of how law enforcement is actually using the technology, and what little we know from this suggests those error rates may be quite high. there's a list of things that i
1:17 pm
think congress should study. there is a list of immediate things we could put in place -- for example, best practice protocols about what it means to have a human in the loop to verify what the algorithm does. >> thank you very much. i'm going to finish with mr. lee. you have been keen on the providers -- or, the tech entities that control this technology. from your perspective, what would be appropriate regulation in that arena? >> i work for the heritage foundation. we do not really support regulation, or robust regulation, when it comes to private entities. my solution would be that we have to encourage these companies, the leaders in these companies, to institute those privacy-by-design
1:18 pm
mechanisms -- privacy protections. we should have methods of encryption, which are getting more powerful and better. decentralized models of storage. machine learning. congress can sort of impress upon these companies that they need to build in their data privacy protections. i think that is a good starting point, short of actual regulation. >> in the spirit of my ranking member -- you know this is a bipartisan hearing -- we wanted to make sure we welcomed your voice, as well. which will be very helpful. let me have mr. williams, then mr. lee, you have the last word. mr. williams, i think you have heard from all of us, we are appalled at what happened to you. we are stunned. but we know that there are good
1:19 pm
intentions. i will be introducing some articles where this technology was used. we also heard from dr. alexander that the police out there are working to do positive things. you fell into a very bad trap. what do you want to leave us with, not as a victim yourself, but as a victim of this technology -- something that should not have happened? i will start by apologizing to you and your family, and will be eagerly watching your case, possibly pursued by the aclu, if i am correct. what do you want in your testimony here? you are on mute. >> thank you. first of all, i want to say thank you for inviting me to come and tell my story.
1:20 pm
and i hope that going forward, we can figure out a way to -- like they said, to regulate it. it is already being used, so we need to get it fixed so that it works properly. because in its current state, it is getting it wrong, and you want to get it right. i don't know if that makes sense. like, you don't really have a [indiscernible] -- nobody ever signed up to say, hey, they used my picture from my id. like i said earlier, they didn't
1:21 pm
even use my current driver's license, they used a previous one, where i might've looked younger at that point. i don't know. i guess all the information is in the system, so it is available to use. i was asked a few times about my daughter. i don't know what the lasting effect is -- all i know is at this point, she is upset. i don't know. i'm just -- i hate to say i'm thankful it was a felony larceny. but what if the crime was capital murder or something like that? and then they came to my house
1:22 pm
and arrested me? i don't know if i would've gotten a bond to get out and be free to even talk to you at this point. because the court system is so backed up, they probably wouldn't even have gotten to me yet and i would still be locked up for something i did not do. right? i mean, i kind of cringe to think about it like that, because i did not get to say anything. i didn't have any rights read to me or anything. they showed up, saying my name. and i asked for a warrant. they didn't bring one. i kind of resisted arrest, but i was in my driveway. i was like, what are you all doing?
1:23 pm
he was like, are you mr. williams? you are under arrest. i was like, for what? he was like, i can't tell you. you can't tell me what i am under arrest for? but they took me immediately down to the detention center and i got fingerprinted and mug shotted right then and there. my fingerprints, i thought, would clear me, and i would be out. but that didn't mean anything. my fingerprints went in, so what? now they have a mug shot to add to it. it was so backwards, if that makes sense. i just feel like i am a regular guy. i felt like i knew the law. and i was wrong. >> mr. williams, i would just simply say you are seeking
1:24 pm
justice. and you deserve it, you really do. the families who have been fighting these issues are dealing with the criminal justice system. having listened to mr. williams, i know the position of the moratorium, but just, if you would, end us on the enormity of the problem, if we were to have facial recognition used randomly in this massive entity of 18,000 police departments. what can happen to a person who is lined up like mr. williams? i am shocked that you can't get a reason why you are being arrested. you are entitled to that.
1:25 pm
but they used facial recognition. it didn't look like him when they put the picture up to him. so how is that an undermining of civil rights and civil liberties in this country, if used as it was used with mr. williams? >> madam chair, yes, i think there is a broader context of policing we must keep in mind, which is instructive in talking about the case of mr. williams, but also why many in the leadership conference and in this coalition support a ban or moratorium. we can't separate the tools from the context of the american policing system. we have evidence of that system, over decades, centuries, of how black bodies and
1:26 pm
disproportionately black people are treated within the criminal legal system. we can't separate those tools. with a tool biased against black people and women, misclassifying or misidentifying members of the lgbtq+ community, people with disabilities -- so many members of the community we need to protect because of their protected status, and the history of harm that has been done to their bodies throughout centuries -- then we have to really have a real conversation about whether this technology actually works the way that we think it does. and it doesn't. that proves time and time again, whether we are talking about the use of facial
1:27 pm
recognition technology in communities that are over-policed and over-surveilled, and the communities not involved in the process. there's no public hearing about why it's being used. congress has not been asked, and has not legislated on these topics. this is compounded by the opacity of policing practices in general. community engagement is not a checkbox. it is a continuous conversation. there are secret hearings, classified briefings, but the public does not know the full extent of facial recognition technology. we will continue to consistently
1:28 pm
call for a moratorium or ban on these technologies, and we would be more than happy to work with this committee and congress to imagine a solution and imagine a world of policing that puts the civil rights and civil liberties of marginalized communities first, not last -- not using these same communities as basically test cases when these technologies do not work. >> ok. thank you so very much. thank you so very much. thank you to each of the witnesses for their profound testimony. and thank you to my members, the members of this committee, and ranking member, thank you for joining me on, i believe, a hearing that will be very constructive going forward, as evidenced by the questions and responses and the expert witnesses we had today. this concludes today's hearing. i would like to submit into the record the problem of
1:29 pm
bias in facial recognition, dated april 2, 2021, and the washington post article responding to the study. i want those articles submitted into the record. i thank our witnesses for their attendance and participation. without objection, all members will have five legislative days to submit additional written questions for the witnesses or additional materials for the record. again, this hearing is adjourned. thank you.
