
Hearing on Digital Privacy | CSPAN | April 8, 2022, 1:26am-3:48am EDT

1:26 am
agency for data protection. this virtual hearing is about 2:20. we're here today to confront one of the central challenges of our digital age. as nearly every facet of our society and economy has moved online, ever greater amounts of digital data are being collected, spanning
1:27 am
who we are, to what we like, to who we communicate with, to what we do, often in minute detail. while this has some productive and valuable applications, it also creates new risks. privacy as it's conventionally understood means control over one's own personal information and the freedom to decide what to share, with whom, and how. we're all familiar with the classic privacy harms, whether it's government surveillance, identity theft and other economic crimes, or the social and economic harm when some details become public against our will. however, in some ways privacy
1:28 am
has become a catch-all term for a larger and even more profound set of potential risks, some of which are truly novel or fundamentally different in the digital age. what does it mean for us as individuals when both the government and private companies can amass large profiles about each of us and use them to predict and even potentially manipulate our behavior? what does it mean for our economy when some of the most popular online products and services revolve around opaque data collection and personalization? what does it mean for our democracy when these data practices shape the public discourse and the flow of news and information? and what does it mean that as individuals and internet users we have little or no meaningful visibility into how any of this really works, let alone genuine control over it? the american public has long been deeply concerned about
1:29 am
these questions. in 2020 the pew research center found half of americans had limited specific online activities out of privacy fears. another poll found that 88% are frustrated they don't have more control over personal information online. and i'll note these concerns cut across economic, racial, geographic, and partisan lines. for all these reasons, the time for a comprehensive federal policy framework is long overdue. there have been many proposals and lots of talk but little or no action or clear progress. representatives have introduced the online privacy act. this sets forth the most thorough and aggressive set of privacy rights among the major proposals, with the clearest restrictions on a wide variety of abusive and troubling data
1:30 am
practices and the strongest mechanisms to enforce the law and make digital privacy rights more than an abstraction. it contains a set of privacy obligations on legislative branch agencies, requiring the library of congress, the smithsonian, and the chief administrative officer of the house to each identify concrete privacy risks in their operations and take remedial steps. many of the risks and data practices cut across the public and private sectors. we can determine the best ways to address them. this can reinvigorate attention on digital privacy for major reforms. with that i'm pleased to
1:31 am
recognize congressman rodney davis for any comments that he would like to make. welcome, rodney. >> thank you for joining us for this hearing, a hearing on a rare topic but an important one. privacy discussions on creating nationwide privacy standards need to take place in the primary committees of jurisdiction, energy and commerce and judiciary. it's clear to me this discussion should be led by those jurisdictions, and we should let them work out the policy details before we consider application to the legislative branch. 304 bills have been referred to this committee by the house. we are devoting our time today to this topic, and it's rare we're even meeting on potential legislation this congress. we have had exactly one other hearing on the policy underlying a bill and exactly zero markups.
1:32 am
further, this committee has not held comprehensive standard oversight hearings in nearly three years. for us not even to have a discussion about fec or eac oversight hearings in an election year is shocking to me. i sent a letter to the chair asking for this committee to conduct oversight hearings with each of our agencies. i still have not received a response from the chair. i look forward to this committee having more oversight hearings soon to address the concerns that fall more directly within this committee's jurisdiction. regarding the topic at issue today, the call for a workable national framework, increased transparency, and overall accountability in the technology industry is all but universal at this point. data privacy is at the forefront because it affects everyone. we all see the need, and as a parent of three young adults, i believe this issue not only
1:33 am
needs congress' attention but demands it. all anyone has to do is watch the news to see how data privacy affects everyone. last week, filings by special counsel durham alleged that in 2016 democrat operatives, perhaps including mark elias, in his attempts to challenge the legitimacy of elections, sought to exploit their access to nonpublic and/or proprietary internet data to create a narrative tying president trump to russia, and to infiltrate trump tower and white house servers to link trump to russia. talk about election subversion. unbelievable. so a responsible and workable nationwide solution for data privacy that can protect americans' privacy and also curb bad political actors from attempting to subvert our political discourse is
1:34 am
necessary. i know our chair has a special interest in this topic, and i imagine her desire to raise the profile of her legislation is why we're here today. i spoke with cathy mcmorris rodgers, since her committee would be primary, and she offered a lot of good thoughts, things i assume she would say if this hearing were before her committee, where it belongs first. we share more personal information online now than ever before. as the internet has grown, so have the threats to our personal security. privacy does not end at state lines; this is a nationwide commerce issue that requires a congressional response. americans cannot rely on conflicting state laws to keep information safe.
1:35 am
we must ensure any congressional response is forward thinking. solutions that do not account for burdens placed on small businesses and innovators, such as the ccpa in california, are doomed to fail. finally, those who misuse information must be held accountable. there are proposals that call for a federal agency to enforce its provisions. the federal trade commission already has jurisdiction over these matters, as well as a long-standing relationship with the states' ags. creating a new agency to do the ftc's job is bureaucratic waste at the highest level. americans' call for a comprehensive solution to big data privacy issues is loud and clear. they deserve an answer, brought about with transparency and accountability. even though this committee isn't
1:36 am
where we should be hosting this portion of this discussion, i'm looking forward to today's panels, and i yield back. >> the gentleman yields back. i would note for the record the committee's jurisdiction is generally determined by house rules, specifically house rule ten, and this committee along with other committees has been assigned jurisdiction. i want to thank the ranking member for his comments, and i would like to recognize our first witness, who is someone who has led the u.s. government publishing office with great distinction of late. he is the 28th person to lead the gpo since it opened its doors in 1861, and has served in this role since
1:37 am
2019. this month, the gpo was selected as one of the best employers in the country, ranking first on "forbes'" list of mid-sized agencies, and i do think that is a reflection on your leadership as well as your fine staff at the gpo. as you know, this is not your first time to be a witness. your entire written statement will be made part of the record, and we would ask that you summarize your testimony in about five minutes. you are now recognized for five minutes. >> thank you so much, and thank you for acknowledging the achievement of making that "forbes" list. i'll take some credit if it's assigned, but it belongs to the people here at gpo. thank you so much for having me
1:38 am
today, both the chair and the ranking member and all of the members of the committee. i appreciate the opportunity to testify about gpo's approach to handling personally identifiable information, commonly known as pii. on behalf of gpo's more than 1,500 craftspeople and professionals, know that we take our responsibility seriously to protect pii every day. gpo operates as a hybrid. it's a government agency that operates as a business, complete with customers, products, and profit and loss. thus we face many of the same kinds of problems facing private sector firms when it comes to the handling of pii. gpo is entrusted with pii belonging to our teammates, our customers, and, by the nature of our business, the general public. robust protection of pii is
1:39 am
critical to building trust with our customers and stakeholders. without that trust we can never achieve our vision of an america informed. gpo operates a privacy program overseen by our privacy officer and our information technology business unit. this program establishes a framework for protecting pii regardless of how it is stored. the program has several fundamental principles. first, access to pii is limited to only those agency teammates and contractors with a specific need. second, each business unit has someone who is responsible for the privacy function and answers to that business unit's leadership. third, any gpo teammate or contractor who suspects a breach in pii security is
1:40 am
obligated to report the problem as soon as possible. and, fourth, violations will be addressed by appropriate corrective action, including termination for our teammates, and criminal prosecution if appropriate. while my written testimony goes into more detail as to how this program works, today i would like to talk about three separate examples of privacy issues at gpo and how we deal with them. first, gpo and its contractors produce materials that require the use of pii. gpo's security and intelligent documents unit manufactures trusted traveler smart cards such as the global entry card used at border crossings for identification and for expedited processing.
1:41 am
gpo must handle vast amounts of highly sensitive pii within our systems. gpo works closely with its customer to maintain a series of firewalls that ensure gpo only receives encrypted pii that is only decrypted when it is needed to produce and distribute those cards. when gpo finishes production of the product, that pii is scrubbed from our systems. you can't leak information you don't have. second, there are vast amounts of pii included in our publications such as the congressional record and the federal register. the good news, particularly with the federal register, is that most of that pii is contained in historic volumes and very little is being published currently.
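the data-handling pattern the director describes here -- accept pii only in encrypted form, decrypt it just long enough to produce the product, then scrub it -- is a standard data-minimization technique. a minimal sketch in python follows; the xor keystream is a toy stand-in for real encryption, and the key and card data are invented for illustration, not drawn from gpo's actual systems:

```python
import hashlib
from contextlib import contextmanager

def _keystream(key: bytes, n: int) -> bytes:
    # Toy keystream derived from SHA-256; a stand-in for a real cipher.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

decrypt = encrypt  # XOR stream: the same operation in both directions

@contextmanager
def pii_for_production(key: bytes, ciphertext: bytes):
    """Decrypt PII only for the duration of the job, then zero the buffer."""
    buf = bytearray(decrypt(key, ciphertext))
    try:
        yield buf
    finally:
        for i in range(len(buf)):  # scrub: you can't leak what you don't have
            buf[i] = 0

# The "customer" sends only ciphertext across the firewall (invented data).
key = b"shared-key-established-out-of-band"
ciphertext = encrypt(key, b"TRAVELER: Jane Q. Public, DOB 1980-01-01")

with pii_for_production(key, ciphertext) as pii:
    # Stand-in for personalizing the smart card. A real system would also
    # have to avoid plaintext copies like this decoded string.
    assert b"Jane Q. Public" in bytes(pii)
# After the block, the plaintext buffer has been wiped.
```

the point of the context manager is that the plaintext's lifetime is bounded by the production step itself, mirroring the testimony's "scrubbed from our systems" step.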
1:42 am
where it is, we have both automated and manual systems to redact that information from our digital collections when we find it. the bigger challenge is with historic information, particularly as we work to digitize older collections. for instance, at one time it was common for the department of defense to include military officers' social security numbers on promotion lists later printed in the congressional record. as we digitize volumes that contain that information, we scan for it and redact that pii before posting it on govinfo, our certified, trusted digital repository. and while we alert our partners when we find pii in tangible collections we don't possess ourselves, we do not have the resources to visit every
1:43 am
collection across the country and manually redact that information. the third and final category are instances where the information is not quite pii but is information people don't want easily accessible on the internet. this comes up with our collection of u.s. court opinions on govinfo, one of our largest collections of data. some parties to federal court cases may not want that information to be easily accessible, and they often come to gpo asking us to remove it from our collection. it's important to remember gpo is only making that information accessible; it is not our data. we petition the owner of the data to have the opinion sealed
1:44 am
and removed from our systems if the judge determines the public disclosure is problematic. i hope that this overview of our operations was helpful on this critical issue. i look forward to answering any questions you may have and assisting the committee with its inquiry. thank you so much. >> thank you so much, director halpern. now is the time members can ask questions, and we'll turn first to the ranking member, mr. davis, for five minutes of his questions. >> you'd think i would know how to work the mute button, madam chair. i apologize for the delay. hugh, great to see you, man. >> you, too. >> i looked, and the last time you were called to testify before this committee was march 3 of 2020, nearly two years ago. it's nice to finally have you back, but i wish it was to get a
1:45 am
comprehensive update on gpo's operations, the initiatives you've launched, and successes, as well as the agency's health post pandemic. unfortunately, we're here to talk about a topic, one the gpo clearly excels at, and that's protecting the pii, personally identifiable information, you spoke about, and its work to print passports, global entry cards and other sensitive projects. in your testimony you shared the impressive work gpo has undertaken to follow the recommendations laid out by the national institute of standards and technology in their guide to protecting the confidentiality of pii. do you believe you as director of gpo have the necessary authority to continue the high level of excellence in protecting pii? >> the short answer is at the moment i think we do. we've got the authority we need
1:46 am
to work with our customers to design systems to protect the pii they give us. and in our relationship with our own vendors, whether that's a contractor that we hire on behalf of the executive branch for work that they're doing for the irs or other government agencies. we are able to construct systems that protect that level of pii. one of the things that we have to remain really vigilant about is the pii of our own teammates and, frankly, if there was a large body of pii that the agency maintains, that's it. it's our payroll information, our time and attendance information. it's our personnel files. and we work both with folks here in the building and with our
1:47 am
vendors over at the department of agriculture who handle our payroll and other systems to make sure that information is protected. and we're doing our level best to protect the nearly 1,600 people who work here from running into any problems as well as the information of anybody who is working with gpo or working with our customers. >> great. how often is gpo's privacy program re-assessed or updated to remain at this high level of effectiveness? >> so it's constantly undergoing assessment. the directive that governs the privacy program is supposed to be reviewed, i believe, every three years. but we are constantly going through our policies, our procedures and our actual practices to make sure that we're doing our level best, and
1:48 am
that's both on the digital side, where we have an extremely competent team on our information technology team, but, also, with our safety and security team. now, i get an update three, four times a year as our safety and security team walks the building to make sure that things are shipshape, and i can tell you they have flagged folks who have inadvertently left pii sitting on their desk without having it stored properly. we need to be proactive, whether it's digitally or tangibly here in the real world. >> i couldn't agree more. what has gpo learned from the opm breach that happened several years ago? >> similar to the lessons a lot of government agencies are learning. we've got to secure our systems
1:49 am
and we spend a lot of resources. we have a dedicated appropriation for that purpose and it requires a lot of vigilance on our part. knock on wood, thank god certainly during my tenure here we have not had anything that even indicated folks were close to a breach of those kinds of systems, and it's something we dedicate a lot of resources to every single day and the one thing that the opm breach, i think, taught all of us is you can't rest on your laurels. security needs to be everybody's job every single day. >> well, thanks, hugh. i yield back. >> i now recognize mr. raskin for his questions. >> thank you, madam chair, and for calling this important hearing.
1:50 am
ms. fitzgerald, discrete bits of personal information can seem relatively harmless or trivial, but when they're all combined they can have a far more sweeping impact and create special dangers. what lessons does this point contain for privacy regulation? what are the biggest risks that congress should be considering in terms of the large-scale aggregation of data? >> i would note that ms. fitzgerald is on our second panel, so -- >> oh, i'm sorry. i thought i saw her there. forgive me. i will come to mr. halpern. what are the greatest privacy risks that gpo faces in its operations? what are the specific harms in the area you are most focused on right now? >> as i was saying, i think the largest single trove we maintain
1:51 am
over a long -- over a longer period of time is our employees' own data. and that's something we are focused very, very closely on protecting. for most of the other places where we have pii and we need to focus on that, we have developed systems to make sure that we are holding on to that pii for the minimum amount of time possible. so, for instance, in the example i used with the global entry card, we literally have a contractor sitting on the other side of the wall from our smart card manufacturing facility who will send the encrypted data that we'll need to produce those cards into the smart card room. we will utilize that information to produce those cards and then scrub that from our systems, send it back to the customer
1:52 am
once we're done with it, so we're not holding on to any more data than we need. our biggest problem in terms of other folks' data is really historical data. as we fulfill, i think, the greater public good of working backwards and trying to digitize old records, we're going to find more pii because, frankly, 20, 30 years ago, folks were a lot less conscious about what they were putting in a congressional hearing or the congressional record or the federal register. it was harder for somebody to get that information and then to do something with it because the internet wasn't developed as it is now. we have a program in place to scan for pii in the older documents, and as we digitize
1:53 am
them we're working to redact that from our digital collections. >> thank you for that. i know that gpo routinely publishes documents and information that originate elsewhere in the federal government. so you are not the original custodian and source of all the information you publish. how does this affect your privacy policies and practices? >> so we are often the middle person in this transaction. and the good example is the one i gave in my testimony about u.s. court opinions. as you can well imagine there are a lot of opinions that folks may not be happy are out there in the world or easily searchable. and while that serves a greater public good, there is that problem with maybe having an
1:54 am
opinion where there's information in there that somebody is uncomfortable with having out in the public. what we've done is have a number of different places -- if you go to our website, it talks about our entire privacy program, including who you need to talk to over at the administrative office of the u.s. courts to deal with a court document that may have issues. our askgpo system, which is now powered by a modern contact management system, also has links so that if folks find an opinion or something like that that raises an issue, they can ask about it, and we can route that to the proper person and make sure they get the right information to talk to the administrative office of the u.s. courts to get
1:55 am
that particular record sealed if that's appropriate. >> thank you very much. i yield back, madam chair. >> thank you very much. and i will recognize the gentleman from georgia now for his questions. mr. loudermilk. >> thank you, madam chair. it's good to see you again, and i appreciate the work you're doing at gpo. you said something a few minutes ago that was music to my ears, and i've been saying this ever since i've been in congress. it's a lesson i learned working in intelligence, on the i.t. side of intelligence, when i was in the air force. one of the primary principles that was just engraved into our -- every policy we have is, you don't have to secure what you don't have. if you don't need it, then discard it. and we had procedures built around that. to analyze data, whether we needed it or not, we had to justify keeping it, and then it
1:56 am
was destroyed. i've been saying this since i've been in congress, especially on the financial services committee as we're dealing with these same issues. the problem is there are many in congress and in the federal government that aren't applying that principle. and so much legislation is being passed that is getting the federal government to grab more data or forcing businesses to acquire more data on individuals. and somebody's keeping that and storing it. it's data that does not need to be kept. so i appreciate hearing you say that, because there are very few government officials or, quite frankly, people in congress that you ever hear say that. if you don't need it, don't keep it. so has gpo experienced any data breaches or identified attempted attacks in recent years? >> so, yes on the attack side. no, on the breach side.
1:57 am
the good news is my i.t. security folks are doing their job, and to the best of our knowledge, we have not had a breach, certainly during my tenure or even in the years leading up to it. shortly after i became director of gpo, i still remember it, i was actually at an event on a saturday night at my synagogue when i suddenly learned that somebody that we believe was affiliated with the iranians had defaced the front of one of our sites. the good news is, that's how far they got. we were able to restore that site. we are in the process of moving that over to a more secure
1:58 am
infrastructure. and, you know, the good news is that my team has a really good process in place to make sure that our systems are secure. as updates come out, they are tested thoroughly before we do those upgrades. so we don't necessarily have the exposures you might have from unknown vulnerabilities. so we really work hard to try to get that right. >> well, it sounds like you are doing a good job at that, because if you would have answered my question, well, we haven't had any data breaches and no attempted attacks, i would have said, then you're not doing your job, because i guarantee you, you're being attacked, and if you don't know it, then there's something wrong. also the updates, they're crucial. that's what caused the equifax data breach, simply a failure in the i.t. department to do a firmware update that needed to be done. so, you know, i'm really comfortable with the answers
1:59 am
you've given, and that's why we haven't seen any data breaches. in the 30 years i spent in i.t. and data security, i can tell you, the question isn't if you're going to be hacked, it's when. it's going to happen at some point. and it's a race to stay just ahead of the bad guy. i use the analogy, when i used to hunt in alaska, we would go bear hunting, and somebody would inevitably put on a pair of tennis shoes and say, if i see a grizzly, i want to have my tennis shoes on. we would say, you won't outrun a grizzly. they'd say, no, but i'll outrun you. if they can't get to you, they'll go to somebody else. it sounds like i've already heard the answer to this question, but do you have all the tools you need to keep carrying out this level of privacy and security? are there additional things we can do in congress to help?
2:00 am
>> at the moment, i think we're in pretty good shape. we have the flexibility, we have the tools that we need. obviously it's always a question of resources. but as we start automating processes here at gpo, we are really focused on figuring out how we make our dollars go farther. and that's really been one of the hallmarks of gpo all through history. so we're putting that emphasis on security, and trying to do it with tools we've got. and where we don't have tools, we make the investments we need. to some extent, that's one of our advantages. only about 12% of our total revenue comes from appropriations. the rest is what we charge our
2:01 am
customers. so the state department expects us to keep their passports fully secure, and we spend a lot of money doing that. but the state department reimburses us for those investments, and the same with our other customers. so we're pretty comfortable with where we are. but we're always happy to talk about things and figure out how to do better. >> this is an ever-changing environment. it's a job that is never done, because once you think you're secure, if you're not staying on top of it, the bad guys will beat us. thank you for all you're doing. madam chair, i yield back. >> thank you for yielding back. mr. aguilar is recognized. >> thank you, madam chair. thanks, director halpern, for the update and information. a lot to review here, and i appreciate your commitment to safeguarding that pii. what i was kind of interested in is -- and you talked about some of the digital versus the
2:02 am
historical documents that you are in charge of protecting. your testimony kind of speaks to that and says that 97% of federal documents now are born digital. so what distinctions does gpo make between digital versus paper documents in terms of privacy risks and how they can be addressed by your team and by the contractors, and how has that transition been to the digital age and the privacy risks that are associated with that on your operational side? >> thanks so much, sir, for that. the short answer is, for new documents that we produce, we have a pretty airtight system for scanning those documents as we're producing them, looking for pii. as i mentioned, with the federal register, they're under the opm policies regarding pii. so there's not a lot that's
2:03 am
coming in. we do need to check the congressional record, and we work very hard to redact any pii that might appear there. i think even congress is being more circumspect about that stuff going forward. our biggest problem is stuff that exists in a tangible form, so printed paper, that was printed historically, 20, 30 years ago or even a lot longer. and what we're running into are two issues there. one is, as we're trying to digitize that information, so we're taking committee hearings from the '70s or '80s and trying to digitize those and put them online, both our contractor and our library team are looking at
2:04 am
those documents. where we see or catch pii, we're redacting that from the digital copies that go online. when it comes to those paper documents, we simply don't have the capacity to literally travel the world, to find every copy of the congressional record from the late 1800s forward, check those volumes for pii, and redact them. so we are in some respects reliant on our partners, the federal depository libraries and a lot of other libraries across the world where this information goes. but that said, as i mentioned before, i think because that information is comparatively hard to access, you need to go to a physical place and you need to open a physical volume to
2:05 am
find that. because it's harder to access, that's actually a little bit lower of a threat than what's going on out in the digital domain. it's not a zero threat, and where we find pii, we make an effort to try and redact it. but we just don't have the resources in terms of people, money, and all of those kinds of things to do a full scrub of every place where one of these paper documents might exist in the world. >> understood. and how do you kind of keep -- what's the digital hygiene, how do you stay ahead of the evolving threats and the digital privacy landscape piece of it? is it through opm guidance? is it through contractors? or how do you kind of stay ahead of the risks that are in front of you? >> so one of the great things
2:06 am
about being a legislative branch agency is we can take the best practices across the government. we are not necessarily confined by what opm provides. i know our i.t. security team both works with the other cisos here in the legislative branch as well as broader across the government. as we do work for our other federal customers, we're obviously talking to them, figuring out what the requirements are and how we can meet those. so there's a variety of avenues for us to be exposed to the sort of changing landscape and what folks' requirements are. we're always going to make sure our systems are designed in such a way as to protect the security of those documents. whether it's producing smart cards, whether it's the u.s.
2:07 am
passport, one of the reasons they come to gpo is because we can produce those documents securely, in government facilities, with the highest quality you're going to find any place in the world, and do it at a reasonable price, in a reasonable time frame. so we think we have some advantages there. >> thank you so much, i appreciate it. i yield back, madam chair. >> thank you. mr. steil is now recognized. >> thank you very much, madam chairwoman. mr. halpern, i appreciate you being here today. a long day for all of us; we're all looking forward to getting back in person before too long. i appreciate you being here. maybe in a future hearing we'll get an opportunity to talk to some of your legislative branch colleagues on what is an absolutely critical issue. i can't tell you how often i hear from folks across wisconsin about the importance of data privacy. they're thinking about it, they're thinking about being hacked. there's a timely politico article that came out just a
2:08 am
little bit ago talking about reporters being hacked through apps. it is nonstop that this is something we're talking about, and i think it's important that we're talking about it here. the proposed piece of legislation that would mandate gpo action in the data privacy space, the online privacy act of 2021 that we're talking about, were you consulted in the drafting of the legislation, specifically section v? >> we were not consulted at the front end, although we did have subsequent discussions with the committee. from our reading, it's a fairly flexible standard in the proposed legislation. it's something we think we can meet with the program that we're running today. and it will give us an opportunity to take a look and make sure, if there are any places where we need to tighten things up, we can definitely do that. >> understood, i appreciate your commentary on that, that's
2:09 am
helpful. under national data privacy laws, there's a balance between minimizing compliance costs for small businesses and providing appropriate levels of protection to individuals. can you kind of comment on how you've tried to strike that balance? >> so for most of our interactions with small businesses, that's through our print procurement program, and the requirements there are largely driven by our customers. now, that said, there are any number of small and medium-sized businesses that are able to participate in our print procurement program. to give you an idea, i was just up in pennsylvania, i think it was a few months ago, and we visited with one of our
contractors that does a lot of work with gpo. and we used that contractor specifically because they have a great system for keeping track of information and keeping track of documents with sensitive information in them, and give us a great paper trail. and they do a really fantastic job of handling those kinds of jobs for us and doing it in a way where we're confident that that information is safe and secure. and i think there are some other companies across the country that can do that work for us. there aren't a ton, to be 100% honest with you. when we have run into problems where, say, one contractor has experienced labor shortages, which isn't something that's uncommon these days, there aren't a lot of other places to
go. but some of the small and medium size businesses we've worked with have gotten really, really good at having internal systems that can protect that information. and frankly, sometimes they do a better job of it than some of our larger contractors. >> i appreciate that feedback, mr. halpern. madam chairwoman, i yield back. >> yields back. ms. scanlon is recognized. >> thank you, madam chair. mr. halpern, what remedial measures does the gpo have in place in the event that one of its publications contains nonpublic personal information about a private individual, especially if it's of a sensitive or otherwise potentially harmful nature? >> absolutely. as i mentioned, we have both automated and manual systems to redact information that is found
in our digital collection. for instance, let's just say for the sake of argument, something slipped through tonight's edition of the congressional record, there was something that had some pii in it, and all of our checks failed. the good news is, when that's brought to our attention, our team can go in, change the record in our trusted digital repository, and redact that information out of the digital record. we can't really fix the printed copies. but the volume of those copies for something, say, like the congressional record, is much smaller than it was even a decade, decade and a half ago. to give you an idea, in the 1990s, the daily circulation of the congressional record was north of 20,000 copies a day.
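the automated redaction checks director halpern describes can be sketched in outline. this is a hypothetical illustration, not gpo's actual system: the patterns, the redaction marker, and the function names are assumptions, and a real pipeline pairs pattern matching like this with the manual review he mentions.

```python
import re

# Illustrative patterns for two common kinds of high-impact PII.
# These are assumptions for the sketch, not GPO's actual rules.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")          # social security numbers
CARD = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")    # 16-digit card numbers

def redact(text: str, marker: str = "[REDACTED]") -> str:
    """Replace anything matching a PII pattern with a fixed marker."""
    text = SSN.sub(marker, text)
    text = CARD.sub(marker, text)
    return text

def contains_pii(text: str) -> bool:
    """Flag a record for manual review if any pattern matches."""
    return bool(SSN.search(text) or CARD.search(text))
```

in practice a flagged record would go to a human reviewer before the digital repository copy is changed; the automated pass only narrows the haystack.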
today we're only producing about 1,500 copies a day. so it's a much smaller universe in the printed world. and we have tools in place to fix any problems that might occur on the back end through our digital systems. >> i mean, when you were talking about the court systems, that kind of caught my attention, because that's where i spent my precongressional career, and certainly judges will talk about a lot of facts as they're ruling on a case. so, you know, would an affected individual have the right or ability to get personally identifiable information deleted, redacted? >> the way it works right now, it is not our process, it's the courts' process. they can petition the administrative office of the u.s. courts. and it's up to the judge to decide whether or not to seal those particular records or
parts of a record. and depending on that decision, we'll either pull the record out or put in a redacted version, depending on what the constraints are of the judge's order. as i mentioned in my testimony, this is sort of that gray area. it doesn't necessarily fall into the category of the standard pii, people's social security numbers, other key identifiers. but it may describe facts that somebody's uncomfortable with. and in those kinds of cases, we're not going to be the best folks to make a decision as to whether that needs to stay in the public domain or it needs to be removed. the judge who was involved in that case or another judge really needs to be the person
who makes that decision. and depending on what that decision is, we will act accordingly. >> if it was something within your domain, is there currently a right of an individual to be able to get personally identifiable information redacted? if a credit card number slipped through, is there a right to have it removed? >> i can get back to you on what the statutory background is on that. i can tell you for gpo's policies and directives, if we come across that information and have the ability to redact it primarily from our digital collection, we're going to do so. and if somebody brings that to our attention, it may not be the person whose information that is. it could be a third party. but if we become aware of that and we -- you know, we agree
that that's pii, we'll make the effort to redact that from the digital side. >> okay. thank you. madam chair, i yield back. >> thank you. i now recognize the gentlelady from new mexico. >> thank you so very much, chairwoman lofgren. and thank you for holding this hearing, because i can tell you that this issue comes up a lot as i talk to my constituents. they are concerned, and we are concerned with regards to the manner in which our data is protected. dr. halpern, i truly appreciated your comments on how protection of our personal data held by the government is linked to people's trust in our government. and i think you would agree that this link will only grow stronger as more of our lives move into the digital space. i also appreciated your comments on how gpo and other federal
agencies are discussing best practices with each other. do you believe that -- can you elaborate a bit more on what we are doing here at the legislative branch that we might strengthen based on the information you're hearing and sharing with the other federal agencies so that we can make sure we have those protections against the cyberattacks, as well as protecting individuals' personal information as you described it? >> sure, i'll do my best. it happens on a number of different axes, is probably the best way to think of it. as i mentioned before, our computer information security folks have a lot of back and forth and there's a lot of discussion about what are those best practices, how do we secure our systems. and frankly, particularly with
gpo and the congress, there's a lot of interchange that occurs on a daily basis. you're sending us information, we're sending it back to you, and we're working with the library to make things available, so there's a lot of discussion about those particular flows. and frankly, what our cisos are experiencing day to day. where are these attacks coming from, how do you secure against them, and what are the appropriate countermeasures. the other area that we're working on is our obligation as an employer. so i just got the update this morning, right now we're at 1,547 people working directly for gpo, plus a few hundred
contractors. we have an obligation to secure that information, payroll information, health information, all sorts of things. the chief administrative officer of the house has a similar obligation. so our officials are talking all the time about how we maintain our systems, both from the cyber standpoint and from the more mundane operational standpoint. and as i mentioned before, my safety and security team literally walks our building. we have a million square feet under roof, and we watch every inch of that building. and one of the things that they look for is, is there pii not being stored properly. that's a little bit harder to do in the congressional space. i spent 30 years working in the house, so it's a little bit different there.
but, you know, some of those same practices, whether it's the chief administrative officer or the secretary of the senate or the house sergeant at arms, all of us are talking all the time to try to figure out what's the best way to go forward, and there's a lot of information-sharing going on. >> thank you so much. i take it you have no control over, you know, the fact that -- and i think we'll hear testimony that every search is saved and mined, right? so as individuals might be searching within the congressional record for information, that -- tell me, do you have any way of protecting any of those searches? or is that something that's left to the individual with regards to his or her computer use and privacy? >> so it's a great question. and i will get you a more detailed answer as to the information we retain and don't retain. the short answer is, we try to retain the minimum amount of
information we possibly can about searches in govinfo or any of our other systems. so, you know, for instance, in the case of govinfo, we use the smallest, most mundane kinds of cookies we possibly can. we don't track people across the internet. and frankly, we don't have log-ins. we don't persistently save those searches for folks. >> i see my time has expired. i do thank you and your organization for respecting, as is proper, the privacy of the users in the documents and in the searches. i yield back, madam chair. >> thank you. just a couple of further questions. first, director halpern, i just want to commend you and your staff for your proactive work
protecting privacy. it's really admirable. and, you know, in your statement you said you can't -- if you don't save the material, it can't be abused. and that's actually the principle of the online privacy act, to really prevent the unnecessary retention of information, because if you don't have it, you can't data mine it, you can't abuse it. and that's what you're doing in your agency, and something that i think we want to look at. because if you're using a search engine to search something in the congressional record, you know, the search engine retains it even though you don't. so government -- you're doing the right thing, but people's searches or inquiries can still be abused and manipulated.
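the data minimization principle the chair describes here, if you don't have it you can't abuse it, can be sketched as a logging policy. this is an illustrative assumption, not how govinfo actually works: the class and method names are invented, and the point is simply that both the query text and the user identifier are discarded at write time, leaving only an aggregate count.

```python
from collections import Counter
from datetime import date

class MinimalSearchLog:
    """Hypothetical minimal-retention search log: counts only, no content."""

    def __init__(self):
        self._daily_counts = Counter()  # date -> number of searches that day

    def record_search(self, query: str, user_ip: str) -> None:
        # Deliberately discard both the query and the IP address;
        # all that is retained is that *a* search happened today.
        self._daily_counts[date.today()] += 1

    def searches_today(self) -> int:
        return self._daily_counts[date.today()]
```

because the query and identifier never hit storage, there is nothing for a breach, a subpoena, or a data broker to recover later.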
just two quick questions. first, really you're doing everything with nist and omb guidance and your directives, you're doing that because you want to do the right thing. but it's not -- your policies are not required by law at this time, are they? >> no. and the short answer is, it's one of the blessings and curses of being in the legislative branch. i actually -- i'm a big article i guy, so i actually like the flexibility that this gives us to survey the landscape and pick the policies that meet our values. and our values are, we're going to protect our customer. we're going to be transparent with folks. and we're going to do everything we do, whether it's with our folks here in the building or
our customers, in a respectful way. and part of being respectful is protecting that data. and, you know, the good news is, while there may not be pure statutory requirements that say you have to do this, one, we believe that our oversight committees expect us to do this, and it's the right thing to expect. and two, our customers expect us to do it. and both of those are driving imperatives for us here at gpo. and it helps us get into the position where it's easier for us to do the right thing that we were planning to do anyway. >> just one final question. in your written testimony, you mentioned that gpo differentiates between high impact and low
impact personally identifiable information. and i'm interested in how or whether you reevaluate the level of risk for different types of pii that may change over time. and as you classify that, do you consider the potential impact when information is aggregated by a third party in combination with other personally identifiable information, for example through the use of machine learning or other data processing techniques that are outside of your agency? >> we are always trying to learn. and, you know, there are a lot of very clever folks all across the country who are coming up with new ways to use that sort of ubiquitous data that's out there. we do draw a distinction between certain kinds of pii. so for instance, those unique
identifiers, your social security number, the card numbers that -- or passport numbers that the department of state or department of homeland security uses, those are really key pieces of information that have the potential to unlock so much else, including biometrics and a whole slew of other things. we're going to work much harder to protect that information. lower impact data are things like addresses, photos, name. that stuff is really kind of ubiquitous. so for instance, when you do a special order and you're congratulating the coach of your high school football team, you're going to call that person by their name. that is pii. but we're not going to redact that from the congressional record because that, one, isn't a huge security risk, and two,
if we did, it totally eviscerates the point of that special order. >> exactly. i want to thank you, director halpern, for your great testimony today. as you know, from your years here on the hill, we do keep the record open if we have additional questions. if that occurs, we'll send them right on to you and we know you'll get back to us. i want to thank you for being here with us today, it's been very informative. please let your team know how much we think of them and how proud we are of them. >> thank you very much, ma'am, i really appreciate it. >> we will go to our second panel of witnesses. first we have shoshana zuboff who is the charles edward wilson professor emerita at harvard
business school, an internationally recognized expert and the author of several major works on digital privacy issues such as "in the age of the smart machine: the future of work and power" and "the support economy: why corporations are failing individuals and the next episode of capitalism." her most recent book, "the age of surveillance capitalism: the fight for a human future at the new frontier of power" investigates the new surveillance economy which is driven by consumer data. our next witness is caitriona fitzgerald, deputy director at the electronic privacy information center or e.p.i.c. she leads e.p.i.c.'s policy work. she recently authored "grading on a curve: privacy legislation in the 116th congress," which
sets out the key elements of modern privacy law, including the creation of a u.s. data protection agency. the next witness is marshall erwin who is the chief security officer of mozilla, where he focuses on data security, privacy, and surveillance. he has previously worked as a counterterrorism and cybersecurity analyst, and has served as the counterterrorism and intelligence adviser to senator susan collins on the senate homeland security and governmental affairs committee and was the intelligence specialist at the congressional research service. and finally we have daniel castro, vice president of the information technology and innovation foundation or itif, and the director of the center for data innovation, an itif-affiliated research institute focusing on the intersection of data, technology, and public policy.
i'll remind the witnesses that your entire written statements, which by the way are excellent, will be included and made part of our permanent record, and that that record will remain open for at least five days for additional material to be submitted. and we ask that you summarize your testimony in about five minutes so that the members of the committee will have time to pose questions to you. so first, professor zuboff, we would love to hear from you. >> members of the committee, i'm delighted to have -- oh, is my -- oh, you've muted me, okay. >> we can hear you. >> shall i start again? >> sure, go ahead. >> chair lofgren, ranking member davis, and members of the committee, thank you so much for this opportunity to discuss the challenges of privacy, privacy law, in a world without privacy. i have spent the last 43 years
of my life studying the rise of the digital as an economic force that's driving our transformation into an information civilization. over these last two decades, i've observed as the fledgling internet companies morphed into a surveillance-based economic order founded on the premise that privacy must fall and powered by economic operations that i have called surveillance capitalism. surveillance capitalism maintains core elements of traditional capitalism, market exchange, growth, profit. but these cannot be realized without the technologies and social relations of surveillance. hidden methods of observation secretly extract human experience, until recently considered private, and translate it into behavioral data.
these methods operate outside of human awareness, engineered to do so, robbing actors of the right to know and with it the right to combat. ill-gotten human data are then immediately claimed as corporate property, private property, available for aggregation, computation, prediction, targeting, modification, and sales. the theory of surveillance capitalism challenges this property claim and redefines it as theft. surveillance capitalism was invented at google during the financial emergency of the dot-com bust. it migrated to facebook, became the default model of the tech sector, and is now reordering diverse industries from insurance, retail, banking, and finance, to agriculture, automobiles, education, health care, and much, much more. as one tech executive recently described it to me, all software
design assumes that all data should be collected and most of this occurs without the user's knowledge. all roads to economic and social participation now lead through surveillance capitalism's institutional terrain, a condition that has only intensified during these two years of global plague. the abdication of these spaces to surveillance capitalism has become the meta crisis of every republic because it obstructs solutions to all the other crises for which we require information integrity and the sanctity of communications. the world's liberal democracies now confront a tragedy of the uncommons, as mission critical information and communication spaces that most people have assumed to be public are now owned, operated, and mediated by
private commercial interests for profit maximization while almost entirely unconstrained by public law. no democracy can survive these conditions. the deficit i describe here reflects a larger pattern. the united states and the world's liberal democracies have thus far failed to construct a coherent political vision of a digital century that advances democratic values, principles, and government. while the chinese, for example, in contrast, have focused on designing and deploying digital technologies in ways that advance their system of authoritarian rule. this failure left a void where democracy should be, leaving our citizens now to march naked into the third decade of surveillance capitalism without the rights,
laws, and institutions necessary for a democratic digital future. instead, we've stumbled into an accidental dystopia, a future that we did not and would not choose. in my view, we must not and cannot allow this to be our legacy. survey evidence, some of which has already been referred to here, now shows that americans are moving ahead of their lawmakers in a massive rupture of faith with these companies. we see extraordinary majorities calling for action to curb the unaccountable social power of these firms, a realization that the emperor of surveillance capitalism not only has no clothes, but is dangerous, and a clear sense that human rights, the durability of society, and
democracy itself, are on the line. what was undiscussible has become discussible. what was settled is now being questioned. to end, we are still in the very early days of our information civilization. this third decade is a crucial opportunity to build the foundations for a democratic digital century. democracy may be under siege, but it is, however paradoxically, the kind of siege that only democracy can end. thank you so much for your attention. >> thank you very much, professor. let me call now on ms. fitzgerald for her testimony. >> thank you, chair lofgren, ranking member davis, members of the committee. thank you for holding this important hearing and the opportunity to testify today. i'm caitriona fitzgerald, deputy
director at e.p.i.c. e.p.i.c. is an independent nonprofit research organization in washington, dc, established in 1994 to protect privacy, freedom of expression, and democratic values in the information age. for over 25 years e.p.i.c. has been a leading advocate for privacy in both public and private sectors. the united states faces a data privacy crisis. large and powerful technology companies invade our private lives, spy on our families, and gather the most intimate details about us for profit. these companies have more economic and political power than many countries and states. through a vast, opaque system of databases and algorithms, we are profiled and sorted into winners and losers based on data about our health, finances, location, race, and other personal information. and private companies are not the only problem. government agencies have also dramatically increased their
collection and use of personal data, often purchased or obtained from those same private companies while failing to address the significant risks to privacy and cybersecurity. the impact of this uncontrolled data collection and use by both private companies and the government is especially harmful for marginalized communities fostering systemic inequities. these industries and systems have gone unregulated for more than two decades and this is where it's left us. the system is broken. technology companies have too much power and individuals have too little. to restore the balance, we need comprehensive baseline privacy protections for every person in the united states, changes to the business models that have led to today's commercial surveillance systems, and limits on government access to personal data. most crucially, we need strong enforcement of privacy protection. in my written statement i go into detail about the crises we
face and the elements of a strong privacy law, but i want to focus here on the importance of enforcement. without enforcement, many businesses will simply ignore privacy laws and accept the small risk of an enforcement action as a cost of doing business, as we've seen with privacy laws enacted in europe and in several states. strong enforcement must include a private right of action. this is not new. congress included private rights of action in the cable communications policy act, the video privacy protection act, the fair credit reporting act, and the drivers privacy protection act. the damages are not large in individual cases but they really provide an incentive for companies to comply with privacy laws. strong enforcement also requires congress to establish an independent data protection agency or a dpa. the united states is one of the few democracies in the world that does not have a data protection agency. u.s. companies are leaders in technology and the u.s. government should be a leader in technology policy.
when you think about the prevalence of technology in our lives and economy, it seems obvious we should have a federal agency dedicated to overseeing and regulating it. a dpa could protect privacy by setting rules that help level the playing field for smaller technology companies who currently struggle to compete with tech giants. a dpa could be a central authority within the federal government on privacy issues. we saw a glaring example of agency failure to properly consider privacy risks of data collection just this month. the irs sparked outrage after reports surfaced that it had contracted with a third party vendor to require taxpayers to submit to facial recognition in order to access tax records online. thankfully, pressure from advocates and members of congress caused the irs to backtrack, but the contract should never have been signed in the first place. a dpa could ensure privacy is
carefully considered when agencies are weighing contracts like this. a recent data for progress poll showed 78% of americans across the political spectrum support establishing a data protection agency. the good news is that congress and indeed this committee has a strong bill before it that would restore online privacy for americans. filed by chair lofgren and representative eshoo, it would establish strong enforcement mechanisms via a private right of action and the creation of a u.s. data privacy agency. e.p.i.c. recommends swift action. it's time for congress to act. thank you for the opportunity to testify today. >> thank you very much, ms. fitzgerald. we'll turn now to you, mr. erwin, for your testimony. >> thank you, chairperson
lofgren, ranking member davis and members of the committee. thank you for the opportunity to testify today. i'm the chief security officer at mozilla, a nonprofit organization and open source community. our products include the firefox browser which is used by hundreds of millions of people around the world. mozilla is constantly investing in security and privacy and advancing movements to build a healthier internet. in our view, privacy online is a mess today. users are stuck in a cycle where their data is collected without their knowledge or understanding. that data is used to manipulate them in ways that could be actively harmful. today i'll talk about what mozilla is doing to address this problem and what congress needs to do. we've been driving major initiatives such as the standardization of tls 1.3 and let's encrypt, the free nonprofit
certificate authority. this has helped increase encrypted web traffic to 85 to 90%. this is protecting your browsing activity and information from attackers in the middle of a network who would otherwise intercept that information. the second big area of focus for us has been on eliminating what we call cross-site tracking. these are parties following you around from website to website as you browse, collecting information about your browsing activity and building a profile about you. in 2019 we turned on what we call enhanced tracking protection in the browser. we turned that on by default because we believe the onus shouldn't be on consumers to protect themselves from opaque risks they can't see or understand. so that's just a little bit of what we've been doing to try to address this problem. i want to talk a little bit about what we think congress needs to do. first, focusing on baseline federal privacy legislation, then i'll talk a little bit about transparency and online harms. first, in our view, technical
privacy protections from companies and baseline regulations are complementary and necessary, neither alone is sufficient. the internet was not designed with privacy and security in mind which is why technical solutions like the ones that we offer are necessary to create a less permissive environment overall. at the same time we can't solve every privacy problem with a technical fix. there is a central role for regulation here as well. to give you just one example we know that dark patterns are pervasive across the sites and applications that people use. unfortunately there is little that a browser can do when a person visits a website directly and is deceived into handing over their information without proper consent or understanding. this is where law plus a robust enforcement regime has to step in. the online privacy act would prohibit dark patterns. the u.s. must enact baseline
privacy protections to make sure the public and private actors treat consumers fairly. we need clear rules of the road for entities using data, strong privacy rights for people interacting with those entities, and effective authorities, agencies with the authority to take enforcement action. second, it's important to complement federal privacy legislation with solutions that provide direct transparency and accountability into online harms. many of the harms that we see today are a direct result of the pervasive data collection happening. we know from disclosures that things like recommendation systems and targeting systems can have really pernicious effects if abused. these systems are really powered by people's data. it is easier to discriminate against people, manipulate people, or deceive people if you know more about them. this is why privacy and some of these online harms are inherently [ inaudible ]. the problem is this harm is mostly hidden from the public and regulators. that's why we have called for things like an establishment of
a safe harbor to protect researchers doing investigations into the activities of major online platforms. this would protect research into the public interest, help us better understand the harm happening on these platforms. there's a real public benefit that can be had here by such a safe harbor. similarly, we've advocated for more robust regimes for governing online ads. we've been leading the push for ad disclosure in the european union and are encouraged by recent proposals in congress that would require disclosure of ads for public benefit and understanding. these approaches would provide transparency into the opaque world of online advertising and could be done without creating privacy risks. in conclusion, at mozilla we seek to advance privacy technology and to ensure that privacy considerations are at the forefront of policymakers' minds in considering how to protect consumers and grow the economy. we appreciate the committee's focus on these issues and look forward to the discussion. >> thank you, mr. erwin.
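the enhanced tracking protection mr. erwin describes earlier in his testimony boils down to a decision made at request time: is this request going to a third party that appears on a known tracker list? the sketch below is a loose illustration under assumed names, not firefox's implementation, which relies on curated blocklists and many additional heuristics.

```python
from urllib.parse import urlparse

# Hypothetical blocklist for illustration only; real browsers use
# curated, regularly updated lists of tracker domains.
KNOWN_TRACKERS = {"tracker.example", "ads.example"}

def is_third_party(page_url: str, request_url: str) -> bool:
    """True when a request goes to a different host than the page itself."""
    return urlparse(page_url).hostname != urlparse(request_url).hostname

def should_block_cookies(page_url: str, request_url: str) -> bool:
    """Block cookies for third-party requests to known tracker domains."""
    host = urlparse(request_url).hostname or ""
    return is_third_party(page_url, request_url) and host in KNOWN_TRACKERS
```

the key design point erwin raises is that this check runs by default, so the user never has to see or understand the tracker in order to be protected from it.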
and now we turn to our last but certainly not our least witness, mr. castro, for your testimony. >> thank you, chairperson lofgren, ranking member davis and members of the committee, i appreciate the invitation to speak with you today. u.s. data privacy is at a crossroads. many consumers are justifiably frustrated by the frequency with which they learn about new data breaches and the seeming lack of accountability for those who misuse their personal information. at the same time many firms are overwhelmed by a tsunami of new data protection obligations and growing restrictions on how they can use personal information for legitimate business purposes. and both are often confused by the multitude of ever-changing laws and regulations. three states, california, virginia, and colorado, have passed comprehensive data privacy legislation and many other states are considering it. over the past three years, 34 state legislatures have introduced a total of 72 bills
with more coming. these new state privacy laws can impose significant costs on businesses, both direct compliance costs and decreases in productivity, and undermine their ability to responsibly use data to innovate and deliver value to consumers. moreover these laws create high costs not just for in-state businesses but for out of state businesses that can find themselves subject to duplicative laws. california's law will likely cost $78 billion annually. in the absence of federal data privacy legislation, the growing patchwork of state privacy laws could impose out of state costs between $98 billion and $112 billion annually. over a ten-year period the costs would exceed $1 trillion. the burden on small businesses would be substantial with u.s. small businesses bearing $20 to $23 billion annually. the united states needs a new
federal data privacy law. it should not back away from the light touch approach it has historically taken to regulating the digital economy. instead, it should focus on the following goals. first, data privacy legislation should establish basic consumer data rights. for example, congress should give individuals the right to know how organizations collect and use their personal data or when their information has been part of a data breach. congress should also give consumers the right to access, delete, or rectify certain data in certain contexts. for example, consumers should have a right to obtain a copy of their health or financial data and move it to a competing service. second, lawmakers should establish uniform privacy rules for the entire nation by preempting state and local privacy laws. consumers should have the same protections regardless of where they live and companies should not be faced with 50 different sets of laws and regulations. a patchwork of state laws with varying definitions and standards creates a complex regulatory minefield for businesses to navigate,
2:50 am
especially if potential violations risk costly litigation. third, there should be robust enforcement of the federal privacy law. congress should rely on federal and state regulators to hold violators accountable. to avoid unnecessary litigation, businesses should have a reasonable period of time to address a violation without penalty in cases with no demonstrable consumer harm, such as a 60-day notice-to-cure period. fourth, congress should set a goal of repealing and replacing potentially duplicative and contradictory laws. the u.s. code is littered with privacy statutes, from major ones to narrow ones. each one has its own set of definitions and rules to comply with. finally, federal data privacy legislation should minimize the impact on innovation. to that end, congress should not
2:51 am
include data minimization or privacy-by-design requirements, because these provisions can reduce access to data, limit data sharing, and constrain its use, thereby limiting beneficial innovation. congress has the benefit of hindsight to avoid some of the problems found in other data privacy laws, particularly europe's gdpr. the gdpr has imposed a massive compliance cost on businesses not only in the eu but around the world; for example, fortune 500 companies have spent nearly $8 billion to comply with the gdpr. the eu's own survey shows the law has had virtually no impact on consumer trust in the internet. some of the most expensive requirements, like appointing a data protection officer, serve no valuable business function. policymakers should also be aware of the unintended consequences of poorly crafted legislation, such as not considering the implications of the law on emerging technologies like artificial intelligence and blockchain.
2:52 am
finally, congress should note that the primary purpose of the gdpr was to harmonize laws across eu member states. ironically, u.s. states are now creating the exact type of fragmentation that the eu created the gdpr to solve. therefore it's essential for congress to act swiftly to pass comprehensive privacy legislation that establishes basic consumer data rights. thank you for the opportunity to be here today. i look forward to any questions. >> thank you very much, mr. castro, and to all of our witnesses. we'll go now to members of the committee who have questions. i'll turn first to our ranking member, mr. davis. >> thank you, madam chair, thanks again to our witnesses. i enjoyed your testimony. mr. castro, let me start with you. there have been proposals introduced in congress that, if implemented, would create a private right of action for individuals and a collective right of action for certain nonprofits to sue on behalf of individuals. is this the best approach to
2:53 am
ensure protection and accountability for a federal data privacy framework? >> no, i don't think it is. right now we have a very effective regulator, and we have state attorneys general that can take up any slack. what we've seen in states that have implemented these private rights of action, like illinois with its biometric information privacy act, is that once a court ruled that you could bring lawsuits for cases where there is no actual harm to consumers, we saw a flood of lawsuits, and these lawsuits are being driven by class action lawyers who are just going after the attorney fees. this has led to very significant fines without payoff for consumers. in the past year we've seen over $100 million in settlements from companies like facebook, adp, walmart, six flags, wendy's, tiktok, and the list goes on. and these aren't instances, again, where we've seen any consumer harm. it was pure technical
2:54 am
violations. that's not an effective use of dollars that could be going back into investing in technology and processes that would actually protect consumers' privacy, make it more secure, and protect them from harm. so a private right of action, to me, is just taking money out that could be used on actual protections for consumers. >> thank you, mr. castro. i'm sure you've read the media reports about special counsel john durham's allegations that democrat political attorneys, perhaps marc elias specifically, infiltrated servers at the trump white house. would a federal privacy law have protected president trump from data hacking attempts? >> it depends. certainly privacy legislation would restrict what companies might keep on hand and what they might share. but at the end of the day, what you really want to see if you want better security is better transparency, so consumers know what they're getting, in this
2:55 am
case the trump organization, trump campaign, would know what kind of security and privacy controls they're getting when they purchase a service, and they'll know how their data can be shared. getting more transparency is really what we should be looking for in the future. >> i agree, transparency matters, but so do penalties for bad actors if the allegations are borne out. do you know if the ftc would be able to impose penalties if the allegations are borne out? >> the ftc can impose penalties on companies for violations. particularly, the way this works right now, most companies have statements in their privacy policies or terms of service saying we respect your privacy and will secure your data. when they do not do that, that's misrepresentation, and the ftc can and has gone after companies for that. we can certainly consider giving the ftc more authority and, if we increase its budget, saying exactly
2:56 am
what to do with that additional money; i think that's the best way to go forward. what we don't want to do is give the ftc too much latitude, though, doing its own rulemaking that goes outside of congressional intent, because it becomes very risky that the ftc may decide they want to start deciding how companies design websites or mobile apps, all under the name of maybe ensuring consumers aren't manipulated, but on the other hand the ftc isn't really made up of engineers and user experience designers. >> very good points, but an agency already exists to address these concerns. the online privacy act of 2021, a bill referred secondarily to this committee, calls for the creation of a new agency called the digital privacy agency. this is not only an example of congress shifting authority to the executive branch but seems like an extreme
2:57 am
waste of resources given the existing structure already established in the federal trade commission and with the state attorneys general. mr. castro, do you believe a new federal agency is necessary to accomplish the goal of data privacy? >> i think the ftc has the authority it needs and the reputation and experience to do this job. i don't think we need a new agency to solve that problem. i'll also mention, when we think about protecting consumer privacy, it overlaps with fraud and security, and that is something the ftc should be focused on. i think if we split up that mission we weaken protections for consumers. >> thank you for joining us, and to all the witnesses, i yield back the rest of my time, madam chair. >> thank you. mr. raskin, you are now recognized for five minutes. >> thank you, madam chair. ms. fitzgerald, i was eager to ask you the question about how the aggregation and processing of
2:58 am
data in large quantities can pose risks that might not be apparent just talking about a single piece of discrete information, and what can be done about that. >> thank you, representative raskin. i think the lesson we can take from all of these data points being combined about us is, one, it shows how important it is to have a comprehensive privacy law that covers personal data generally rather than the sectoral laws, hipaa, et cetera; the acronyms go on. these combined personal data also show why individualized consent doesn't work; it's just not possible for the vast majority of internet users to grasp. they may have consented to the website using their data, but are they consenting to it being passed on to
2:59 am
thousands of brokers they've never heard of? we heard of this when a news outlet got location data from the dating app grindr. i know you and members of the committee have supported rulemaking for protecting this data, and that could be integrated into privacy legislation. >> thank you so much. ms. zuboff, i've read your book about surveillance capitalism, and i know it's hard to synthesize an entire book, but who is big brother today? is big brother the state as we traditionally conceived it, or is big brother the corporate sector, or some combination thereof? >> well, thank you so much, senator raskin, i'm thrilled and flattered that you read my book
3:00 am
so thank you for that as well. as you know, because you read the book, i develop a concept that i call "big other." big brother was the notion of a totalitarian state which had a very specific ideology and wanted people to believe certain things and talk in certain ways and act in certain ways. what we have with surveillance capitalism is not a totalitarian state, and surveillance capitalism really doesn't care what you think. it doesn't care what you believe. it doesn't care how you act. what it does care about is that you think and believe and act in ways that allow it to capture the data, to do the aggregation that you are asking about, because with that aggregation comes the possibility of computation,
3:01 am
applying artificial intelligence, coming up with predictive algorithms and targeting methodologies. with those targeting methodologies, various functions are achieved, including increasing engagement, which is a euphemism for increasing the footprint for greater data extraction. and also, as we increase engagement, we use our knowledge from that deep computation and ai for things like subliminal cues, psychological microtargeting, and engineered social comparison dynamics. so we really know a lot about you, and that way we can use messaging, not only the content of messaging but the forms of messaging, to actually begin to
3:02 am
modify your behavior, and do that in a way that is consistent with the commercial objectives of the company and its business customers. >> so -- >> so at the end of the day, this big other is, sorry, i'll just wrap it up with the notion that this big other is ubiquitous, it's pervasive, it's not totalitarian, but it is full-on control and full-on opacity, full-on power. >> well, we haven't yet enacted a comprehensive federal privacy statute to address the rise of the big other, but a lot of states and some other countries have attempted to do so. i wonder if you can just tell us what you think have been the best lessons and the best efforts at a comprehensive privacy framework to protect us individually, but also to protect the people's possession of their own interests and characteristics
3:03 am
and their own behavior. >> well, the gold standard right now, and we could not have said this a few weeks ago, is the digital services act and the digital markets act in the eu. these are, you know, not the total solution; we have very specific things to accomplish ahead of us. but as far as setting a new standard, defining a new frontier for the kinds of rights, i want to use the word "users," and i try to do it with air quotes, because users is all of us; "user" is a kind of synonym for humanity at this point. these pieces of legislation, they're ambitious, comprehensive, they have the buy-in of the european parliament now and the member states, with a few details left to be worked out in the spring, but these set a new standard for the
3:04 am
rights of users and the enforcement powers of governments to really, for the first time, put democratic governance back in control of our information and communication spaces, to work with the private sector rather than have the private sector be what it has been for the past two decades, a completely autonomous force, essentially unimpeded by relevant law. and all of the social harms we've been describing this afternoon are the result of that situation. >> thank you so much. the gentleman's time has expired. we turn to the gentleman from georgia. >> thank you, madam chair. this is a very interesting topic we've been dealing with, and one i've been talking about for quite some
3:05 am
time. mr. castro, one of the issues i've seen from my perspective is the privacy laws we have across the country in the states. you mentioned in your testimony 34 state legislatures that introduced bills in this space from 2018 to 2021, and i want to get your thoughts on the importance of having a uniform federal standard for data privacy that, you know, would go across the board. >> i really appreciate that question, thank you so much. when you think about, especially, the smaller companies, people often think about, you know, facebook and google, but these laws impact everyone. they impact the local florist, barber shop,
3:06 am
grocery stores, and complying with those laws can be very expensive and difficult, because even though at the end of the day most of these laws are basically saying the same thing, you know, they all say it a little differently, and following all of the laws so you're not in violation is hard. there can be some technical violation if you, for example, do not put a certain term in the privacy policy on your website, and you may be in violation of california's law. all of these obligations add costs for businesses, and these businesses, you know, instead of hiring another engineer or, you know, giving a raise to their workers, they have to go out and hire a privacy lawyer. and a lot of privacy lawyers really like privacy laws because it's good business for them. the issue here is we don't need more laws,
3:07 am
we need one good law, and i'm glad we're having this conversation. i think we can get to that good law, hopefully sooner rather than later. >> so what kind of effect does this have on american competitiveness in the international marketplace? >> it's very significant right now. when we think about it, you know, a lot of u.s. companies are doing a very good job of protecting privacy, better than companies, for example, in china. but, you know, because the u.s. does not have a federal law and we've taken a sectoral approach, and other countries have not taken that sectoral approach, a lot of countries will say the u.s. doesn't care about privacy. i don't think that's true. i don't think that's true when we hear about the ftc defending the work they're doing, and i think some of the state laws are being effective. so i do think it's fair to say the u.s. is doing
3:08 am
good work on privacy. but we don't have a national privacy law, so that is used against us, so that a u.s. company does not get a particular contract. we're seeing that used now, in part, because european lawmakers are saying the u.s. isn't doing enough to protect consumer privacy. so i think one of the best ways we can, you know, promote u.s. competitiveness, not just in the tech sector but across the board, is to create this federal law so we can show the u.s. is taking this seriously. >> i think everybody on this committee is committed to that, and i think most individuals recognize the importance of protecting personal data and an individual's privacy; i definitely do as well, and i know you do. but with that being said, you mentioned in your testimony a concern that i share as well, which is the compliance cost to small business of state privacy laws. is there a state you can point to that successfully executed a data privacy law that balances
3:09 am
those concerns? >> i think virginia is probably the closest we've gotten so far; you know, they're trying to strike the right balance. the problem, when you see these laws in the state legislatures, is you're not seeing the costs borne by states outside their borders. so with each new state on this path, you're putting up laws that force companies to hire lawyers again, redo privacy policies, and make changes to their systems, all the same compliance without discernible improvement in privacy. so if the goal is consumer privacy, let's address this top-down rather than have 50 new sets of rules on how to do it and so much consumer confusion about what's going on and who is protecting your data. >> okay, i agree. i see my time is quickly running out, madam chair, i yield back. >> mr. aguilar is recognized. >> thank you, madam chair, i
3:10 am
want to talk with you a little bit about what mozilla has learned from its own experience building privacy-protective tools. there may be relevance for the public sector and the government agencies we have oversight of, to handle and minimize risk for their own operations. what are your thoughts on what you've learned and what is applicable to the public space? >> so i think there are two important ways to approach this question. first is to think about our core privacy practices and the data we'll collect from consumers, and how we're going to be really transparent and provide them with control of that. we really work hard at what we refer to as data optimization practices, really only building what we need for our business and to create a good product for users, and that
3:11 am
isn't followed expansively across our industry, and that's one of the underlying problems we need to see fixed. and once we have that data, we work to protect it pretty aggressively. i think this idea of data minimization and security controls on the back end should really apply to the public and private sector. we also work really hard to build, as i mentioned earlier, these privacy features into the browser to protect users on the web: encryption, blocking tracking in the browser. the interesting experience we have there is that it can be a little bit disruptive for the industry, in that right now we have a really permissive internet, and a large number of parties depend on that permissiveness to build on data practices and fund business. that's a problem, so some amount of disruption is healthy. we don't always get a positive response when we roll out privacy features, and i think that's a signal we're going in
3:12 am
the right direction, but it certainly creates headwinds. >> i understand; sometimes being at the forefront, you take on a lot of folks with different viewpoints as well. ms. fitzgerald, what are some of the best ways we can limit the risks and harms of the aggregation of personal information over time from multiple different sources that we know exist? >> thank you, representative. i think the point that's been talked about a few times, especially in the first panel, is that if private companies treated our data with the kind of care that's been described, we would be in much better shape. the point was made, if you don't have it, you don't have to protect it. so the obligation should be on the companies to minimize the amount of data they're collecting, you know, and to delete it when they don't need it anymore. that protects users from having information passed on, and in the case of a data breach.
3:13 am
>> thanks, appreciate that. back to you again: what are some of the privacy risks and harms we have that may not have, you know, a primarily technological solution? i guess what i'm trying to ask about is, you know, issues that instead require changes to business practices or laws. we're trying to balance the regulatory side here, and i wonder your thoughts on how to balance those pieces. >> yeah, so we bump into this problem all the time in our work on the browser. the challenges we can solve: you visit a website, for example, and there's a hidden third party on the website that the user doesn't know about and that is collecting data on them. that's a third party relationship, and we can do something in the browser, because there's a network request that calls out to that third party, so we can just block that. that's typically the role of the
3:14 am
browser, or the operating system can jump in and say we'll prevent third parties from tracking your behavior. what we can't do as much of is when the user has a direct, what we refer to as a first party, relationship with the website or business, and that comes down to a question of business practices, not technical problems. there's little we can do to adjudicate the relationship between a user who decided to engage directly with a first party business when the business isn't upstanding; the user may not know about it, and that's where law and regulation need to step in, where technical fixes don't get us what we need. >> appreciate that, i yield back. >> the gentleman yields back. mr. spouser is recognized. >> thank you. i want to shift to you if i can, ms. fitzgerald, on this topic. in particular, i was reading your testimony and you referenced
3:15 am
biometric data is particularly sensitive and deserves stricter regulation, correct? >> yes, absolutely. >> and in particular, you raised concerns about how healthcare data is used, is that correct? i think a lot of people think hipaa is protecting healthcare data with respect to health apps; that's not correct, would you agree? >> right. hipaa only applies to the relationship between a doctor and a health insurance company and certain health exchanges; it doesn't cover health data in apps. >> yeah, i think that's an interesting distinction that a lot of americans might find interesting, building on what was previously discussed with mr. irwin, these third party apps and relationships that exist on certain websites and the impact that could have. so congress recently passed the 21st century cures act to protect
3:16 am
patient data, or rather to require patient data to move from the doctor's office into healthcare apps of the patient's choosing. there are a lot of reasons it's a good thing, but there are things we want to make sure we get right, because you want to be able to make sure patients control their data but also ensure they don't lose control of their data by sending it to a third party that they might not be aware of, one with an interest in healthcare data, biometric data, or other data. do you think the risk might be greater if individuals are sending information to some of these types of apps under the rules and regulations of the 21st century cures act? >> the reason biometric data is so sensitive is we can't change it. we can't change a fingerprint. >> i totally agree, great topic;
3:17 am
let's shift and talk about medical data. does that create unique concerns? >> yes, because of the ways it can be used against us; think of insurance companies using it to determine our rates, or even an employer possibly not wanting to hire someone with a health condition if the employer is paying the health insurance premium. >> i appreciate the commentary; i think it's an area to really keep an eye on. i think there are positive things, you know, we want patients to control their data, but at the same time we want to make sure that patients are protected, in particular where there might be a third party player involved and the patient may not fully appreciate the risk they're taking in sharing their data. something we should really spend some time and think about. let me shift gears if i can from ms. fitzgerald. mr. castro, there are a lot of privacy laws on the books; for government
3:18 am
institutions and for companies, federal law requires companies to explain their information sharing practices and safeguard some of their data. before passing new laws, i think there's a real opportunity for congress to look at how current privacy laws are working and how they interact with each other so that we're not duplicating regulations. could you shine some light on what we have seen since the passage of the gdpr in the eu from that sweeping legislation? has it stifled innovation or grown innovation? what big takeaways should we take from the eu's work? >> in many areas it has certainly stifled innovation. when you look at the costs, the impact it's had on the start-up economy, there are many areas where the data points show european businesses are suffering. >> so, only because we've got a
3:19 am
little bit of time, if we shift gears slightly and say let's jump over to california, california's consumer privacy act, what lessons can we take away from the state of california? >> one of the biggest problems with california is we've seen hundreds of lawsuits related to enforcement. that's been a problem. one of the biggest positives of it is the idea of a 30-day cure, which i think could be extended nationally, where the goal is not to penalize mistakes but to take corrective action. >> i think what's really important is we've got a lot of laws on the books, state laws and the eu's, and things like what the financial services committee talks about with financial services apps, and i think it's critical we're looking at all of this legislation, taking lessons from it, so that we help consumers protect their data but also maintain innovation here in america. thank you, chairwoman, and i
3:20 am
yield back. >> ms. scalin is recognized. >> thank you, madam chair. i guess i'd like to pull on the thread a little more, since we certainly do hear quite a bit about this on the company side and the agency side: how do they navigate the different laws between states but also countries, as we're well aware the internet does not have national boundaries in the same way. ms. fitzgerald, could you talk about that? what are the most important privacy rights you've seen that you think congress should work on adopting? >> yeah, the most important thing, i think, when thinking about privacy legislation is you can't just be telling people what companies are collecting about them. we need to put obligations on the companies collecting data to limit that collection, to limit what they're doing after they collect it, and to have them delete
3:21 am
it when they're done using it. >> in terms of models, do you have suggestions about which are the best for us to look at, whether in the states or the eu, which has been mentioned? >> sure. so i think, you know, among state models, california has the strongest right now in the united states, the ccpa; california is the strongest model we've seen, but that still is limited. it only allows users to opt out of the sale of their personal data; it doesn't do as much to put obligations on companies to minimize the data they're collecting. you know, the online privacy act, by chair lofgren, is a great model that includes data minimization and also includes provisions for protecting against discriminatory uses of data; that's really important. and you also want to make sure you're requiring algorithmic fairness
3:22 am
and accountability. we don't want to make the same mistakes with ai that we've made with data collection. this is the next, you know, wave of technology; let's get ahead of it, set the rules now, and encourage companies to innovate around privacy and autonomy. >> okay. i mean, one of the typical, i don't know, tools that people talk about with the privacy laws is the notion of consent, and i gather that you're a little bit skeptical about the usefulness of consent as a focus of privacy protections. can you talk about that a little bit? >> yeah, so there's just no way for an individual to understand kind of the web that their data gets passed through if they're consenting. you know, if you're just hitting that consent button to make the banner go away on a website, you don't realize that,
3:23 am
sure, there are no rules. >> certainly, we've seen that to get access, you either decide to click or decide not to. so with respect to discriminatory data uses, how do we address that? like, your top-line recommendations there. >> groups such as color of change, the committee for civil rights, and the leadership conference have great proposals on what we need to do to make sure that civil rights are protected online, you know, making sure that public accommodations are protected and, you know, that that extends to online spaces. you need to make sure that the algorithms that companies are using are not being used in ways that deprive people of opportunities like, you know,
3:24 am
ads for housing and ads for jobs, and that our personal data is not being used to show those in discriminatory ways. >> okay. thank you, madam chair, i yield back. >> the gentlelady from new mexico is now recognized. >> thank you so much, madam chair. as we heard today, our personal data is extremely valuable. we take a quiz on facebook or look up directions, and companies collect our information, sell our information without consent to a third party, and then use our information for a range of activities, from showing an ad for clothing to sometimes more dangerous ones, like targeting disinformation to manipulate our behaviors or beliefs. and it's all invisible
3:25 am
to the true owners of the data. i want to focus on how data privacy, or the lack thereof, impacts democracy and elections. recently, our committee held a round table on the creation and dissemination of disinformation on elections and the pandemic, and how this disinformation targets spanish-speaking communities. we learned how the disinformation is then exported to other states, including new mexico, and then amplified based on this data harvesting we heard about today and read about in this testimony. ms. fitzgerald, you noted that actors can weaponize our data to undermine election integrity and democratic institutions. in practice, what does this look like? >> sure, thank you, representative. just as
3:26 am
advertising companies use profiles to manipulate us into purchases, so too can they manipulate our views by filtering the content we see. the markup ran a piece after january 6th showing the two different facebooks that republicans and democrats were seeing; it's just a really stark view of, you know, kind of the different information that both sides are seeing and how that shapes your views. even if someone doesn't click through to an article, just seeing that headline, having it in the back of their head as they're voting, these companies are kind of able to manipulate our political views by determining what we see. >> ms. zuboff, i really appreciated your written remarks about how democracy is simultaneously the only legitimate authority capable of halting surveillance capitalism but also a prime target, so i
3:27 am
appreciate that and welcome you expanding on it. but first, mr. castro said a private right of action is unnecessary and that we should focus solely on transparency. do you agree, and if not, why? >> you need to unmute, professor. >> sorry. i don't agree with that. i think the private right of action is very important. when we pass a law, whether it's the california law or any other law, that's only the beginning. what has to happen is these laws
3:28 am
are tested and developed, and as a society we learn what they can mean for us and how they're going to protect us in society. so what the private right of action does is it creates the opportunity not only for individuals but for groups of people, for collectives, to really bring issues into the judicial system, to have those issues explored, and to create precedents. and this is, you know, this is what's called the life of the law, how the law evolves and how we can move forward into this century, not just with statutes that are frozen in time but with laws that are evolving according to what justice brandeis once called, you know, the eternal youth of the law, because we have these kinds of
3:29 am
iterative processes. so this -- >> yes, and i wanted to -- i completely agree. our civil rights laws would have had no effect if we hadn't had the private right of action. and i did want to ask you, though, to elaborate, in the few seconds left, with regard to what you believe is most important to protect our democracy in an online privacy act. >> the first thing is, we notice so much of our discussion has been about how we minimize data or how the data is used and things like that. once we start a discussion talking about data, we have already lost primary ground. the key thing is that we are all now targets for massive-scale secret data collection, much of which should not be collected in the first place. the decision should lie, the decision rights should lie, with the individual. do i want to share this information about the cancer that runs in my family, and do i want to share that with a
3:30 am
research operation or federation of researchers that are going to make progress on this disease and help the world? maybe so. but do i want google or facebook or any other company to just be inferring information about my cancer from my searching and my browsing? absolutely not. so we need to reestablish decision rights, as justice douglas offered in 1967: the idea that every individual should have the freedom to select what about their experience is shared and what remains private. these decision rights are the cause of which privacy is the effect. we need to establish now, finally, in law, juridical rights that give us the choice of what is shared and what remains private. then we've got a whole new ball game, where all kinds of innovators like mozilla and all
3:31 am
the mozillas that are waiting to come on stream, not just in search and browsing, obviously, but all kinds of businesses in every sector. they're all waiting to get in the game. >> thank you, and i know we have gone over our time. thank you, madam chair, and i yield back. >> the gentlelady yields back. i just have a few final questions. first, thanks to all of the witnesses for really very interesting, excellent and enlightening testimony. you know, i do wonder, mr. irwin, in your testimony, you refer to dark patterns as one of several privacy abuses that demand regulation, that technology alone couldn't adequately address. can you explain in further detail, for the committee and for those watching the hearing, what dark patterns are, what are some of the most troubling examples
3:32 am
of dark patterns, and what is the best legal approach to reining them in? >> yeah, so dark pattern is a sort of broad umbrella term used to refer to user experiences that can deceive a user into opting in to data collection, or consenting to data collection without being explicit about what is really happening under the hood. a dark pattern might be just text intended to deceive the user, or it might be that you've got to click through five things to opt out of the data collection. all of those are typically, sometimes, referred to as dark patterns. we see those everywhere across the web, and as i mentioned, that's the kind of area where the browser can't do much, when the user is actively engaging with the first party. so i think we really do need to see regulators
3:33 am
actually acting to say here's what the standards should be, here's a deceptive practice and here's a sound practice. that's where i think we really do need to see regulatory measures and standards, instead of design alone. >> yeah. you know, we talked a lot about the rights of individuals to privacy and, you know, some of the disinformation, digital addiction, harmful impacts on children on certain internet platforms, and whether there should be a private right of action, but i think the core issue really is, as mr. halberd said, if you don't have the data, you can't use it to manipulate. and it's my view that the victims of manipulation, whether it's for political, societal, cultural or commercial purposes, are not just the
3:34 am
individuals being manipulated but the public at large. if you are manipulating for a commercial purpose, and you are a big tech company that has got all this data, you know, you're at an unfair advantage to smaller companies that have not acquired all of that data. if you're manipulating for a cultural or societal purpose, to move society in one direction or the other, the public is also a victim of that. individuals have been removed from their agency, and are less free, because a private corporation is manipulating them through their own data. so let me ask you, ms. fitzgerald. isn't the crux of this really to restrain the collection and retention of data? >> yes, absolutely. it's not enough for users just to know what companies are
3:35 am
collecting about us; it has to be restrained. obligations have to be on the data collectors to limit what they're collecting. >> i also want to mention briefly, and maybe any of the witnesses can speak to this, or ms. fitzgerald may have studied it in your nonprofit: i'm a fan of the ftc, but i have been advised by many who have served there that the sheer number of technically savvy individuals is fairly minimal compared to the actual army of, you know, digital engineers and software-savvy people in these large companies, and that frankly, they're no match for the gigantic companies with the computer and digital expertise those companies have, which is why we're looking at how to better arm a regulatory agency
3:36 am
to actually be successful in facing off with these gigantic corporations. is that assessment of the ftc's capacity off-base, ms. fitzgerald, in your view? >> absolutely, i mean, look, we're encouraged by recent actions of the ftc on privacy, but the ftc has limited resources and an incredibly broad mandate, from antitrust to horse racing safety. you know, the task of data protection is best done by a specialized independent regulator. when you think about the outsized presence of technology in our lives and economy, i think this is something where, 20 years down the line, no one will question why we have a data protection agency, just as now we don't question why we have an faa or epa. >> let me just ask a question of you, mr. irwin. in your written testimony you
3:37 am
state, quote, it is important that competition concerns not be a pretext to prevent better privacy for everyone. can you explain that tension and how congress might strike the right balance on that? >> yeah, this is something we see fairly often. the basic challenge we have here now is that the internet was designed in a very permissive way that allows a large number of parties to collect data, not only the larger parties but also the smaller ones, and closing some of these privacy gaps sometimes means denying data to the large parties and small ones. sometimes we hear the argument that we shouldn't close the privacy gap because that could have competition implications, the idea that we should leave the internet more permissive in order to protect some set of business models. you know, what we want to see is an overall more protective platform, with an even playing field, that all big and
3:38 am
small companies can get on. that's the approach we are in favor of, and we will push back aggressively on any suggestion which leads to privacy holes in the browser or operating systems in order to protect some business model competing against big tech. >> right. i'll close with this. you know, i live in california. the california law has not really stopped the kind of collection i think its authors intended. i will say, however, that despite some predicted decline due to regulation, the formation of businesses in the tech sector is at an all-time high. business is good, actually; job creation in silicon valley is carrying the entire state in terms of job creation. so it has not had these horrible impacts on the tech sector. i'm mindful that if we constrain
3:39 am
the collection and retention of data by companies, especially in the internet space, it will require a change in their business models, and i think that's necessary, because right now the propensity and capacity to manipulate every person in america is unacceptably high. so i wonder, professor zuboff, you talked in your recent new york times essay about how democracies are confronting the tragedy of the uncommons and how we need to get back to protecting our society, for lack of a better term. can you expand: what is the most important thing to accomplish
3:40 am
that goal, in your judgment? >> well, you know, the digital century opened with great promise. it was supposed to be the democratization of knowledge. i'm not ready to let that dream die. the problem is we have gone down this kind of accidental path. in my written testimony, i give a lot of background as to how that happened: a certain market ideology, certain national security concerns, the accident of a financial emergency in silicon valley, and how surveillance capitalism was invented in the early 2000s to get google over the hurdle when investors threatened to pull out. a bunch of accidents, and that's why i call where we are an accidental utopia. but what we really have is an opportunity
3:41 am
for the digital century to be a century where data is being used to further the needs of society, to solve our core, most important problems, and where data collection is being used in ways that are most aligned with what we consider in the digital century to be necessary fundamental rights. this is work that we have not yet undertaken; we have got to figure it out. it would be like living through the 20th century without having tackled workers' rights and the institutions to oversee all of that. we created all of that in the 20th century. we're in a new century, with new material conditions, a new kind of capitalism; we have got to do that kind of fundamental work all over again. and there is, right now,
3:42 am
this in-depth census of how artificial intelligence is developing around the world, the ecosystem of artificial intelligence, and what it shows very plainly is that the five big tech companies own almost all of the artificial intelligence scientists, the science, the data, the machinery, the computers. everything pertaining to artificial intelligence is concentrated in a small handful of companies, and all that knowledge and capability and material is going to work to solve the commercial problems of these companies and their business customers. surveillance capitalism. it's not being used for society. so not only is agency being subtracted from individual life, but the benefits of the digital are being sequestered from the life
3:43 am
of our publics, our societies, our democracies. we have the opportunity to get that back and to get us back on track to a digital century that fulfills its promise of knowledge democratization and really solving the problems that face us. that's what it's all about. >> thank you so much, professor, and thanks to all of our witnesses. as i mentioned, the record of this hearing will be held open at least five days. the committee may have additional questions for you, and if so, we will send them to you and humbly request that you answer them. i think that we have advanced our knowledge of this situation substantially today, due to the very helpful and thoughtful testimony provided by these excellent witnesses. at this point, our hearing will be adjourned.