
Rob Reich, Mehran Sahami and Jeremy Weinstein, "System Error" - CSPAN, November 23, 2021, 9:03pm-10:35pm EST

what happens when a philosopher, a computer scientist and a political scientist, who also have deep experience in leading tech firms and government, come together? our lives, politics and even our values are being affected by technologies in ways that we are just beginning to understand. together, rob reich, mehran sahami and jeremy weinstein created a highly popular course at stanford university, and now a new book called "system error: where big tech went wrong and how we can reboot." drawing from their ongoing work in the program today, they will explore major issues around the rise of decision-making by algorithms, the amount of user data collected by big tech companies, increasing automation and the proliferation of
disinformation online, and they will also talk about potential answers to important questions such as: how can we exercise our agency, reinvigorate our democracy and direct the digital revolution in a way that will best serve our collective societal interests? to lead the conversation, we are thrilled to welcome back another stanford expert. first, rob reich, professor of political science, director of the center for ethics in society and associate director of the institute for human-centered artificial intelligence. 96,407, the number of words in their new book, "system error." 10.24 percent, the probability of his last name starting the book cover. one hundred, the current age of his grandmother, who lived through two pandemics. four c's of diamonds, good for his first
real job as a high schooler at a local jewelry store. and last but not least, 13, his lucky number. next, introducing mehran sahami, professor and associate chair for education in the computer science department at stanford university and also a former research scientist at google. five, the age at which he immigrated to the u.s. from iran. one hundred, the number of dollars he received for his first programming job. 2.2 million, the views on youtube of the first lecture of his programming methodology course. 2,000, the volunteer code in place teaching assistants who taught 22,500 students during the pandemic. and eight, the total number of covid vaccine doses taken by the four members of his household. so glad to have you here, mehran sahami. next, jeremy weinstein, professor of political science,
director of stanford global studies and a fellow at the freeman spogli institute for international studies and the stanford institute for economic policy research at stanford university. one, the number of times he has posted on a social media platform. 1984, the year he moved to palo alto, california. more than 250, the additional students he teaches when he teaches about computer science in the political science department. forty-four, for president barack obama, in whose administration he served on the national security council, as chief of staff and as deputy to the u.s. ambassador to the united nations. fifty-one, the number of wins by the san francisco giants halfway through the 2021 season. go, giants. welcome, jeremy. and last but not least, to lead the conversation, marietje schaake, the international policy director at the cyber policy center and international policy fellow at the institute for human-centered artificial intelligence at stanford. ten, her years as a representative in the european
parliament. two, her years at stanford, including one and a half remotely. one, her first book written. 24,300, the people followed on twitter. and 020, the old phone code for amsterdam, still used to refer to the city. so glad to have all four of you here today. we are looking forward to your conversation. over to you, marietje schaake. >> thank you so much, and also a warm welcome to everyone joining. it is a real celebration today, and molly and i want to congratulate you on the official launch of the book, "system error: where big tech went wrong and how we can reboot." and i can't stress enough how important it is to hear such a diagnosis, well researched and well argued, from americans. i say that being in europe now, but of course the world is looking at what the united states is doing, both in terms of the strength coming out of silicon valley but also the harms and the ripple effects that they have
around the world. it's important to hear this from professors at the very university that has birthed so many big tech ideas, leaders, business models, technologies and disruptions. and i've had the pleasure of seeing and hearing the analysis that you can now all read, and i've seen it evolve over the past year and a half or so. so i'm going to be comfortable calling you by your first names. we are colleagues, even if the travel ban forces me to be at a distance. but when we were all at stanford, i had the pleasure of guest lecturing in the popular class mentioned, which you taught together, on ethics, public policy and technological change. and as some of you may know, public policy is especially close to my heart and high on my priority list. so i hope we can learn more
today in this conversation about how private governance may impede public values and democracy as such. i would like to invite each of you to briefly share what you see as the main system error, and i would like to start with jeremy, then hear from rob, and close this question with mehran. >> for me, the main system error is that technology as it is currently being designed encodes a set of values, and those values are often in tension with other values that we might care about. and the challenge is that, as technology is being designed, the people making choices about what values to optimize for are
the technologists designing the technology and the companies that provide the environment in which the technology is being created. but where is everyone else in this conversation? think about what we get from the aggregation of data that makes possible all sorts of incredible services and tools, but with a cost to privacy. or what we get when algorithms are used to introduce extraordinary efficiency into decision-making processes but raise questions about fairness and bias and due process. when we have these kinds of trade-offs, where is the collective? the system error is that politics is not playing a central role in refereeing these value trade-offs. we've left these decisions to a small number of individuals who work in tech companies or who run the tech companies, to make decisions that have massive implications for all of us, not only for the individual benefits
that we realize from the technology, but also the social effects and increasingly the social harms that we can no longer avoid. >> all right, let me jump in, and i will put it my own way: i don't have a policy background, and i don't have a technical background. my initial exposure to all of this was as an ordinary user of the tools and technologies that came out of silicon valley. and from my point of view, the main system error is what big tech has become: a private government. the big tech companies located here in the valley, and partially perhaps in seattle, have acquired something akin to a grip over our lives, our lives as individuals and as citizens in a democratic society. and the countercultural attitude of silicon valley, in which those programming davids
have now become the goliaths. and the pandemic has actually reinforced this: we were already aware of some of the problems of big tech 18 months ago, but now our lives have become even more enmeshed in the decisions made by a very small number of people inside a very small number of tech companies. and given how our work lives and private lives and educational lives run through them, the power held by a tiny number of people amounts to private government. >> for me, it is about how we actually think about the systems that get built. i think oftentimes the technology systems are built with particular metrics in mind
that are easily quantifiable, and with the notion that we want to scale. the measures that are easily quantifiable are proxies for the things we want, and as you get to scale, what you are measuring and what you want begin to diverge more and more. you may want people to, say, take pleasure from using something, but the way you measure that is time on the platform, and that isn't really whether or not they are getting a benefit from it; it is just a thing that is easy to measure. when you take that to scale, there's a greater divergence between the things we want and the things that are the outcomes. from the standpoint of people who build systems, when we think about what we test and measure, it's not just about the systems working on simple metrics; it's understanding the broader picture. we need to understand more about the societal impact that we have when we build, for example, machine learning models, and the
impact they have on people, rather than just an error rate that we can measure in the lab. those are the kinds of things we need to bring into the discussion as the technology gets built out and deployed even further. >> thank you. i think that paints a great picture of the angle that each of you has taken and what makes the combination of those views so special in the book. early on in the book, you portray a student who has developed a very disruptive startup, and you really see his attitude as representing a trend in silicon valley, so i'm going to quote a small piece: he lives in a world where it is normal not to think twice about how new technology companies can create harmful effects. so i want to ask maybe a little bit more of a critical question, and that is: to what extent has stanford university, where each of you has a long history and a lot of experience working with students, actively and
successfully created a culture where disruption is admired, giving students access to the companies in silicon valley but also inviting ceos and other executives in to teach? i would like to hear a bit of a reflection from each of you on how stanford plays a role in that typical disruptive ecosystem, starting with rob and then concluding the question with jeremy. >> stanford absolutely bears a significant responsibility for some of the problems we see in big tech and for that very cultural orientation: to aspire towards disruption, to look for ways to code in your pajamas and roll the product out quickly and see what happens, to throw it at the wall and figure out any problems downstream. we've seen that personality, or at least i have, in teaching
generations of different students, because in the past five years there's been a kind of professionalization, with even young students coming to stanford engaging with venture capitalists and finding ways to leverage their ideas. i once had in my office hours a first-quarter 18-year-old kid who had taken a political philosophy class, and i was making small talk with him: what are you thinking of majoring in? and he said, i'm definitely going to be a computer science major. and i said, that's really interesting. he said, i have some startup ideas already. making small talk, i continued on: what is your idea? and he looked at me earnestly in the eye and said, to tell you, i would have to ask you to sign a nondisclosure agreement. and that kind of attitude just
reflects the celebration stanford has had for a long time of the disruptive ethos. and i will wrap this up by saying that, from my point of view, it gives stanford a special responsibility to respond to some of the things that it's participated in creating, and i mean that not only as the products that might have been developed and brought to scale but as the cultural orientation that is present. rather than just the pantheon of heroes, typically the founders of the companies whose names everyone knows when they are at stanford, we need an alternative, and the passage that you've quoted about this disruptive founder is matched with a short profile of another person, who imagined himself as a civic technologist, by the name of aaron swartz. we need more aaron swartzes
and, if you will, marietjes of the world to be inspirational figures for the young students at stanford and for technologists at large. >> to take a little bit of historical perspective, i'm sure there are people in the audience who remember that 20 years ago there was actually a drop in the number of computer science majors and a worry as to whether or not computer science would continue to grow as a field, because there was such a precipitous decline in the number of students going into it. so both at stanford and nationally there were efforts to try to increase the number of people going into technology. and i'm happy to say those efforts succeeded: the number of computer science majors has grown by about 350% in the last 15 years, and it's the largest major on campus for both men and women, with about one in five students at stanford majoring in it. and a part of that transformation
was talking about the world-changing ability of the technology, how to make an impact on big problems through the use of computing, and i think that is still true. but what we've also woken up to in the last five, six years more clearly is: what are the real social impacts of the technology at scale? what does it mean for other values we care about, things like privacy and personal autonomy? as those things come into play, because stanford is an educational institution, it behooves us to integrate that more into the educational process for the students, so we can balance the power of computing, because it is a powerful platform, with the social responsibility that comes with the things we do when we build out these large platforms. so i think that is the place where stanford clearly has a responsibility on both sides: those that have gone out to build these things, where now we see some of the consequences, but at the same time understanding that and charting a better path
forward, so we can actually teach the next generation of technologists, and in the work we do with people in the industry, to think more about the social consequences and how we can mitigate the negative effects. >> stanford is a microcosm of what has been happening more broadly in technology. it's played a critical role in birthing the ecosystem that is silicon valley, and it rode this period of tremendous tech optimism, and we saw that reflected in every aspect of the campus. having ridden that wave of optimism into backlash, we now face exactly that dynamic playing out on campus: a lack of trust in the tech companies and concern among individuals and majors about whether they want to be associated with these societal harms. so it puts us in the position of
educators on campus, where we have to think and dig deep with our students around the questions of how we amplify the benefits of technology while also mitigating these evident harms. so the book itself calls for this kind of nuanced, adult, pragmatic conversation that's not tech optimism and that's not tech pessimism, but that recognizes that technology itself isn't neutral and involves trade-offs: it generates benefits alongside challenges, and it's on us collectively to think about how to weigh those things. >> one thing i would add on this point: while we can focus on stanford and think about its potential role, or, talking about the tech companies, we can focus on the corporate leaders and the ceos, one of the big-picture arguments of the book is that individualizing these kinds of challenges misses the bigger challenge that we face, which is that the system error we are describing is a function of
an optimization mindset that is embedded in computer science and technology, one that basically ignores the competing values that need to be refereed as products are designed. it's also embedded in the structure of the venture capital industry that is driving the growth of silicon valley and the growth of these companies, which prioritizes scale before we even understand anything about the impact of the technology on society. and of course it reflects the path that has been paved for the companies to market dominance by a government that is largely in retreat from exercising any oversight. so yes, stanford has an important role to play, and there are individuals who may have looked aside and didn't pay attention to the potential consequences at a moment of tremendous tech optimism. but the bottom line for us is that there are systemic challenges that need to be addressed, and
stanford needs to be at the center of addressing those challenges, but stanford cannot do it alone. >> i think that makes a lot of sense, and i've recognized, even if i've only been at stanford for about two years, that sort of change in the type of questions people are asking, also after january 6th, the storming of the capitol, when people really saw that the problems are not just elsewhere in the world or in other communities but can truly hit home, and that the harms are real, not virtual so to say. but we also see how hard it is for students to find jobs outside of the big tech companies that offer great salaries, when people have student loans to deal with, so maybe we can touch upon that a little bit later. for now, i want to focus on the bigger question of governance, because i see governance as one of the main themes of the book. on the one hand, you have many
anecdotes and stories about how the tech companies evolved, how technology itself sets standards, and how injecting venture capital dollars steers a company in a different direction, but you are also quite critical about the abdication of responsibility on the part of political and democratic leaders. you write, and i quote, that the abstract virtues of democracy are difficult to square with a few harsh realities that appear again and again when it comes to governing technology: the technical ignorance of politicians, who are hardly credible in providing oversight, and the complications of getting regulation done. on the other hand, it's also the laws that should be updated and the regulations that should be
adopted. for that change, i hope the book reaches lawmakers in dc and perhaps overseas. what can we expect, and how can there be a rebalancing of knowledge between lawmakers and the technologists and engineers? i'd like to ask each of you to focus on one priority, one thing you would change if you had a magic wand when it comes to governance, and i will close with jeremy. >> great question. i think if i had a magic wand, the thing i would focus on is moving the locus of agency from being all about individual choice to what is collectively possible. right now, the choice is whether
to use a particular app or delete an existing app, or what kind of information you are giving out, but that places all of the onus on the individual, who has to try to protect the things they care about. it's like saying, if you are worried about driving safely, then don't drive. instead, there are traffic lights and speed bumps, and that all comes about through regulation, through saying we are going to create a system with
guardrails for safety, and now within that we can still choose to drive or not. that is the place i would try to get to: moving the conversation from this being just about individual choice to thinking about how we have a larger system that provides those guardrails. >> i am tempted to say something very specific that is on the possible horizon, like algorithmic auditing or antitrust action against the companies, but i will go to the philosophical root: the optimization mindset of the technologists. as one of our colleagues at
stanford tells students, everything in life is an optimization problem. but then you look at the democratic institutions. democracy is not an institutional design that is meant, in my view, to optimize in the first place. it's meant to referee the messy conflicts among us and to meet new challenges in a new day. so i would try to use my magic wand to keep the optimization mindset in its place, as it were, as a technical orientation, because what is ultimately at stake is whether or not the companies, and the technologists who work there, care about the integrity of the democracy.
>> i'll draw on my own time in washington: it's hard not to look at what is happening in washington and see tremendous paralysis. one of the places we start the book is by trying to focus people's attention on the alternative to re-energizing our democracy, which is leaving the set of decisions about how technology unfolds and how it impacts society to the technologists and to the companies building the technologies, and i think we have a pretty good preview from the last decade and a half of what the consequences
are. when the investors in silicon valley push against regulation, arguing that regulation is just going to slow down innovation and get in the way of the progress that we need in the economy, we need to understand that as a rejection of democracy. it's a rejection of the role of the political institutions, which are the technology that we have selected as a society to help us referee the critical value trade-offs that exist. now, you are right to say our democracy isn't up to the task, not only because of the lack of technical knowledge that you pointed to but because of the institutional features that have made progress very difficult to
achieve. so, setting aside the questions about the communications decency act, one of the places that we start, which is excerpted in the piece in the atlantic that was published a couple of days ago, is: can we find a set of areas for legislative action where democracy can do what it is best at, which is achieve consensus on the most evident harms that need to be avoided in the near term, because there is agreement among both parties, and the constituencies that support those parties, about the changes that are required? and we identified three areas that we think are those areas of agreement. one is around data privacy and the huge asymmetry of power that exists between the companies that access the data and the individuals who have to navigate an outdated regulatory
model. the second is about algorithms in high-stakes decision-making, whether it's credit or access to medical services or the criminal justice system: the lack of transparency, due process and fairness, the kinds of considerations that are not being brought to the rollout of these systems, which all of us should be concerned about as they displace human judgment in our lives. and the third is the effect of the growth of ai on the workforce. while in the near term it makes sense to think the effects are going to be mostly felt among low-wage workers, and so we need to think about the social safety net and its evolution with respect to job displacement, in reality, over the medium term, this transition in the economy is going to affect everyone's job. we say this to the computer programmers in the class: the tools are going to be remaking their jobs in the future, and we need an orientation from the
federal government and the states that recognizes this and asks: how do we prepare people for it? that's not to say that we don't think antitrust is absolutely central, or content moderation; we go into those issues in the book. but if you are asking what is achievable in the current political moment, i think we need to avoid the tendency to boil the ocean and try to solve every problem with a sweeping piece of legislation on the future of tech, and instead to pick out those areas where immediate progress is possible and democracy can flex its muscle. >> as someone with a couple of scars from having served in public office, i am sometimes tempted to defend elected officials when this criticism comes, even if i completely recognize the paralysis in dc. and i sometimes wonder whether regulation of certain tech
applications came too late, and whether that has exacerbated the entrenchment that we currently see, certainly in the united states. but i do think that what you shared about the momentum that is building is also more universally true. policy usually responds to changing realities, and when you look around any representation, whether it's a city council or the u.s. congress or the european parliament for that matter, i think there are few experts. there is the same problem with other topics: if you asked a set of farmers how up to date the knowledge on agriculture is, i'm sure they wouldn't be impressed. we also have to find ways to work with the generalist expertise that often exists in representative democracy
rather than expecting that all stanford grads will run for office anytime soon, although that would be a nice solution, i think, to some of the problems. anyway, sorry for the digression. i wanted to go from asking you all to respond to my questions to asking each of you a specific question, and then we have quite a few from the audience pouring in, so i hope we get to those. for all of you with a question, feel free to put it into the q and a function; i'm collecting them as we go. the first builds on what you said, jeremy, because you have seen from your past job in the administration how technology and big tech are disrupting more fragile societies, not just american
democracy but also emerging economies and the world's most vulnerable. can you speak to where you would like to see leadership on the global scale when it comes to dealing with these topics, or can you speak to why it matters globally what happens in the united states? >> first, i appreciate your point on expertise. politicians' job is to navigate a complex array of issues and represent the constituencies that have sent them into office, but we need to think about how we design the political institutions to ensure that the expertise that is available to them isn't only coming from the paid lobbyists who represent a particular point of view. and i think that is where we are falling short in our democratic institutions: we have a set of elected representatives who,
in order to speak to issues of the public interest rather than the corporate interest, are dependent on the expertise that comes from highly paid lobbyists. it is something that we absolutely need to address. >> i really appreciate the question that challenges us to put this issue in a global perspective, and maybe the first and most important thing to say is that the harms that technology is causing are not just felt by citizens in the united states but by people all over the world, because the u.s. happens to have given birth to powerful companies whose choices are shaping the lives of billions of people around the world. so the absence of leadership from our democratic institutions in thinking about how to
amplify the benefits and mitigate the harms has implications not just for americans. we talked about disinformation and misinformation with respect to the american election, but of course this has played out in other electoral contexts. we think about the use of communications tools to mobilize vigilante groups against citizens, the promotion of hate speech and incitement on the social media platforms in other countries of the world. and of course the impact of ai on jobs and the future of work isn't a uniquely american problem either. maybe the gap between the reality of the current moment and a future that some of us might want to envision is the
fact that most people have no say whatsoever in the way that technology is impacting their lives. so if you are impacted and living in south asia or africa or latin america, these are places where, because the tech companies are not headquartered there, and because the governments feel at the mercy of the providers and pave the way for the companies to operate, you as a citizen, and your elected officials, don't even have a voice. i'm not suggesting some global governance regime is going to emerge to make up for this. i think ultimately the responsibility for governing technology in this case begins with the actor at the table, which is the united states, which is where these companies are headquartered. but one of the most important steps that needs to happen is, instead of seeing this primarily through the lens of the competition with china, which is what you often see brought to
the table as just another recipe for letting the tech companies do what they want, because otherwise we will lose in the race for innovation with china, i think we need to recognize that the harm is real and that ultimately we have a set of democratic governments that share values, that care about things like justice and fairness and well-being, rather than standing in opposition to one another, and ultimately that is the challenge the biden administration confronts. but we shouldn't pretend that it's a simple problem to solve, because in fact the distance between the european union and the united states on regulation right now may be as wide as the distance with the nondemocratic regimes around the world. that is, we don't yet have a consensus around what the problem is and how to regulate. but we need to move towards a much more open and transparent conversation about the values that are at stake, and we need
to recognize that the choices that need to be made legislatively and by the executive branch are choices that need to be coordinated with others, and we need to create ways to understand harms that are not only felt by americans but are felt by people all around the world. >> absolutely. well said. mehran, i want to go to you with this popular phrase that i hear more and more when we talk about technology: human-centered. and i see it often as being a kind of reassurance about how ai and other technologies can actually be kept in check: humans should have the last word, or the last say, or have oversight. what are your thoughts about human-centered ai and the role of people in these increasingly automated processes, and how do we avoid that it becomes just the
sort of pr or catchy slogan that loses meaning? >> i think that is something that can make us feel better while it doesn't actually do anything. one of the things to keep in mind is that human-centered ai is not just having a human in the loop for the decision-making. what is more fundamental is thinking about the participatory process, by which we think about people participating in deciding what they would like to see built and about the actual impacts and downstream problems it may generate. so it gets back to the question of who controls the technology, and whether there is actually an opportunity to get broad participation in thinking about the design. i think that is one key to human-centered ai. the other is understanding that an
error rate is not just an error rate. when you make a mistake about whether or not someone remains in jail in the criminal justice system, that's not 0.01%; that is someone's life you've disrupted. if we forget that, then we are not being human-centered, we are being algorithmically centered, because we are thinking about what the algorithms and the technology can do as opposed to the impact on people. and i think there are good examples of this, because one of the things we try to do in the book is take a balanced approach: it's not just big tech bad or big tech good. even at the same time, you see places where the technology is being thought of in a human-centered way and places where it's not. amazon.com had built a resume-screening model for hiring, because hiring these days, especially at large companies, is a
high-volume process. the idea was that by building a model from the people they had hired before, they could screen resumes to see if a candidate had a high likelihood of being hired and so should probably get an interview. and what they found, at one of the most technologically sophisticated companies on the planet, was gender bias in the model. they looked at this and tried to mitigate the gender bias in the modeling, and ultimately they concluded that they couldn't do it, so they scrapped the model. that is an example of human-centered ai, of thinking about what the impacts are: they tested it, saw a bunch of gender-biased decisions being made, and decided that if you cannot fix it, you shouldn't deploy it, because these systems affect real people's lives. so i think that there are examples of this, but
we need to be broader than that. it gets back to thinking about what we need to have guarantees on before we deploy something at scale. that not only provides a mechanism for policymakers or for collective decision-making to impact what is happening in technology, but also a way to help educate technologists about what is important about human-centered technology. when the next startup comes along wanting to use this kind of technology, it would be great for them to know: here are the kinds of things you need to watch out for, here is where things have gone wrong in the past, and here is how people have actually mitigated them. so i think the learning is actually a bidirectional process.
>> when i hear you talk about including people, being represented, thinking about how the technology might work in communities or society, or questions around testing the impact first, i wonder whether you foresee companies onboarding or bringing in house some of those otherwise more democratic processes, the trend that we see with facebook developing an oversight board, the sort of process that mimics checks and balances but is still under the roof of the company. i'm just curious how you see that. do you foresee the companies trying to import some of these government functions more formally, or would you say we need to also create a wall, or distance, between what companies decide and what type of oversight is applied, for
example? >> i think both things need to happen. we need more emphasis in terms of the design: to ask how people might be harmed by this technology, where the benefits and the trade-offs are, which populations are being served, and who is not being served by the technology. one of the things that also happens, in terms of the makeup of who is in the tech sector, is that it affects who is building the products
and whether we come up with solutions that work for everyone. then there's the question of self-regulation. the point i would make is: if you come up with a set of policies, why wouldn't you want them codified as regulation, if this is something that makes sense, to level the playing field across the entire industry? so if executives in the company change and have different ideas about what they want to do, or new entrants come into the field, we have some rules for
everyone. so i think the notion of self-regulation versus government regulation is not an either/or. >> we began this effort about four years ago, engaging undergraduates at stanford, as we saw future technologists needing to engage with these issues in the way that we are talking about today, at the intersection of technology design, normative values and ultimately how it affects society. but we saw this sort of unmet demand among professionals to also be having these conversations. so we began to teach in the evenings, creating space for people in tech companies to have these kinds of conversations, and
it's striking to us the extent to which they were not having those conversations and felt deeply disempowered in the companies they were part of. we often hear: what happens when you're thinking about a new market and the unintended consequences, or when you have to make a decision around who will benefit and who will be harmed in the design of a particular algorithm and you need to apply some criterion? who do you call, who makes those decisions? and the answer is: we call the office of the general counsel, because ethics is a compliance function in most companies right now. it's what is allowed by the law, and within that we understand the freedom to move. i think that is no longer acceptable. we know the limits of what the political system is going to be able to provide in terms of
guardrails and guidance, and i think the key argument of the book is that alongside the change that needs to happen in washington is change that needs to happen in the field of computer science, in terms of cultivating ethical responsibility, and in the structure of business. and i think we are already seeing, in the structure of business, a preview of what the pressure looks like from workers, the talent that companies need to attract: it's not okay if you sell this product to just anybody; i am deeply concerned about the impact on democracy of the choices we are making with respect to political ads and content moderation. so that push and pull, that give and take, is the moment we are at. no one should walk away from the discussion today thinking that
we are of the view that somehow regulation is going to come in and solve these problems. ultimately, the big argument of the book is that there is agency for everyone. there's agency if you are a technologist to ask questions and challenge your colleagues about the potential harms. >> that is a great addition and a preview for the question i wanted to ask rob, who is an expert in ethics. you also referenced the notion of ethics being invoked as a solution: appointing an ethics officer, adopting principles, and so on and so forth. i wanted to ask what you see as
the merits but also the pitfalls of how ethics is being invoked and used currently as part of the whole discussion around the problems that big tech presents to us. >> so, first of all, just as jeremy said, we don't want ethics to be that you ask the general counsel and it is just a compliance function. instead, we want to
push ethics all the way down into the design of the systems and to ensure everyone has permission to raise the questions and a framework for thinking them through. second, personal ethics is small bore, or at least it isn't particularly interesting to think that ethics amounts to having a personal moral compass. that comes down to saying that if we have virtuous people who acquire the power, then we can allow their goodness as human beings to shine through and we will be okay. personal ethics, in my view, is of course essential but not very interesting. elizabeth holmes is going on trial this week in san jose; she is of course the dropout founder of theranos, a company
that had a fraud at its heart. lying from a position of power: we know it happens, but it is ethically uninteresting. there's no argument needed; you probably learned not to lie when you were a 5-year-old in kindergarten, and you don't need an ethics class at the university for that. so don't focus on personal ethics either. just as democracy is designed as a system that doesn't assume people are saints, we should think of a tech ecosystem that doesn't depend, or shouldn't have to depend, on people being virtuous all the time. so what, then, do we need?
it goes back to this idea of professional ethics: a set of frameworks for thinking about a social and political ethics and the idea of value trade-offs. i will end here, because we've talked about the political ethics and the idea of values, by saying one thing about professional ethics. think of medicine: partly in response to a bunch of scandals, whether it is the world war ii-era experiments in the concentration camps or, in the united states, the tuskegee experiments that exposed a bunch of medical experimentation, we have now developed a deep
institutional footprint that goes from the classes that you encounter as a student in medical school, to the ethics boards in hospitals, to what is called an institutional review board (if you ever want to do experimentation on human subjects, you have to do an ethics pass-through first), to the federal agencies that are tasked with overseeing the development and release of new drugs onto the market. there is no equivalent of that kind of professional ethics footprint within computer science or ai. we are in a kind of early phase of the maturation of the professional ethics of the field, and in the book we lay out a way forward on a number of different fronts for how to accelerate that, so that computer scientists can begin to develop a much stronger professional ethic with an institutional footprint that is just as widely developed.
>> i was looking at the last question, which i will ask first now, rob, because it builds on what you just said, and there's a ton of questions. this one is for you: does it make sense for all degrees to include a mandatory ethics course? >> that sounds lovely, but it's not enough: i got my ethics injection in a course, and now i'm good to go. it has to be something that is embedded across the curriculum rather than any single class. >> the first question came in before anybody said a word, which is: should the big tech companies be broken up? >> i don't think that by itself is a solution. think about what problems you want to solve by breaking up a big tech company. let me give you a simple example: if we were to
break up facebook, would it suddenly lessen the amount of disinformation? probably not. it would probably exacerbate it, because you would have smaller companies dealing with the same problem, and the solution is not that we allow some platforms to have disinformation and some to have less, when what we really want is an ecosystem with a lot less disinformation overall. now, at the same time, there is a question around competition, so i think that one of the places antitrust plays a role is in terms of greater scrutiny. so when facebook wants to acquire instagram, there would be greater scrutiny to ask: is this anticompetitive? is there a way that we can think about limiting the concentration of power in particular companies so that there are more consumer choices? the thing about consumer choice is that the classic framework
for antitrust doesn't work here, because it's about consumer pricing. so we need to take a different frame, which is: what are the harms that we see happening that may not be about pricing? they may be things like what happens to information ecosystems. and with greater scrutiny, companies become more cautious around future acquisitions and around how they compete with other companies, because of the antitrust actions taken before, even when those didn't result in a substantial breakup of a company. so i think if we just show that there is at least that greater scrutiny, we will get greater competition and less concentration of power, and we might also see the companies
taking steps, although this one is a little bit more debatable, toward making the products they produce better aligned with what consumers actually want. >> i saw a question which reminds us that microsoft is one of the biggest companies in the world and questions whether the antitrust case had any significant effect on its market power. >> you did get the emergence of other companies. why did they become such powerhouses, and why did apple take off? because there was probably, i would imagine, a lot more scrutiny within microsoft internally in terms of competition in the marketplace, as to what kind of
stands they took in being aggressive toward competitors. i think what we saw is that there was less of that kind of aggressiveness in the marketplace, regardless of whether or not the company was broken up; it was the change in attitude that happened as a result. >> this one is for jeremy: why are both republican and democratic administrations unwilling to stop big tech from acquiring the small companies that could become competitors? and any thoughts on how the u.s. being a republic and not a direct democracy modifies the perspectives on policies? so basically, how should we interpret the differences between the parties, or how should we look at the system and understand why we are where we are? >> one of the things we do in the book is describe the long-running
race between disruption and democracy. the moment we are in has a deeply historical aspect, so let's look back 150 years and think about the way in which technological innovations over time transformed the economy and society, going back to the telegraph, or think about the origins of the interstate telecommunications infrastructure between the 1890s and 1910s. when you look at it in this historical perspective, you understand that in a system that combines a kind of free-market orientation with democratic institutions that provide oversight and constraint, you get extraordinary innovation in the private sector, often with government support and investment, to be clear. then you get all these benefits that appear in society, and you begin to get some awareness of the harms. it often has to
do with market power, but sometimes there are other externalities that unfold. so, the dominance, for example, of western union in the telegraph system: given the need to basically lay down expensive infrastructure, you produce something that enables us to communicate across the states, but a single company is in control. the same thing plays out in the railroads, with the ability to use the infrastructure to maintain that control. and how do they maintain that control? they game the political system. so you have democracy struggling to keep up with disruption, and then you get these policy moments, or windows, when in one fell swoop the government tries to solve all of the problems of market concentration or the harmful effects of technology. the problem is that then you are fixed into some new framework that can't adjust to the changes that are five or seven years down the line in the future. so where are we now with respect to big tech? why are republicans and democrats not taking action on
big tech? we are at the tail end of a long period of no policy, because we had a bipartisan consensus, beginning with the democrats in the clinton administration and following through the successive administrations into the obama administration, that tech was better than harmless, it was amazing. it was changing everything in ways that benefited everyone, so people wanted to give a bear hug to the technology companies. they wanted google on one side and apple on the other side and to think about the potential of the technology, the economic growth potential. and people looked at the regulatory momentum that was building in europe when you were in office and thought: they are only doing that because they don't have any tech companies of their own, so they are trying to constrain this engine of american innovation and power. we are in a new moment now,
where republicans and democrats both see the problems, although they diagnose the problem in slightly different ways, and so we find ourselves in a policy window where you will actually see bipartisan momentum to tackle the problems of big tech. the question is where we can come to agreement on the problems to be solved and how we should solve them, and that's what we spoke to earlier in terms of the low-hanging fruit. it doesn't mean the democrats are not going to continue to bang the drum, and the republicans won't continue to raise concerns about censorship. it will continue to have a partisan element, but the challenge in the current moment is to take advantage of the policy window that we are in to address the harms that are visible today, and also to think about how we put ourselves in a position as a society where we do not only have moments of regulation every 30 years, when all of the harm is evident, but actually re-engineer the democratic
systems to be much more flexible and adaptable. the one comment i would add on this question of we are a republic, not a direct democracy, and what the implications are: the implication is that the power invested in the hands of citizens is to throw out your elected officials if you don't like what they are doing. if you are unhappy with technology's effects on society, you can be angry at the companies' leadership, but not only at them, because ultimately the failure to mitigate the harms of technology is a failure of the political system, and so there's a need, in the context of a republic, for citizens to get informed, to develop a view on these issues and to hold politicians accountable for showing leadership in mitigating these harms. >> thank you.
very inspirational: everybody can and should take responsibility. this is back to rob. there are quite a few parallels being drawn with past lessons, so i will join up a couple of questions. here we go: for over 30 years, governments have failed to take appropriate action on climate change, and now, with our backs against the wall, action is still insufficient. given that, why do you think governments will be able to take appropriate action on ai and big tech? and then the other question, which i think is related: there are parallels between big tech today and the financial markets in 2007. the prevailing thought was that regulation would stifle innovation in financial services, but the result was credit default swaps, mortgage-backed securities and eventually a market collapse. coming to big tech, what more might it take? in other words, given climate regulation, or the lack
of action in light of big challenges, and the lessons learned from the banking crisis, why would now be different? >> these are important questions, and hardly unfamiliar concerns for any of us watching what is playing out on the global scale with the climate, big tech and other things. so, two quick responses on my part. number one, we've been living, in the united states but more broadly in many democratic societies, through an era of great skepticism about the capacity of the state itself to carry out big projects. we used to go to the moon, and now we let the billionaires build private space exploration. we wanted a great infrastructure plan, but the last meaningful thing we did in the united states was the highway system, or something in that neighborhood. i think that we are possibly in the early moments of exiting from a 30-year period of
skepticism about whether the state is capable of grand projects, and part of that, in the united states especially, will be to rise to the challenge of tackling some of the most obvious problems that we are all aware of: climate change and the concentrated power of the big tech firms. that is perhaps a bit more of a hopeful than an empirically based prediction, but i would say we don't really have many other options to hang onto. that is where we as ordinary people can invest our time and invest our aspirations. the second and final observation is to say it is a design feature, not a design flaw, of democracy that it tends to be short-term. we look, within a small number of years, to either reelect our officials or to throw them out and bring someone
else in. and as a consequence, there isn't the easiest incentive structure in place for elected officials to think about a longer time horizon, and climate in particular is one of those things where it is easy to imagine short-term costs for which a politician will have to bear the consequences at the next election in two years, while the gains may not be seen for ten, 20 or 30 years. so part of the task of democratic institutions is to try to come to terms with a longer time horizon, an orientation for thinking about planning, for thinking about infrastructure. some of the things we talk about in the book are efforts that point to very particular proposals that give democratic institutions the capacity to think about the long run rather than always being reactive, and that is where i think we need creative and innovative minds at work to install that type of operating system within the democracy, so that it can handle
these long-term challenges. >> i coincidentally heard a reference in a speech yesterday to the book "mission economy," and it almost sounds like you've read it and were inspired by it; if not, i would recommend you read it. it argues for missions with a longer horizon that excite bureaucracies and leaders and people on the other side of the aisle to come together and become more innovative to solve a problem instead of, you know, doing the more mundane parts of policymaking. indeed, it is that type of mission that would reinvigorate the more innovative capacities you describe. on another note, an audience question: it's one thing to pick up
on the societal damage while building technology-based services, but that's different from building for the purpose of achieving societal benefit in the first place. can we envision a society where that is possible? >> it is a great question, and i think building technology for a great society is possible and aspirational. i would say that right now we have a lot of technology for which we have identified harms, and so the first thing we can do, or that we need to do because it is more pressing, is to actually find ways to mitigate those harms. there's lots of low-hanging fruit in that area, and if we can get that done, what that shows is the ability of democracy to rise to the challenge. i feel the same way watching the
news: there is a skepticism about what can actually be done, and if we understand there are things that can be done, that gives us the optimism to do more later. in the long term, though, i do think that there is a notion of how to build technology for good, and you see signs of that on college campuses right now. we have a student organization that is healthy and vibrant, and one of the things they do is look for technologies that can be used for societal benefit. they do things like partner with ngos, think about what kind of technology the ngos need, build prototypes to show them, and in some cases work on the follow-through to see what can be deployed. so i think that is a great area, but i don't think we should just look to that as the way to solve the current problems. in the educational systems and other structures, there are other ways that technology companies have gotten involved in this.
>> this question is for jeremy: with politics as it is, will our democracy be able to reflect this complexity? >> i was pointing to this challenge earlier, so maybe i will say a few more words about it now. politicians are generalists, and while they might have expert staff who work with them, the staffing in government is thin, and it's even thinner if you think about the state institutions or local institutions in comparison to the federal government.
so we ask the politicians, those who represent our interests and are meant to do the hard work on the things we value and to find a way forward in our society, to do the impossible, which is to educate themselves enough about what you care about, understand what the effects of a regulatory change might be, and take a position while forging consensus or compromise with others, all with a really limited staff at their disposal. the organized corporate interests have used this to make sure that their voices are heard, and they do an extraordinary job of it; we call that lobbying. it's the payment of expert staff who show up in the executive branch and offer frameworks for legislation and provide facts and information to help legislators make judgments. the question is: who is lobbying for the public interest?
where are our collective interests being accounted for in the system? and when you think about the technical knowledge and expertise required, for example on the issues of bias in an algorithm, or on the structure of the platforms and the business model of platforms (can you have successful platforms without requiring the monetization of personal data?), or on what is even possible with respect to content moderation in a world where you can't rely on human judgment alone but need ai to undertake the task: who is providing a different set of views to the politicians, to counter or challenge the view that is offered up by the lobbyists paid by the tech companies? the failure of the system is a failure by choice. it's not that they couldn't have that expertise; it's that that expertise doesn't exist, by design. one example: the office of technology assessment, an institution on the congressional side, was
literally shut down during newt gingrich's republican revolution, the contract with america. newt gingrich said, we don't need technical expertise in congress; people can get technical expertise from the market. so we have made systematic decisions in the federal government over time that erode technical expertise and knowledge. that means we put our politicians in the position of primarily receiving input and advice from those who are paid to represent the interests of companies. so while lots of us are focused on the issues of privacy legislation or ai regulation or upskilling and the future of work, i don't want people to lose sight of the fundamental task that we have, which is building a democratic system that is capable of governing technology before it governs us. and that means thinking about the future of the federal civil service. it means thinking about what expertise we recruit for, and it
10:20 pm
means thinking about the creation and support of independent institutions that represent the public interest. in our democratic system, which as we said before is the best technology we have come up with for dealing peaceably with disagreement, one that tries to forge consensus or agreement among competing views, we need all of these elements working, and we have underinvested in them up until this point. >> thank you. building on this is a more philosophical approach to the question. your central thesis is that company directors are making these decisions for us, but isn't the decision being made by the collective, by our free choice to adopt these things, by which i think they mean technologies and services? >> let's be particular about this. facebook is a global platform of
10:21 pm
more than 3 billion active users, and there is exactly one person who has de facto complete control over that information environment. the question to consider: all of us either choose to use or not use facebook, so doesn't it also come down to the choice of the 3 billion of us who have signed onto the platform and use the service? i see the attraction of the idea, but it presents us with a false choice. first of all, our contention in the book is that 3 billion people and one person in control is too much power for one person or for one company. when you reach great scale, as these companies have, the kind
10:22 pm
of private government that we started with rises to the fore and calls out for scrutiny and criticism. and the idea that it all comes down to us as individual users is, we think, a mistake as well. of course we as individuals can decide to delete facebook, as some have suggested, but that goes back to the start of the conversation: if for some reason you didn't like the cars that were available or the roadways, if you felt a modest amount of dissatisfaction with the idea of getting in the car and going somewhere, you could decide just not to drive, and if you had other options like public transportation or rideshare services, maybe those would be viable alternatives. but the idea that we should basically tell people you can take the cars and roads as they are or you can just not drive presents the same false choice. and for
10:23 pm
what it's worth, facebook would love it if it were left to every single person to decide what to do on his or her own. what is really needed is a collective response from democratic institutions and civil society organizations. that's the path forward, not describing these massive systemic problems as the choices of consumers, whether to use or not to use. >> it was too early to predict the detrimental effects of prioritizing personalization over relevance. how do we avoid such effects in the future, before they scale? >> in hindsight it's a lot clearer. part of the issue is
10:24 pm
understanding what can potentially happen: what do you think will happen in a system where we just kind of let people loose and information starts to spread, and then how responsive are we to it? back in the 2016 election, when people were beginning to call out the fact that there was influence in the electoral process before the election even happened, the response from the platforms was: this can't be that big of an effect. so there was skepticism that they could even have a negative impact. it's not that we would expect them to figure out every negative impact, but to be wary, to think about what the impacts are, and to do some evaluation themselves of what they are actually seeing. if you look at a lot of the studies on the platforms around,
10:25 pm
for example, whether or not there is bias, many of those are done by academics. why aren't they being done by the companies themselves? they have better access to the data. as you can imagine, either they can start to do that themselves to understand what's going on, or we can have an oversight function that says these are the things you need to monitor to understand what is going on. so those are the kinds of things that can be put in place, the lessons learned from some of the larger platforms now, and it makes sense to learn from the experiences of the past to inform what we do in the future. saying you can simply choose which apps you want to use completely undermines the notion that there are lessons we've learned and we
10:26 pm
should put those into practice through our democratic institutions. let's learn from the past, not ignore it and have the same problems over and over. there will be places where companies don't get it right. but they need to be responsive early on and to take the view of being self-critical, as opposed to self-aggrandizing about what is being accomplished; it's not just changing the world for the better and solving other people's problems, it's also owning the fact that you are creating problems as well. >> i think that is a really great and brief summary, but there's so much more; the discussion was rich. thank you for answering all the questions and for jumping back and forth. before we close, and we are almost at the end, the museum has an initiative they call the one-word initiative. the idea is that you each wrote
10:27 pm
down a word, or will quickly write down one word of advice, the kind of advice you would give to a young person starting out in their career. once you share this one word, please say briefly why it is the advice you want to share, and then we will conclude with that. i will start with you. >> my word is: the unexamined life is not worth living, and of course the lesser-known parallel, the unlived life is not worth examining. don't be an ethical sleepwalker. inquire and examine within; that is a pathway, i think, for an entire life, not just at the beginning. >> wonderful. jeremy.
10:28 pm
>> my word is go deep. it was going to be one word, but you need a verb, so: go deep. it's motivated by the idea that both when i was looking for my own career path, and as a professor and someone who served at the senior level of government, i would see generalists everywhere i looked: people who have invested in checking every box and trying out everything, who present themselves as learning things quickly and being rapidly deployable. but i always hire the people who've gone deep, and as a young person i always wanted to be the person who had gone deep. the reason is that if you really want to solve problems in the world, you have to get beyond the top layer of understanding, you have to exercise your thinking about how to solve a problem, and you have to care as much about the idea as you do about the implementation. so my advice
10:29 pm
is to pick something that you care about, look at it from every angle, and try to actually move the needle on the outcomes before you move on to your next thing, so that you bring into every subsequent job the experience of having seen something through from start to finish. >> wonderful. very powerful. >> the word i would use is kindness. the reason why is twofold. for someone who is young and starting their career, i would tell them a little bit of kindness goes a long way, and it will have a large impact in your life. it doesn't cost anything to be more kind, but the dividends it pays you and other people are tangible and meaningful. at the beginning of the pandemic it was interesting, because you saw a lot of kindness coming out in the world. people were looking out for
10:30 pm
other people. people were cognizant of trying to help others who maybe were not in a situation as fortunate as theirs, and i hope that is one thing we can take away from the pandemic, for all of the problems it has caused and for all of the stress it has created on relationships: to think about kindness both towards other people and towards ourselves, because the world can use more kindness and it doesn't cost you anything. >> that is a great note to end on.
10:31 pm
>> you look at a complicated set of issues: any technology, anything that can be weaponized, inevitably will be. the analogy was interesting for me as you think about history and the implications and direction it can provide for the future. it turns out that 100 years ago, more or less, the court system worked out the corporate veil of liability among auto dealerships and the manufacturers, where liability flowed through automobiles and automobile accidents. there are structural and code issues with life as we know it, regulatory and social codes and
10:32 pm
values, but also software code and algorithms driving things in the future. i do believe many of the points made, in particular the ones about the problems we are trying to solve, the achievable outcomes, and the effects on the workforce and the human condition. i encourage all of you to read the book and to do your best to help facilitate the process whereby people have a sense of where they are in a world of computing and can have agency within it. wonderful program; i appreciate all of the work and the wonderful moderation. thank you for joining, and i encourage you to contribute: go to our website, and if you love these programs, we can use your support. so thank you, best wishes, and be safe. ♪♪
10:33 pm
>> weekends on c-span2 are an intellectual feast. every saturday, american history tv documents america's stories. on sundays, book tv brings you the latest in nonfiction books and authors. funding for c-span2 comes from these television companies and more, including mediacom. ♪♪ >> the world changed in an instant, but mediacom was ready. we never slowed down. schools and businesses went virtual, and we powered a new reality, because we are built to keep you ahead. >> mediacom, along with these television companies, supports c-span2 as a public service. ♪♪ >> sunday december 5 on in-depth, victor davis hanson joins us live to talk about war, politics and citizenship in the united states.
10:34 pm
his book titles include the father of us all, the case for trump and his latest, the dying citizen, on the ideals associated with citizenship disappearing. join in the conversation with your phone calls, facebook comments, texts and tweets, sunday december 5 at noon eastern on book tv. get your copies of the books. >> ever wonder what it would be like to go back in time, to relive history and perhaps shape the future? we have an unusual opportunity. in 1985, one of the first reporters covering the tech industry began a decades-long story of its people and companies, and this book recounts the history of the tech sector from the beginning.