
TV: Rob Reich, Mehran Sahami, and Jeremy Weinstein, "System Error." C-SPAN, December 14, 2021, 8:27pm-9:59pm EST

8:28 pm
>> What happens when a philosopher, a computer scientist, and a political scientist, who also have deep experience in leading tech firms and governments, come together? Our lives and our politics are being affected by technology in ways we are just beginning to understand. The three of them created a highly popular course, and drawing from their own work, in this program today they are going to explore major issues around the
8:29 pm
proliferation of these technologies and how they can be made to serve our collective societal interest. Here at the Computer History Museum, I'm thrilled to introduce our speakers. First, Rob Reich, a professor of political science, director of the Center for Ethics in Society, and associate director of the Institute for Human-Centered Artificial Intelligence. 10.24%, the probability of [inaudible]; and the courage of his grandmother, who lived through two pandemics;
8:30 pm
the four Cs of a diamond, cut, clarity, color, and carat, learned at a local jewelry store; and last but not least, his lucky number. Welcome, Rob. Next, from the computer science department, Mehran Sahami. Five, the age at which he came to the U.S.; he was first a lecturer in programming methodology, and he taught 22,500 students during the pandemic. We're so glad to have you here. Next, Jeremy Weinstein, of the
8:31 pm
Stanford Institute for Economic Policy Research at Stanford University. One, the number of times he has posted on a social media platform; [inaudible]; he teaches in both the computer science and the political science departments. He worked for President Barack Obama, for whom he served on the National Security Council and as chief of staff and deputy to the ambassador to the United Nations. 51, the number of wins for the San Francisco Giants halfway into the 2021 season, go Giants. Welcome, Jeremy. And last but not least, leading this conversation, she is the international policy director of the Cyber Policy Center and international policy fellow in artificial intelligence at
8:32 pm
Stanford. Ten, her years as a representative in the European Parliament; two, her years at Stanford, including one and a half remotely; one, the first book she is writing; 24,003, the number of people following her on Twitter; 020, the dialing code by which Amsterdam is still referred to. So glad to have all four of you here today. We're looking forward to your conversation. >> Thank you so much, and from my side also a warm welcome to everyone joining. It is a real celebration today, so I want to congratulate you on the official launch of your book, "System Error: Where Big Tech Went Wrong and How We Can Reboot It." I cannot stress enough how important it is to have such a diagnosis, well researched and well argued, from Americans. I say that being in Europe now, but the world is watching what the United States is doing, both in
8:33 pm
terms of the strengths coming out of Silicon Valley, but also the harms and the ripple effects it has around the world. It's important to hear this from professors at the university that has birthed so many big tech ideas, leaders, business models, technologies, and disruptions. I've had the pleasure of seeing and hearing bits of the analysis you can now all read, and of seeing it evolve over the past year and a half or so. So I'm going to be comfortable calling you by your first names; you are colleagues, even if travel restrictions force me to be at a distance. We were all at Stanford, where I have had the pleasure of guest lecturing in the class you teach together on ethics, public policy, and technological change. As some of you may know, public policy is especially close to my heart and high on
8:34 pm
my priority list. So I hope we can learn more today in this conversation about how private governance may impede democracy as such, and of course how we can repair the broken machinery. To start us off, I would like to invite each of you to briefly share what you see as the main system error. I'd like to start with Jeremy, then hear from Rob, and close this question with Mehran. Over to you, Jeremy. >> Thank you very much, and thanks to the Computer History Museum for hosting us for this book launch event. The main system error is that technology, as it is currently being designed, encodes a set of values, and those are in tension with other values we might care about. The challenge is that as technology is being designed, the people
8:35 pm
making choices about what values to optimize for are the technologists who design the technology and the companies that provide the environment in which technology is being created. But where is everyone else in this conversation? The system failure is when there are tensions between values: between what we get from the aggregation of data, which makes possible all sorts of incredible tools but comes at a cost to privacy; or what we get when algorithms introduce efficiency into decision-making processes but raise questions about fairness, bias, and due process. When we have these kinds of trade-offs, where is the collective? The system error is that politics is not playing a central role in refereeing these value trade-offs. We have left these decisions to a small number of individuals who work in tech companies, or run tech companies, to make decisions
8:36 pm
that have massive implications for all of us: not only the individual benefits we realize from the technology but also its social effects, increasingly social harms, that we can no longer avoid. >> All right, let me hop in; I'm next in the queue. I will put it the way I do because I do not have a policy background, and I don't have the technology background to build the tools and technologies that came out of Silicon Valley. From my point of view, what's important is private governance. The big tech companies, located here in the Valley and partially in Seattle, have acquired something akin to a grip over our lives: our lives as individuals and as citizens in a democratic society. The scrappy, countercultural attitude of
8:37 pm
the early age of Silicon Valley has now become the Goliath. And the pandemic has reinforced for me what we were already aware of, the problems of big tech: our lives have become even more enmeshed in the decisions made by a very small number of people inside a very small number of tech companies. Even more than 18 months ago, we are dependent on a small number of platforms for our private lives and for our educational lives. That power, vested in a tiny number of people in Silicon Valley, amounts to private governance, and that is not serving us citizens very well at all. >> From a technologist's point of view, technology systems get
8:38 pm
built with quantifiable metrics in mind that do not necessarily match the things we really want; they are proxies for the things we really want. When you get to scale, what sometimes happens is that what you're actually measuring and what you actually want diverge more and more. You may want people to take pleasure from using your product, but the way you measure that is not whether they're really getting a benefit, but something that is easy to measure. When we take that to scale, there's a greater divergence between the things we want and the things that are the outcomes. So from the standpoint of people who build systems, we think about what we test and what we measure. It's not just about our metrics; it's about understanding, from a broader picture,
8:39 pm
more about the societal impacts we have: the actual impacts on people, rather than just the error rates we measure. Those are the kinds of things we bring to the discussion of technology. >> Thank you. It paints a great picture of the angle each of you takes, and the combination of those is what is special about the book. In the book you portray a very disruptive startup founder, and you really see his attitude as representing a trend in Silicon Valley. Let me quote a small piece: "he just lives in a world where it is normal not to think twice about how new technologies could create harmful effects." So I want to ask a little bit more: Stanford has a long history and a lot of
8:40 pm
experience in working with students. To what extent has it successfully created the very culture where disruption is admired, giving students access to the companies in Silicon Valley, but also inviting CEOs and other tech executives to teach? I would love to hear a bit of reflection from each of you on how typical this disruptive ecosystem is, starting with Rob and concluding this question with Jeremy. >> Stanford absolutely bears a significant responsibility for what we see in big tech and for that very cultural orientation: the aspiration to look for ways to roll your product out and see what happens, to throw spaghetti at the wall and figure out any problems
8:41 pm
downstream. I have seen, at least in teaching here for a generation, that in the past five years the professionalization of even young students has grown: they come to Stanford with an eye toward engaging with the venture capitalists who populate campus and leveraging their ideas into a startup. I once had in my office hours a first-quarter, 18-year-old kid who'd taken a political philosophy class with me and came to office hours to chat about it. Making small talk with him, I asked what he thought he'd be majoring in. He said, "I'm definitely going to be a computer science major." I said, "Great, that's really interesting." He said he had some startup ideas already. Making small talk, I continued on: "What is your idea?" And he looked at me earnestly in the eye and said, "To tell you, I would have to ask you to sign a nondisclosure agreement."
8:42 pm
That kind of attitude reflects the celebration Stanford has had for a long time of the disruptive founder. I will wrap this up by saying that, from my point of view, it gives Stanford a special responsibility to respond to some of the things it has participated in creating. I mean that not only as the set of products that might have been developed at Stanford and brought to Silicon Valley, but as the cultural orientation that is very present here at Stanford. Rather than a pantheon of heroes, typically white guys who were founders of companies whose names everyone at Stanford knows, we need an alternative. In the book, the disruptive founder is matched with a short profile of another person one might imagine oneself as, by the name of Aaron.
8:43 pm
>> I'll take a little bit of historical perspective. As computer science continued to grow as a field, there was at one point such a decline in the number of students that, both at Stanford and nationally, there were efforts to try to increase the number of people going into technology. I am happy to say those efforts wildly succeeded: the number of computer science majors at Stanford has grown by about 350% in the last 15 years. It's the largest major on campus for both men and women; one in five
8:44 pm
students is in computer science now. Part of that transformation was talking about the world-changing ability of technology, and I think that is still true. But I think we have also woken up, in the last five or six years, more clearly to issues like privacy and personal autonomy, and to integrating them more into the educational process for our students. Technology is still a very powerful platform, and there is a social responsibility that comes with the things that we do with these large platforms. This is the place where Stanford clearly has responsibility on both sides: on developing the technologies, and, now that we are seeing some of the
8:45 pm
consequences, on charting a better path forward, both in the work we do and in how we could mitigate the negative effects. >> Stanford is a microcosm: it rode this period of tremendous tech optimism as an extraordinary institution, and we saw that optimism reflected in every aspect of campus. In riding that wave of tech optimism and now confronting the backlash, we see that exact dynamic playing out on campus, with students asking whether they want to be
8:46 pm
associated with the societal harms. It puts us in a position as educators on campus where we have to think and dig deep with our students around the questions of how to amplify the benefits of technology while also mitigating these harms. The book itself calls for a nuanced, adult conversation: our challenge is to collectively think about how to weigh those things. We often focus on the corporate leaders and CEOs, but the big-picture argument of the book is that individualizing these kinds of challenges misses that this is a
8:47 pm
bigger challenge we face. There is an optimization mindset embedded in computer science that is also driving the growth of Silicon Valley and these companies that prioritize scale, and Stanford has an important role to play. And yes, there are individuals who maybe did not pay attention to the potential consequences in a moment of tremendous optimism. But the bottom line for us is that
8:48 pm
there are systemic challenges that need to be addressed. Stanford needs to be at the center of addressing those systemic challenges, but Stanford cannot do it alone. >> Thanks, I think that makes a lot of sense. I recognize, even if I've only been at Stanford for about two years, the change, particularly after the storming of the Capitol, fueled by disinformation. The problems are not just out in the world and in other communities; they can truly hit home, and the harms are real. They are not virtual, so to say. But we also see how hard it is for students when companies offer great salaries and some people have real student loans to deal with. Maybe we can touch upon that a little bit later. For now I want to focus on the big questions and the main
8:49 pm
themes of the book. Many anecdotes and stories could steer engineers and companies in a different direction. The book is also quite critical about the responsibility of political and democratic leaders, confronting us with a few harsh realities again and again when it comes to governing technologies: the technical ignorance of elected politicians, who are hardly credible on how trade-offs should be made. On the other hand, it is truly a call to action, also one that says laws
8:50 pm
should be updated and regulations changed. I hope it reaches lawmakers so that they will jump into action. How can there be a rebalancing? Is it feasible to have a rebalancing: more expertise among politicians, but also more democracy among engineers? What is one thing you would change if you had a magic wand? I want to close with Jeremy for this question. >> If I had a magic wand, the thing I would focus on is moving the locus of agency in engaging with
8:51 pm
technology away from the individual level, where the kind of framing we hear places the onus on individuals for what they have to care about. I think the bigger question is how to have a system that helps the individual. Take the roadway system. We don't just put people in cars and say "drive safely," and tell them that if they are worried about driving safely, then don't drive. That would be a ridiculous choice, because there is real value in driving. So what do we do? We create a whole set of infrastructure: there are traffic lights, there are speed bumps,
8:52 pm
and that all comes about through regulation. We create a system that has guardrails for safety. Now, within that, you can still choose to drive or not, and you can think about how you can drive safely. That makes driving safer for everyone. I think that is the place I want to get to: moving beyond the conversation of "this is just about individual choice" to thinking about how we have a larger system that provides those guardrails. >> I am tempted to say something very specific on the horizon in D.C. or elsewhere, like adopting federal privacy regulation. But the way I will use the magic wand you invited me to wave is to point at this optimization mindset of the technologist. The problem comes when that mindset becomes an outlook on life as a whole, where
8:53 pm
everything in life is an optimization problem. Then you look at democratic institutions as failing to optimize the things we care about; you complain about the inefficiencies, the drag, and perhaps the lack of intelligence of elected leaders in our democratic institutions. But democracy is not meant, in the first place, in my view, to be optimized. It's a way of refereeing the messy and conflicting preferences of citizens in a way that's constantly adaptable to new challenges and a new day. So what I would try to use my magic wand to do is to keep the optimization mindset in its place, as it were, and not let it become an outlook on life. What's ultimately at stake is whether or not the big tech companies and the people who work there care about the integrity of our
8:54 pm
democracy and maintaining the health of democratic institutions here and abroad. >> It's hard not to look at what's happening in Washington, and hard not to see the polarization in the United States, and feel like absolutely nothing can get done. One of the places where the book tries to focus people's attention is on the alternative to reenergizing our democracy. The alternative is leaving the decisions about how technology unfolds and how it impacts society to the technologists and the companies building the technology. I think we have a pretty good preview, from the last decade and
8:55 pm
a half or two decades, of what the consequences are of leaving them those decisions. You hear from corporate leaders or investors in Silicon Valley an anti-regulation push, a view that regulation is going to slow things down, that regulation is going to get in the way of the progress we need in our economy. We want to understand that anti-regulation push as essentially a rejection of democracy. It is a rejection of the role of the political institutions which we as a society have selected to help us referee the critical value trade-offs that exist, even if, not only because of a lack of technical expertise, progress has been very difficult to achieve.
8:56 pm
Everyone has reasonable skepticism about our current system at the moment: about the future of antitrust, questions about content moderation and the Communications Decency Act. One of the places we start is to ask whether we can find a set of areas for legislative action where democracy can do what it does best, which is achieve consensus on the most evident harms that need to be avoided in the near term. Among the changes that are required, one is around data privacy and the asymmetry of power that exists between companies and individuals.
8:57 pm
The second is about the use of algorithms in high-stakes decision-making settings. The lack of the kinds of considerations that should be brought to the rollout of these systems is something all of us should be concerned about, as they displace human judgment in our lives. The third is the effects of AI on the workforce. The effects are mostly felt today among lower-wage workers with respect to job displacement, but in reality this transition to an AI-powered economy is going to affect everyone's job. We say this to the computer programmers in our class: AI tools are going to be
8:58 pm
remaking their jobs in the future. We need an orientation toward how we prepare people for that future. Antitrust and content moderation are central too, and we go in depth on these issues in the book. But if you're asking what is achievable in our current political moment, I think we need to avoid the tendency to boil the ocean and try to solve every problem with one sweeping piece of legislation on the future of tech, and instead pick out areas where progress is possible and democracy can flex its muscles. >> Thank you. I'm met with a couple of scoffs by elected officials when this criticism comes up, even if I completely recognize it. Sitting in D.C. hearings and sometimes wondering why the
8:59 pm
regulation of certain tech applications came too late, I see how that delay exacerbated the divisions and the entrenchment we currently see, certainly in the United States. I do think what you shared about the momentum that is building is also more universally true; policy usually responds to changing realities. When you look at any representative body, whether it is a city council, the U.S. Congress, or the European Parliament for that matter, I think there are few experts. Tech executives point to that as a problem, but if you asked a set of farmers how up to date legislators' knowledge of agriculture is, I'm sure they would not be too impressed either. I think we also have to find ways to work with the generalists and the variety of expertise, so to
9:00 pm
say, that often exist in representation in a democracy, rather than expecting all Stanford grads to run for office anytime soon, although that would be a nice solution, I think, to some of the problems. Anyway, sorry for the digression. I wanted to go from asking you all to respond to my questions to asking each of you a specific question, and then we have quite a few questions from the audience pouring in; I think we can move to those afterwards. For all of you who are wondering about your question, feel free to drop it in the Q&A function; I am collecting them as we go. So the first question, which also kind of builds on what you just said, Jeremy, is for you. You have seen, from your past job representing the U.S. in the Obama administration at the United Nations, how technology and big tech are also disrupting
9:01 pm
more fragile societies, not just American democracy, but also emerging economies and the world's most vulnerable. Can you speak to where you would like to see American leadership, on the global scale, when it comes to dealing with these tech companies? >> Let me first say why it matters globally what happens here. From my point of view, where we are falling short is in our
9:02 pm
institutions: we have senior executives in government without the authority or capacity to generate technical expertise that speaks to issues of the public interest rather than the corporate interest, leaving them dependent on the technical expertise that comes from lobbyists. That is exacerbated in the United States, and it is something that we absolutely need to address. I appreciate the question because it challenges us to put this in a global perspective. The first and most important thing to say is that the harms technology is causing, whether intended or unintended, are felt by people all around the world. This country gave birth to the companies whose choices are shaping the lives of billions of people around the world, and so the absence of
9:03 pm
thinking about how to amplify benefits and mitigate harms has implications not just for American citizens. We talk about disinformation and misinformation, but of course this has played out in other electoral contexts. We think about the communication used to mobilize violence committed against citizens by vigilante groups, and the incitement on social media platforms in other countries in the world. As somebody who served at the United Nations, I can say one of the reasons it exists is that there was a view after World War II that we needed to create a forum in the world where everyone had a voice. Maybe the gap between the reality of the current moment and the future some of
9:04 pm
us might want is the fact that some people have no say whatsoever in the way technology is impacting their lives. If you are impacted by technology and living in South Asia or Africa or Latin America, these are places where, because the tech companies are not headquartered there and because governments are at the mercy of the providers of these products and pave the way for these companies to operate, you as a citizen and your elected officials don't even have a voice. I'm not suggesting some global regime is going to make sense for technology. Ultimately the responsibility in this case begins with the absent actor at the table, which is the United States. One of the most important steps that needs to happen concerns the lens of competition with China,
9:05 pm
a frame that is often brought to the table and which is just another recipe for letting the tech companies do what they want, because otherwise we could lose the race for innovation. We need to recognize the harm is real, and that ultimately we have a set of democratic governments that share values and care about things like justice and fairness and well-being, and care about privacy and autonomy, and these governments need to be working together rather than in opposition to one another. Ultimately, that is the challenge the Biden administration confronts, but we shouldn't pretend it's a simple problem to solve. The distance among democracies may be as wide as the distance with nondemocratic regimes around the world, in that we haven't forged a consensus around what the problem is and how to regulate. But we need to move towards an
9:06 pm
open and transparent conversation about the values at stake, and recognize that the choices that need to be made legislatively and by the executive branch in the United States are choices that need to be coordinated with others, to create the space to understand harms that are felt not only by Americans but by people all around the world. >> Now, a phrase that I hear more and more when we talk about technology is "human-centered," and I see it often used as a kind of reassurance about how AI and other technologies can be kept in check: humans should have the last word or say. What are your thoughts about human-centered AI?
9:07 pm
>> One of the more important things to keep in mind is that sometimes it is interpreted as humans making the final decision, but what is more fundamental for human-centered AI is thinking about a participatory process, about people participating in deciding what they would like to see built. It's an opportunity for broader participation in thinking about the design. That is the key to
9:08 pm
human-centered AI, including in machine learning models. An error rate is about human lives being affected. When a model makes a mistake about whether or not someone remains in jail in the criminal justice system, that is not just 0.01%; that is someone's life disrupted. So unless there is a real understanding of that, we are not being human-centered. There is a good example of this. We try to take a balanced approach in the book, because it's not just "big tech bad" or "big tech good"; you also see places where the technology is being thought of in a human-centered way. So I will give you an example we brought up in the book. Amazon had built a system to screen resumes, because hiring
9:09 pm
these days is such a high-volume process. By building a model based on the people they had hired before, the idea was to predict who should probably get an interview. What they found in building the system is that the model had a gender bias: it down-ranked resumes that included the names of women's colleges. Amazon is one of the few companies on the planet with the expertise to look at this and try to mitigate it, and still they could not, so they scrapped the model. They understood the impact of deploying a system that makes a bunch of gender-biased decisions, and ultimately they decided: if you can't fix it, you shouldn't deploy it.
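The kind of pre-deployment audit described here can be sketched in a few lines. The snippet below is purely illustrative, not Amazon's actual system or data: it compares a hypothetical screener's interview rates across two groups and applies the common "four-fifths" rule of thumb, under which a ratio below 0.8 is flagged for review.

```python
# Hypothetical audit sketch: compare a screening model's selection
# rates across groups. All names and numbers here are made up.

def selection_rate(decisions):
    """Fraction of candidates the screener advanced to interview."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.
    Values below 0.8 fail the common 'four-fifths' rule of thumb."""
    ra, rb = selection_rate(group_a), selection_rate(group_b)
    return min(ra, rb) / max(ra, rb)

# Illustrative screener outputs: 1 = advanced to interview, 0 = rejected.
men   = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # 8 of 10 advanced
women = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # 4 of 10 advanced

ratio = disparate_impact_ratio(men, women)
print(round(ratio, 2))   # 0.5
print(ratio < 0.8)       # True: flag for review before any deployment
```

A check like this is only a first screen; it says nothing about why the disparity exists or how to fix it, which is exactly why, as in the anecdote above, the responsible choice can be not to deploy at all.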
9:10 pm
These are the kinds of things we want to hold up as examples of how the sector itself can work on these issues. But to get back to the earlier point, we shouldn't just rely on someone choosing to do this. Having guidelines around how models should be evaluated, and around whether we deploy something at scale, not only provides a mechanism for policymakers and collective decision-making to affect what's happening in technology, but also helps educate technologists about what's important in human-centered technology. For the next startup coming along to build a piece of technology, it would be great for them to know: here are the kinds of things you need to watch out for, and here is where things have gone wrong in the past and how people have mitigated them. So I think the learning is a bidirectional process.
9:11 pm
>> When people are represented and brought in to think about how the technology might work in their communities or societies, or for questions around testing it inside first, I wonder whether you foresee companies bringing in-house some of those otherwise more democratic processes: the idea of including more voices in decision-making and policymaking, and the trend that we see with Facebook developing an oversight board, a sort of process that mimics checks and balances but is still under the roof of a company. Just curious: do you foresee the companies doing this more formally, or would you say we need to also create a distance between what the
9:12 pm
companies decide and what type of oversight is applied? >> By participation I don't just mean fieldwork about what features people are using or what they want, but trying to understand how people are harmed by this technology, where the trade-offs are, which populations are being served, and who is not. Because one of the things that has also happened, in terms of who is in the tech sector, and that affects who is building the products and for whom they are building them, is that some people get excluded from that whole effort. If we take a broad look at how we bring people into the field, whether or not they are technologists, to talk about the harms, that is a place where we can take better
9:13 pm
action and get better technological outcomes. I don't see a separation in the sense of "the government is over here"; there is a give and take with the technology sector to come up with solutions that work for everyone. Sometimes what companies say is that it's about self-regulation: "leave it to us, because we come up with good answers." I think that's an interesting viewpoint. That's great: if you come up with a set of policies for self-regulation, why wouldn't you want them codified as actual regulation? That says "this is something that makes sense" and levels the playing field across the entire industry, so that if companies change and have different ideas about what they want to do, we
9:14 pm
still have some of those rules for everyone. >> Part of this is also the capacity to engage in these issues in the way that we are talking about today, at the intersection of technology and normative values, and creating space for people in tech companies to be having these kinds of
9:15 pm
conversations. Employees can feel disempowered in their own company, even one they were a part of building. In fact, we often hear what happens when you think about selling a product to a new market and its unintended consequences, or when you have to make a decision about who should benefit from a particular algorithm and you need to apply some fairness criterion: who do you call? The answer is, we call the office of the general counsel, because ethics is a compliance function in most companies right now. It's about what is allowed by the law, and once we know what is allowed, we understand where we have the freedom to run and the freedom to move. I think that is no longer acceptable. We can't have ethics as a compliance function, because of the limits of what the legal system is
9:16 pm
going to be able to provide in terms of guardrails and guidance. It is an argument that, alongside the change that needs to happen in regulation, change needs to happen in the field of computer science, in terms of cultivating an ethic of responsibility, and in the conduct of business. I think we are already seeing a preview of what this pressure looks like: employees saying "I'm not going to dedicate my time and energy to this," or "I am concerned about the way in which this algorithm harms some populations," or "I'm deeply concerned about the impact on the health of our democracy of the choices that we are making with respect to political ads or content moderation." That push and pull, that give and take, is the moment we are
9:17 pm
at. We should not be of the view that regulation alone is somehow going to solve these problems. Ultimately, the big argument of the book is that there's agency for everyone. If you are a technologist, there's agency for you to ask questions and challenge your colleagues around the potential harms. There's agency for you if you lead a company, to think about what the bottom-line metrics are that you're optimizing for. >> That is a great preview for the question that I wanted to ask: who is an expert in ethics? We often hear, as you also referenced, the notion of ethics being a solution: appointing an ethics officer or counsel, adopting principles, and so on and so
9:18 pm
forth. i wanted to ask what you see as the merits and the pitfalls of how ethics is being invoked and used in the whole discussion around the problems that big tech presents to us. >> first, two things ethics is not, or should not be, and how in the book and in the classes that we teach we try to frame that. we don't want ethics to be something where you ask the general counsel and it's just compliance. in a similar spirit, you don't want the company to appoint an ethics officer so that you can abstain from any personal responsibility of your own in the company, as long as you pass an issue up the chain to the ethics officer and that person can decide for everybody. we need a collective orientation
9:19 pm
as jeremy said, where if you push ethics all the way down into the design systems of companies and product development, it ensures everyone has the permission to raise ethical questions and has a framework for thinking about the kinds of concerns that get brought up. also, on what ethics is not: it's tempting to think that it amounts to a kind of personal moral compass, where it all comes down to whether or not we have virtuous people who acquire power, and then we can allow their virtue to shine through and we will be okay. in my view that is of course essential, but not very interesting. consider the dropout founder of the company exposed by the media
9:20 pm
to have had a fraud at its heart, with technology that was supposed to work but never did. if it teaches us anything, it's that you already know the idea that you shouldn't lie, cheat or steal when you are in a position of power. ethically, it is uninteresting: there is no argument on behalf of lying, cheating or stealing, and you learn that when you are a 5-year-old in kindergarten. in ethics classes at the university we don't focus on personal ethics either. just as democracy is designed as a system that doesn't assume people are saints, so too should we think of a system that doesn't depend, shouldn't have to depend, on people being virtuous all the time. so
9:21 pm
what then do we need? it goes back to professional ethics and a set of frameworks for thinking about social and political ethics and the value trade-offs, values that are refereed in the political system and in professional norms. think about how ethics has evolved in other fields, in particular biomedicine, healthcare or biomedical research, in response to scandals, whether it's world war ii and the experiments in the concentration camps, or, in the united states, the tuskegee experiments that exposed a pattern of unethical medical experimentation. we've developed
9:22 pm
an institutional footprint for ethics that goes from the classes you encounter as a student in medical school, to the ethics committees in hospitals, to what is called an institutional review board: if you want to do experimentation, you have to pass through an ethics review. the same kind of oversight governs the development of new drugs brought to market. there is no equivalent of that kind of footprint within computer science or ai. we are in the early phase of the maturation of the field, and in the book we point the way forward on how to accelerate that, so that computer scientists can begin to develop a much stronger professional ethic and as widely developed an institutional footprint.
9:23 pm
the last question that came in before we move on: does it make sense for all programs to include a mandatory ethics course? >> it sounds lovely, but that's not enough. i got my ethics injection as a course and now i'm good to go? it has to be something embedded in the entire curriculum rather than a single class. >> thanks. the first question came in pretty much before anybody said a word, which is: should they be broken up?
9:24 pm
>> if we were to break up facebook, would that solve the misinformation problem on the platform? probably not. it would probably exacerbate it, because now you would have smaller companies dealing with the same problem. the solution isn't that we are going to allow some platforms to have less misinformation and some to have more; we want an ecosystem with a lot less misinformation overall. one of the places antitrust plays a role is greater scrutiny, so when facebook wants to acquire instagram or whatsapp, regulators ask: is this actually anticompetitive, and is there a way we can think about limiting the concentration of power in particular companies so there are more consumer
9:25 pm
choices. antitrust is kind of a classic framework, but all these products are free, so it's about thinking of the harms that we see happening, and those may not be about pricing; it might be what happens in terms of concentration of power in a company. the thing that came out of the microsoft case was that over time they were more cautious around future acquisitions, around what it meant in terms of concentration of power and competing with other companies, because they had seen the action taken before, even though it didn't result in a breakup of the company. it would give greater competition and less of the
9:26 pm
concentration of power. we might see companies taking steps, although this one is more debatable, toward products that are better in line with what consumers actually want. >> reminding us that microsoft is one of the greatest companies in the world: did the case have any significant effect, and if not, what went wrong? >> if you look at the time frame from when it happened to the current day, i believe you could see the emergence of others. why could google become such a powerhouse, and why did apple take off? because i would imagine there was a lot more scrutiny inside microsoft internally in terms of being more
9:27 pm
aggressive towards competitors, so that's what we saw in the aggressiveness of the marketplace regardless of whether the company was broken up. i don't think the breakup itself was the key; it was the change in attitude that happened as a result. >> this one is for jeremy. why are republican and democratic administrations not stopping big firms from acquiring small companies that could become competitors? and i want to join that with another: any thoughts on how the differences in a democracy modify your perspective on policies? so basically, how should we interpret the differences between the parties when we look at the system and try to understand why we are where we are?
9:28 pm
>> think of the long-running race between disruption and democracy and the moment we are in. there is a kind of deeply ahistorical aspect to silicon valley, so let's look back 150 years to think about the way in which technological innovations over time transformed the economy and society, like the communications infrastructure in the 1890s and the 1910s. when you look at it in this historical perspective, you understand that in a system that combines a kind of free-market orientation with democratic institutions that provide oversight and constraint, you get extraordinary innovation in the private sector, often with government support and investment to be clear, and then all these benefits appear and you begin to get some
9:29 pm
awareness of harms, and often that has to do with market power, but sometimes with other kinds of externalities that unfold. you see the ability to communicate across space controlled by a single company play out, and then the ability to use that infrastructure to maintain that control; they exercise power and influence over the political system, so we have democracy struggling. then we get these policy moments or windows when the government tries to solve all the problems of market concentration and the harmful effects of technology. the problem is that you are then fixed into some new regulatory framework that cannot adjust to changes that are five or seven years down the line in the
9:30 pm
future. why are republicans and democrats not taking action? at the beginning of this policy window, it began with democrats in the clinton and al gore administration and followed through successive administrations into the obama administration, but basically the view was that big tech was harmless. it was changing things in ways that benefited everyone, so people wanted to give a bear hug to the technology companies and to think about the potential of technology, the economic growth potential of technology. as for the regulatory momentum that was building in europe, when you were in office you thought they are only doing that because they don't have any big tech companies of their own.
9:31 pm
now you can find people on the democratic and republican side who want to put a big bull's-eye on big tech. there is momentum on both sides of the aisle, although they diagnose the problem in different ways, and so we find ourselves in a policy window where we see bipartisan momentum. where can we come to agreement on how to solve it? that is what we spoke to earlier in terms of the low-hanging fruit. that doesn't mean the democrats aren't going to continue to bang the drum on antitrust, or that others won't continue to raise concerns about censorship on the platforms; there will continue to be a partisan element. the challenge is to take advantage of the policy window that we are in to address the harms that are visible today, and to think about how we put ourselves in a position as a society where we don't only have moments of regulation that happen every 30 years when the harm is evident, but how to
9:32 pm
reengineer the system to be much more flexible and adaptable. the implication is that as a citizen in a democracy you can throw out your elected officials if you don't like what they are doing. with the effects of technology on society, you can be angry at the leadership of google or amazon or apple or microsoft and facebook, but ultimately the failure to mitigate the harms of technology is a failure of the political system and elected officials, so there is a need, in the context of the republic, to develop a view on these issues and to show leadership in mitigating these
9:33 pm
harms. >> great inspirational thoughts: everybody can and should take that responsibility. a question: for over 30 years, governments have failed to take appropriate action on climate change, and now, with our backs against the wall, the response is still insufficient. why do you think governments will be able to take appropriate action on ai and big tech? and another that is related: the parallels between big tech and the financial markets in 2007, when the thought was that regulation would stifle innovation in financial services, then credit default swaps and securities led to the collapse.
9:34 pm
given climate, given regulations in light of big challenges, and given lessons learned from a crisis that for any of us watching played out on a global scale, why will now be different? >> two quick responses on my part. we've been living in the united states through an era of great skepticism about the capacity of the state itself to carry out grand projects; now we let the billionaires take on the project of space exploration. we once had great infrastructure plans, but the last meaningful thing we did was the highway system or something in that neighborhood. we are at the early moments of
9:35 pm
exiting from a 30-year period of skepticism that the state is capable of grand projects, and part of that project, in the united states especially, will be to rise to the challenge of tackling some of the most obvious problems that we are all aware of in the concentrated power of the big tech firms. that's a bit more of a hope than an empirically based prediction, but we don't have many other options to hang onto. the second and final observation is to say it's a design feature, not a design flaw, in democracy that it tends to be short-term: in a small number of years, as
9:36 pm
jeremy said, you reelect the people or throw them out and put someone else in. politicians don't have the easiest incentive structure. imagine paying short-term costs that a politician will have to bear the consequences for, when the gains might not be seen for ten, 20 or 30 years. part of the task is to come to terms with a longer time horizon orientation for thinking about planning and infrastructure. some of the things we talk about in the book are efforts to point to very particular proposals that give democratic institutions that capacity, and that's where we need creative and innovative minds at work, to install the type of
9:37 pm
operating system to handle these long-term challenges. >> will it sounds like you have read it and are inspired by it in the big societal challenges with a longer horizon to come together and become more innovative to solve a problem instead of doing more mundane parts of policymaking. it's that type of thought that could reinvigorate to the achievements and innovative capacity. i'm going to pick up on another audience question. while listening to this discussion the issues are still
9:38 pm
largely framed in terms of societal responsibility, in other words, be wary of possible collateral damage while building technology and services. but that's different from building for the purpose of achieving societal benefit in the first place: building technology for the greater good of society is possible and aspirational. >> right now we have a lot of technology for which we've identified harms. the first thing we can do is find ways to mitigate some of those harms. if we can get that done, what that shows is the ability to rise to the challenge.
9:39 pm
there's so much skepticism about what can be done, and that would build optimism in the longer term. there is also the notion of how we build technology for good in the first place. we see signs of that on college campuses, and we have an organization that is a healthy example; one of the things they do is look for ways in which technology can be used for societal benefit, and in some ways see what could be deployed. i think that's a great area, but i don't think we should look to it alone as the way to solve the current problems. at the same time, in the educational system and other structures that we have, people
9:40 pm
from big tech companies are helping to teach the next generation about computer science, and we need to infuse more examples of the ways you can build for good; it's part of the evolution of the education process. >> maybe i will say a few more words about it now. politicians cover a broad array of issues, and while they might have some experts that work for them, their staffs are stretched thin, especially if you think about state institutions or local institutions in comparison to
9:41 pm
the federal government. imagine trying to educate yourself enough, caring enough to understand what the effects of a particular policy or regulatory change might be, and taking a position while forging a consensus or compromise with others, all with a really limited staff at your disposal. organized corporate interests use that structure to make sure their voices are heard. we call that lobbying: the payment of expert staff who show up on capitol hill or at the executive branch, offer frameworks for legislation, and provide facts and information to help legislators make these
9:42 pm
judgments. where are our collective interests being accounted for in the system, whether on the question of having successful platforms without requiring the monetization of personal data, or on what's even possible with respect to content moderation in a world where you can't rely on human judgment alone but need ai to undertake the task? against the lobbyists paid by the tech companies, that's where the system is failing, and i want to be clear that the failure of the system is a failure by choice. it's not that the system couldn't have that expertise,
9:43 pm
it's that, to take one example on the congressional side, the office of technology assessment was literally shut down during newt gingrich's republican revolution, the contract with america, on the theory that people can get technical expertise in the market. we've made systematic decisions in the federal government over time against that expertise and knowledge, and what that means is we put our politicians in a position of primarily receiving input and advice from those that are paid to represent the interests of the companies. i don't want people to lose sight of the fundamental task that we have, which is building a democratic system that is capable of governing technology, and that means thinking about the future of the civil service, thinking about what
9:44 pm
expertise we recruit for when we elect politicians, and the creation and support of independent institutions that represent the public interest rather than a private or commercial interest. in our complex system, which as we have said before is the best technology we've come up with to deal peaceably with disagreement in a way that tries to forge consensus, we need all of those elements working. >> your central thesis is that they are making these decisions for us, but isn't the decision being made by the collective, by the free choice to adopt these things, by which i think they mean technology
9:45 pm
services. >> facebook is a global platform of more than 3 billion active users and there is exactly one person who has de facto complete control over the information environment. all of us either choose to use or not use facebook so it doesn't also come down to the choices of the 3 billion that have signed on to the platform. i see the attraction and the idea. first f of all, our contention n the book is 3 billion people and one person in control is too imuch power for one person and when you reach greatest scale as the companiess have, the claims
9:46 pm
of private governance that we started with rise to the fore and call out for scrutiny and criticism. as for the idea that it comes down to us as individual users: of course we as individuals can decide to delete facebook, and that goes back to the start of the conversation. it felt like there was modest dissatisfaction there with the idea that, on getting in the car and going somewhere, you could simply decide not to drive, even if you had other options like public transportation or rideshare services. the idea that we should basically tell people you can take the cars on the road as they are or just not drive, and for
9:47 pm
what it's worth, facebook would love it if all of the action in terms of doing something rested on every single person deciding what to do. what we really need is collective pressure, not just from individuals but from democratic institutions and civil society organizations, and that is the path forward, not describing these problems as the choices of consumers to use or not to use. >> on personalization over relevance: how would we voice such concerns in the future, before the harms scale?
9:48 pm
>> part of the issue in thinking forward is understanding what can potentially happen when we let things loose and information starts to spread. people were beginning to call out that fact before the election happened, and were met with skepticism. i think it's not that we would expect companies to figure out every negative impact, but to be wary about what the impacts are and do some evaluations of what they are seeing in the system. if you look at a lot of the studies on
9:49 pm
the platforms, whether there is bias in the ads or some sort of political bias in the kinds of information shown, it's done by academics and journalists. why aren't those studies being done by the companies themselves? they have better access to the data, so they should start to do that themselves to understand what's going on, and we should have a function that says these are the things you need to monitor to understand what's going on. those practices can be put in place. the notion of the free-for-all, those
9:50 pm
are lessons we have learned, and we should put those lessons into practice for more democratic institutions. i think that's the place we get to: learn from the past, don't ignore it. there will be places companies don't get it right, and they need to take a view of being self-critical as opposed to self-aggrandizing about what has been accomplished. it's not always changing the world for the better; it's being more cognizant of the fact that you are creating problems as well and opening up to that. >> thank you for answering all the questions and for jumping back and forth. the computer history museum has an
9:51 pm
initiative, and the idea is that you each write down one word of advice, and be decisive in the word, that you would give a young person starting out in their career. once you share this word, please also share briefly why it is telling advice. i will start with you. >> the unexamined life is not worth living, and the lesser-known parallel: the unlived life is not worth examining. inquire and examine within, and that is a pathway not just at the beginning.
9:52 pm
>> wonderful. jeremy? >> my word, you said we need a verb, is go deep. as a professor and someone that served at a senior level of government, i see them everywhere: people that have invested in checking every box and present themselves as capable of learning anything quickly and being rapidly deployed. the people i always hire, and who i wanted to be as a young person, are the people that have gone deep, and the reason is that if you want to solve problems in the world you have to get beyond the top layer of understanding. you have to go deep on the problem and exercise your muscles in terms of thinking about the problem. you have to
9:53 pm
care as much about the idea as you do about the implementation. the advice is to pick something that you care about, look at it from every angle, and try to move the needle on the outcomes before you move on to the next thing, seeing things through from start to finish. >> the word i use is kindness, and the reason why is twofold. for someone young and starting out in their career, i would tell them a little bit of kindness goes a long way; the dividends it will pay you are tangible. at the beginning of the pandemic it was interesting
9:54 pm
to see people reach out to others, maybe people who were not in a situation as fortunate as their own, and that's something we can take away from the pandemic and the stress it has created on relationships: to think about a little more kindness for other people and for ourselves. it doesn't cost you anything. >> that is a great note to end on. i'm sorry we couldn't get to all of the questions. thank you for your time and
9:55 pm
energy and this wonderful piece. it's clear the alignment and belief that life as we know it doesn't exist without computing. what you've done is look at a very complicated set of issues. any technology, anything that can be deployed at scale, can be weaponized and will be. the automobile analogy was particularly interesting for me as we think a little bit about history and its implications, from the corporate veil to liability in automobile accidents. the automobile is associated with life as we
9:56 pm
know it, with the regulatory and social codes and values we discussed, but equally important, with the software code in the algorithms that are driving things in the future. i do think many of the points that were made, in particular about the problems we are trying to solve, the types of achievable outcomes, and the workforce development that is about education, are important, so i encourage all of you to read the book, share it, and do your best to help facilitate. i appreciate all of the authors' work and this wonderfully moderated program. i encourage you to contribute again.
9:57 pm
thanks everyone and best wishes and be safe.