
Rob Reich, Mehran Sahami and Jeremy Weinstein, "System Error" (C-SPAN, November 23, 2021, 3:04pm-4:37pm EST)

3:04 pm
>> What happens when a philosopher, a computer scientist and a political scientist, all with experience in tech firms and government, come together to examine how our lives, our politics and even our values are affected by technologies in ways we are just beginning to understand? Together they teach a highly popular course at Stanford University and have now written a new book, "System Error: Where Big Tech Went Wrong and How We Can Reboot."
3:05 pm
In the program today, they will explore issues around the rise of decision-making by algorithms, the amount of user data collected by big tech, increasing automation and the proliferation of disinformation online. They will also talk about potential answers, from their view, to important questions such as how we can exercise agency, reinvigorate democracy and direct the digital revolution in ways that best serve collective societal interests. We are thrilled to welcome back these Stanford experts, and I am thrilled to introduce our speakers. First, Rob Reich, professor of political science, director of the Center for Ethics in Society and associate director of the Institute for Human-Centered Artificial Intelligence. 96,407: the number of words in the new book.
3:06 pm
10.24%: the probability of a last name starting with R being listed first on the cover. Two: the pandemics his grandmother lived through. His first real job was at a local jewelry store. Last but not least, 13: his lucky number. Welcome, Rob. Next, Mehran Sahami, professor in the School of Engineering and professor and associate chair for education in the computer science department at Stanford University, and a former full-time senior research scientist. He emigrated to the U.S. at around age five. His first programming job paid him well under what his work now commands. 2.2 million: views on YouTube of his Programming Methodology course. 2,000: volunteers who taught 22,500
3:07 pm
students during the pandemic. Eight: the total number of COVID vaccine doses taken by the four members of his household. Glad to have you here. Next, Jeremy Weinstein, professor of political science, director of Stanford Global Studies and senior fellow at the Freeman Spogli Institute for International Studies and the Stanford Institute for Economic Policy Research. One: the number of social media platforms he spends time on. 1984: the year he moved to California. More than 250: the additional students he teaches together with colleagues across the computer science and political science departments. Forty-four: President Barack Obama, the 44th president, for whom he served on the National Security Council and as chief of staff and deputy to the U.S. ambassador to the United Nations. Fifty-one: the number of San Francisco Giants wins halfway through the 2021 season. Go Giants. Welcome. Last but not least, leading this
3:08 pm
conversation, the international policy director at the Cyber Policy Center and international policy fellow at the Institute for Human-Centered Artificial Intelligence at Stanford. Ten: her years as a representative in the European Parliament. Two: her years at Stanford, including one and a half remotely. One: the first book she is writing. 44,300: the people who follow her on Twitter. And 020: the phone code used to refer to her home city. Glad to have all four of you here today. We're looking forward to the conversation. >> Thank you, and a warm welcome to everybody joining. It is a celebration today, and I want to congratulate you on the official launch of the book, "System Error: Where Big Tech Went Wrong and How We Can Reboot." I can't stress enough how important it is to hear this
3:09 pm
diagnosis, this well-researched argument, from Americans. I say that from Europe, but of course the world is looking at what the United States is doing, both in terms of the strengths coming out of Silicon Valley and the harms and ripple effects around the world. It is important to hear from professors at the university that has produced so many big tech ideas, leaders, business models, technologies and disruptions. I've had the pleasure of seeing and hearing the analysis you can now read, and of watching it evolve over the past year and a half or so. I'm going to be comfortable calling you by your first names; we are colleagues, even if the travel ban forces me to be at a distance. When we were at Stanford, I had the pleasure of speaking in the popular class just mentioned, which you teach together on ethics,
3:10 pm
public policy and technological change. As some of you may know, public policy is especially close to my heart and high on my priority list, so I hope we can learn more today in this conversation about how private governance may impede public values and democracy as such, and of course how we can repair the broken machinery. To start us off, I'd like to invite each of you to briefly share what you see as the main system error. I'd like to start with Jeremy, then hear from Rob, and have Mehran close out this question. Over to Jeremy. >> Thank you, and thank you to the Computer History Museum for hosting us for this book launch event. For me, the main system error is that technology is currently designed around
3:11 pm
a particular set of values, and those values are often in tension with other values I care about. The challenge is that as technology is being designed, the people making choices about which values to optimize for are the technologists designing the technology and the companies that provide the environment in which it is created. But where is everyone else in this conversation? The system failure appears when there is a tension between values: between what the aggregation of data makes possible, all sorts of incredible services and tools, and the cost at which it comes; or when algorithms are used to introduce extraordinary efficiency into decision-making but raise questions about fairness, bias and due process. When we have these trade-offs, where is the collective? The system error is that our politics
3:12 pm
are not playing a central role in these value trade-offs. We've left the decisions to a small number of individuals who work in tech companies or run tech companies, and the decisions they make have implications for all of us: not only the individual benefits we realize from the technology, but also the social effects and social harms we can no longer avoid. >> Let me hop in there. I'll put it in a personal way. I don't have a policy background or a technical background; my initial view of this was as an ordinary user of what came out of Silicon Valley. From my view, the issue that is so important is private governance. The big tech companies located here in the Valley, and partially perhaps in Seattle, have acquired something akin to governing power over our lives
3:13 pm
as individuals and as citizens in a democratic society. The programmers, with the countercultural attitude of the early age of Silicon Valley, have become the Goliaths. The pandemic has reinforced this. For me at least, we were already aware of big tech 18 months ago, but now our lives have become even more enmeshed in the decisions made by a small number of people inside a small number of tech companies. Our work lives, private lives and educational lives depend on a small number of platforms in ways they didn't even 18 months ago. The power vested in Silicon Valley amounts to private governance, and it's not serving us as citizens well at all.
3:14 pm
>> I come at it from the view of how we think about the systems being built. Oftentimes technology systems are built with particular metrics in mind, a mindset that what matters is quantifiable, and the notion that we want to scale. Part of the issue is that if we pick things that are easily quantifiable, it doesn't mean they match the things we actually want. As you get to scale, sometimes the gap between what you are measuring and the things you actually want grows more and more. You may want people to get pleasure from using something, but the way you measure it is not really whether or not they benefit from it; it's just a proxy measure. As you take that to scale, there's a greater difference between the things we want and the things that are the outcome. You have people who build
3:15 pm
systems and think about what we measure, but it's not just about our systems. Looking at the broader picture, we need to understand more about the societal impacts when we build machine learning models, the impact they have on people rather than just the rates we can measure, and those are the things we need to bring into the discussion as the technology is deployed further. >> It's the mix of angles you each take, and the combination of them, that makes the book so special. Early on you portray a student who developed a disruptive start-up, and you see his attitude as representing a trend in Silicon Valley. I'm going to quote a small piece: you live in a world where it is normal not to think twice about how technology companies create harmful effects.
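Sahami's point here, that an easily measured proxy can diverge from the thing we actually care about as a system scales, can be sketched in a few lines of Python. This is a toy illustration only; the variant names and numbers are hypothetical and not from the book.

```python
# Toy sketch: a product team picks the variant that maximizes a
# measurable proxy (engagement), while the quantity it actually cares
# about (benefit to the user) never drives the choice.

def proxy_metric(variant):
    """Engagement per user -- easy to measure, so it gets optimized."""
    return variant["engagement"]

def true_value(variant):
    """Benefit to the user -- hard to measure, so it is never optimized."""
    return variant["benefit"]

# Hypothetical variants: the more addictive design raises engagement
# even though the benefit to users falls.
variants = [
    {"name": "useful",    "engagement": 1.0, "benefit": 0.9},
    {"name": "addictive", "engagement": 3.0, "benefit": 0.4},
]

chosen = max(variants, key=proxy_metric)  # what the optimizer ships
best = max(variants, key=true_value)      # what users would prefer

print(chosen["name"])  # -> addictive
print(best["name"])    # -> useful
```

The gap between `chosen` and `best` is the divergence Sahami describes: nothing in the optimization loop ever looks at `benefit`, so at scale the shipped system drifts toward what is measurable rather than what is wanted.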
3:16 pm
I want to ask a more critical question. Each of you has a long history of working with students: to what extent has Stanford University actively, and successfully, created a culture in which disruption is admired, giving students access to the companies in Silicon Valley but also inviting CEOs and other executives in to teach? I would love to hear a reflection from each of you on how Stanford plays a role in that disruptive ecosystem, starting with Rob and concluding with Jeremy. >> Absolutely. Stanford bears a significant responsibility for the problems we see in big tech and for a cultural orientation that aspires toward disruption: to look for
3:17 pm
ways to code in your pajamas, roll your project out quickly and see what happens; throw spaghetti at the wall and figure out any problems downstream. We've seen this personally in teaching students, because over the past five years there has been a kind of professionalization that even young students come to Stanford with, engaging with the venture capitalists who populate campus and finding ways to leverage their ideas into start-ups. In my office hours I had a first-quarter 18-year-old kid who had taken my political philosophy class. I was making small talk with him: what do you think your major will be? He said, definitely a computer science major, and I said, that's interesting, great. He said, I have start-up ideas
3:18 pm
already. Making small talk, I continued on: what is your idea? He looked me earnestly in the eye and said, to tell you, I'd have to ask you to sign a nondisclosure agreement. That kind of attitude reflects the celebration Stanford has had for a long time of the disruptive economy. I'll wrap up by saying that, from my view, this gives Stanford a special responsibility to respond to the things it participated in creating: not only the products developed and brought to scale as companies, but this cultural orientation present here. Rather than just heroes, typically the white guys who founded companies and whose names everyone knows, we need an
3:19 pm
alternative. The passage from the beginning of the book ends with a short profile of another person, who imagines himself as a civic technologist. We need more people like that to be aspirational figures for young students at Stanford. >> I'm sure there are people in the audience who remember that 20 years ago there was a drop in the number of computer science majors. There were worries as to whether or not computer science would continue to grow, because of that decline in students, so there were efforts to try to increase the number of people going into technology, and I am happy to say the efforts
3:20 pm
succeeded: the number of computer science majors has grown about 350% in the last 15 years, making it the largest major on campus for both men and women, chosen by one in five students. Part of that transformation was talking about world-changing technology, about how to make an impact on big problems through computing. I think that is still true, but what we have also opened up to more clearly in the last five or six years is asking: what are the real social impacts of technologies at scale? What do they mean for values we care about, like privacy and personal autonomy? Because Stanford is an educational institution, we can integrate that into the educational process for our students, so we can balance the power of computing as a platform with the social responsibility that comes with
3:21 pm
the things we do when we build out large platforms. That is a place where Stanford clearly has a responsibility: having developed the technologists who built these systems, we are now seeing the consequences, but at the same time we can chart a better path forward, teaching both the next generation and, in the work we do with people in industry, encouraging them to think more about the social consequences and how to mitigate them. >> In some ways Stanford is a microcosm of what's happening more broadly in technology. It has played a role in the ecosystem that is Silicon Valley, and this period of tremendous optimism carried the institution to extraordinary heights; we've seen that reflected in every aspect. But that wave of optimism is confronting a moment of backlash, and we face that dynamic
3:22 pm
on campus: a lack of trust in the tech companies, and majors asking whether they want to be associated with these harms. It puts us in a position as educators where we have to think and dig deep with our students around the questions of how we amplify the benefits of technology while mitigating the harms. So there is a nuanced, pragmatic conversation to be had, one that is not blind optimism but recognizes that technology itself involves trade-offs and generates both benefits and harms, and the challenge collectively is to think about how to weigh those things. But what I would say on this is that while we focus on Stanford and think about its potential role, or talk about tech companies and focus
3:23 pm
on corporate leaders, the big-picture argument of the book is that individualizing these challenges misses the bigger challenge we face. The system error we are describing is a function of an optimization mindset embedded in computer science and technology, one that basically ignores the values that need to be refereed as projects are designed. It is embedded in the structure of the venture capital industry driving the growth of Silicon Valley, which favors companies that prioritize scale before we even understand anything about the impacts of the technology on society. And it reflects the path paved to market dominance by a government in retreat from exercising oversight. So Stanford has an important role to play, and there are
3:24 pm
individuals who, looking inside themselves, may recognize they didn't pay attention to the consequences in that moment of tremendous optimism. But the bottom line is that there are systemic challenges that need to be addressed; Stanford needs to be at the center of addressing those challenges, but Stanford can't do it alone. >> Thanks, that makes a lot of sense, and I recognize, even having been at Stanford only about two years, the change in the questions people are asking: the awareness after the January 6 storming of the U.S. Capitol, and after disinformation around COVID, that these problems are not just in other communities, they can truly hit home. The harms are real, not virtual. To that I would add that we see how hard it is for students to find jobs outside of the big tech companies that offer great salaries when people have real
3:25 pm
student loans to deal with, so we could maybe touch on that a little later. For now, I want to focus on the big question of governance, because I see governance as one of the main themes of the book. On the one hand you have many anecdotes and stories of how tech companies evolved and how technology itself sets standards, with venture capital dollars pushing in a particular direction; on the other, you are quite critical about the responsibility on the part of political and democratic leaders. You write, and I quote: the virtues of democracy are difficult to mesh with harsh realities that appear again and again when it comes to governing technology; the technical ignorance of politicians who are hardly credible in providing oversight or contemplating regulation;
3:26 pm
and profound disagreements about what we value and how trade-offs should be made. On the other hand, your book is a call to action, and I think also one that argues laws should be updated and regulations changed, so I think you hope to reach lawmakers in D.C. and perhaps overseas so they can jump to action. My question is: what do you think we can expect from democratically elected officials in the United States? How can there be a rebalancing, and is it feasible to have a rebalancing of technology knowledge among politicians but also democracy knowledge among engineers? I would like each of you to focus on one priority, one thing you would change if you had a magic wand when it comes to policy. I would like to start with Rob and close
3:27 pm
with Jeremy. >> That's a great question. If I had a magic wand, the thing I would focus on is moving the conversation about agency away from being all about individual choice and toward what is collectively possible. What I mean is this: oftentimes we think about, say, people and cars and driving safely. If you are worried
3:28 pm
about driving safely and therefore don't drive, that would be a ridiculous choice, because there is value in driving. So what do we do? We create an infrastructure: traffic lanes, speed bumps, guardrails for safety. Within that, you can choose to drive or not, and how to drive safely, but we have a system that makes it safer for everyone. That is going from individual choice to thinking about how we build a larger system that provides those guardrails. >> I am tempted to say something specific about what is on the possible horizon in D.C. or elsewhere with respect to privacy, but instead I'll take the philosophical route and wave my magic wand to
3:29 pm
try to dislodge this optimization mindset: when that mindset becomes an outlook on life as a whole, when our colleagues sometimes tell students that everything in life is an optimization problem, and when you look at institutions as failing to optimize the things we care about and complain about the inefficiencies and lack of intelligence of elected leaders and democratic institutions. Democracy is not, in my view, an institutional design meant in the first place to optimize anything; it's meant as a way of refereeing the conflicting preferences of citizens in a way that is adaptable to new challenges and a new day. So when I use my magic wand, it's to keep the optimization mindset in its place as a technical
3:30 pm
orientation, but not an outlook on life. What's ultimately at stake is whether or not the big tech companies and the technologists who work there care about the integrity of our democracy and about maintaining institutions here and abroad. ... >> It's hard not to look at the tremendous paralysis and polarization in the United States and feel like absolutely nothing is possible. But one of the places where we start in the book is by trying to focus people's attention on the alternative to reenergizing our democracy. The alternative to renewing our
3:31 pm
democracy is leaving the decisions about how technology unfolds and how it impacts society to the technologists and to the companies building it, and I think we have a pretty good preview, from the last decade and a half or two decades, of what the consequences are of leaving the decisions, in this case, to the experts of the techno-crowd. So when you hear from corporate leaders or investors in Silicon Valley an anti-regulation push, a view that regulation is going to slow down innovation and get in the way of some sort of progress we need in our economy, we want you to understand that this push is essentially a rejection of the role of our political institutions, which are the mechanisms we selected as a society to help us referee the critical trade-offs that exist. Now, it is fair to say that democracy is not up
3:32 pm
to the task, not only because of the lack of technical knowledge that you pointed to, but because of institutional features that make progress very difficult to achieve, so that everyone's reasonable expectation is that no legislation is likely to come out of Congress; that is the prediction you can make about the system at the current moment. Washington is now having a debate about the future of antitrust and about content moderation and the Communications Act. But one of the places where we start, and this was the excerpt published in The Atlantic a couple of days ago, is asking: can we find a set of areas for legislative action where democracy can do what it is best at, which is achieving consensus around the most evident harms that need to be avoided in the near term, because there is agreement among both parties and their constituents about
3:33 pm
the changes that are required. We identify three areas where we think there is agreement currently. One is around data privacy and the asymmetry of power that exists between the companies that access our personal data and the individuals who have to navigate an outdated regulatory model. The second is the use of algorithms in decision-making settings, whether in credit, medical services or our criminal justice system: the lack of transparency, due process and fairness, considerations that should concern all of us as these systems are rolled out. The third is the effects of AI on the workforce. In the near term the effects will mostly fall on lower-wage workers, and we need to think about our social safety net and how it evolves with respect to job
3:34 pm
displacement; in reality, over the medium term, the transition to an AI-powered economy will affect everyone's jobs. You can say this to the computer programmers in our labs building the AI tools: they too ought to be rethinking their jobs in the future. We need an orientation from the federal government in this space that really asks: how do we prepare people for the workforce of the future? That is not to say we don't think antitrust is absolutely essential, or that content moderation isn't essential; we dig in on those issues in the book. But if you're asking me what is achievable in our current political moment, I think we need to avoid the temptation to boil the ocean and try to solve every problem with some sort of sweeping piece of legislation on the future of tech, and instead go at those areas immediately to show that progress is possible in a democracy. >> Thank you. As someone with a couple of scars from
3:35 pm
having served in public office, I sometimes feel the need to defend elected officials, but I completely recognize the paralysis in D.C., and I sometimes wonder if that is because the regulation of Western tech applications came too late, and whether that has exacerbated the entrenchment we currently see, certainly in the United States. I think the momentum you describe as building is also more universally true. Legislatures usually respond to changing realities, and when you look at any representative body, whether a city council or the U.S. Congress, tech expertise is scarce: the executives are often the first experts pointed to, and pointed to as the problem. Whereas, on the other hand, if you ask a set of farmers how
3:36 pm
up to date their knowledge of agriculture is, they will nod at you. We have to find ways to bring in the variety of expertise that is often missing from our representative bodies in a democracy, rather than expecting that all technologists will run for office anytime soon, although that would help solve some of the problems. Anyway, sorry for the digression. I want to go on to ask each of you a specific question, and then we have quite a few questions from the audience pouring in. So while you are pondering, feel free to drop your questions in the Q&A function; I'm collecting them as we go. The first question builds on what you just said, Jeremy,
3:37 pm
because you have seen, from your past job representing the U.S. administration, the Obama administration, at the United Nations, how technology and big tech are affecting not just mature market democracies but also emerging economies and the world's most vulnerable. What would you like to see from American leadership on the global stage when it comes to dealing with these topics, and can you speak to why what happens in the United States matters globally? >> Thank you so much for the question, and let me first pick up your previous point. The job of elected officials is to navigate a complex array of issues on behalf of the constituencies that sent them to office, so we need to think about
3:38 pm
how we design our political institutions to ensure that expertise is available to them, and not only to a lobbyist who has a particular point of view. I think that is where we are falling short in our democratic institutions: we have a set of elected representatives and senior officials without the in-house expertise to speak to issues of the public interest rather than the corporate interest, so they are dependent on the technical expertise that comes from highly paid lobbyists. That is not unique to the United States, but it may be especially pronounced in the United States, and we need to address it. I really appreciate the question, which challenges us to put this in a global perspective, and maybe the first and most important thing to say is that the harms technology is causing, whether intended or unintended, are not just felt by citizens in the United States; some are felt all
3:39 pm
around the world, because the U.S. happens to have given birth to powerful tech companies whose choices are shaping the lives of billions of people around the world. So the absence of leadership in thinking about how to amplify the benefits and mitigate the harms has implications not just for Americans. We talk about disinformation and misinformation with respect to American elections, but of course this has played out in other electoral environments too; think about the use of encrypted messaging by vigilante groups, or the promotion of hate speech and incitement on social media platforms in other countries in the world. And of course the impact of AI on jobs and the future of work is not a uniquely American challenge either. This is one of the reasons the United Nations exists:
3:40 pm
there was a view after World War II that we needed to create a forum in the world where everyone could be heard and have a voice. Maybe the gap between the reality of the current moment and the future some of us might want to envision is the fact that most people have no say whatsoever in the way that technology is affecting their lives. If you are living in South Asia or South Africa or Latin America, places where the tech companies are not headquartered, governments are at the mercy of the providers of these products. They paved the way for these companies to operate, yet you as a citizen, and even your elected officials, do not have a voice. I'm not suggesting that some global body is going to govern technology; governing technology in this case begins with the actor at
3:41 pm
the table where these companies are headquartered: the United States. One of the most important steps that needs to happen is to stop seeing this primarily through the lens of competition with China, a frame that is often brought to the table: don't touch the companies, or we will lose the race for innovation with China. The harms are real, and ultimately we have a set of democratic governments that share values and care about things like justice and fairness, about privacy and autonomy. These governments need to be working together rather than in opposition to one another. That is ultimately the challenge the Biden administration confronts, but we should not pretend it is a simple problem to solve. The distance between the European Union and the U.S. on tech regulation right now may
3:42 pm
be as wide as the distance with non-democratic regimes around the world, in that we have not reached consensus around what the problem is and how to regulate it. We need to move forward with transparent conversations about the values at stake, we need to recognize that the trade-offs to be made legislatively and by the executive branch need to be coordinated with others, and we need to create space to understand the harms that are felt not only by Americans but by people all around the world. >> Absolutely, and well said. I keep hearing a popular phrase more and more: human-centered. When we talk about technology, it is often used as a kind of reassurance about how AI and other technologies can actually be kept in check and
3:43 pm
humans should have the last word as a safeguard or oversight. What are your thoughts about human-centered AI and the role of people in these increasingly automated processes, and how do we avoid this being just PR, a catchy slogan that loses meaning? >> I think that is a real worry. One of the important things to keep in mind is that human-centered AI is sometimes interpreted as just having a human in the loop for decision-making. I think the more fundamental framing is thinking about the process by which people participate in deciding what AI is, and understanding what its actual impact is and how the downstream problems it may generate need to be mitigated. It gets back to the question of who controls the
3:44 pm
technology: is it in the hands of a few people, or is there an opportunity for broader participation? I think that is the real heart of thinking about this in a human-centered model. An error rate is not just an error rate; human lives are being affected. When you make a mistake about whether or not someone remains in jail in the criminal justice system, that is not just an error rate, that is a life disrupted. Unless there is a real understanding of that, we are not being human-centered, we are being algorithmically centered, because we are thinking about what the algorithms and the technology can do without the real impact on people. There is a good example here. One of the things we try to do in the book is take a balanced approach, not just big tech bad or big tech good; the same company can show you places where the technology
3:45 pm
is being developed in a human-centric way and places where it is not. To give you one example: a large company with a high-volume hiring process had built a system for screening resumes. By building a model of the people it had hired before, it could screen incoming resumes: if a person had a high likelihood of being hired, they should probably be interviewed. What happened was the model had a gender bias: it would actually downgrade resumes that included the word "women" or the names of all-women's colleges. This is one of the most technologically capable companies on the planet. They looked at this, they tried to mitigate the gender bias in the model, and ultimately they concluded they could not do it, so they scrapped the model. That is an example of human-centered AI: they understood what the impacts were before they
3:46 pm
deployed it at scale, where a bunch of gender-biased decisions would actually have been made, and they ultimately decided that if you cannot fix it, you should not deploy it, because those error rates are real, they affect real people's lives. So there are examples like this, and these are the kinds of things we want to hold up so the tech sector has something to work from on these issues. But we need to be broader than that, which gets back to the point that we should not just rely on someone inside a company doing this. Think about having actual guidelines around how models should be evaluated and what kinds of things we should generally measure. That would not only provide a mechanism for policymakers and collective decision-making to affect what is happening in technology; it would also help educate technologists about what is important in human-centered technology. For the next generation coming along to build these pieces of technology,
3:47 pm
it would be great for them to know: here are the things you need to watch out for, and here are examples of where things have gone wrong in the past and how people actually mitigated them. I think the learning is actually bidirectional. >> Thank you. When I hear you talk about including the people who will be affected, about thinking through how the technology will work in their communities or in society, or about questions around testing the impact first, I wonder if you foresee companies onboarding, bringing in house, some of these otherwise democratic processes of representation, including those voices in decision-making and policymaking, like the trend we see with Facebook developing an oversight board, a process that mimics checks and balances but is still under the roof of a company. I'm just curious how you see that: do you foresee companies
3:48 pm
trying to onboard some of these governance processes more formally, or do we need to also create a wall, a distance, between what companies decide and what type of oversight is applied, for example?
>> I think both things need to happen. One is in the company, in terms of the design process, and by that I don't mean just doing some field work to see how to build a minimum viable product, asking people what features they use and what they don't. You have to actually try to understand: how are people harmed, what are the benefits, where are the trade-offs, which populations are being served and not served? One of the things that also happens, in terms of the imprint of who is in the tech sector: that affects who is building the product and for whom they are building it. Some people just get excluded from that, and if we take a
3:49 pm
broader look at how we bring more people into the field, and at whether the technologists are able to talk about what the real harms are, that is a place where we can take better action and get better technology. And I don't see a separation with government, in the sense that government is just over here setting policies. I do think there has to be conversation between the political sector and the technology sector to come up with solutions that work for everyone. Sometimes what we hear from the technology sector is about self-regulation: we know what we're going to do, let us do what we want because we will come up with good answers. I think that's interesting, but I would say: that's great, you've come up with a set of policies that you think are reasonable for self-regulation, so why wouldn't you want that codified as actual regulation? Why wouldn't you want additional people
3:50 pm
actually looking at the process, saying this is something that makes sense, and leveling the playing field across the entire industry, so that if the executives of a company change and have different ideas about what they want to do, we still have the same rules for everyone? So I think the notion of self-regulation versus government regulation as independent alternatives is kind of a red herring. The two can actually work together, but the understanding is that you need that collective input into the process to get the outcomes that actually work for everyone.
>> One thing to add: we began about four years ago engaging undergraduates, and we saw technologists needing to engage in the way we are talking about today, in how the technology is designed, the normative values embedded in it, and ultimately how it affects society. We saw this sort of
3:51 pm
demand among the professionals in the sector as well. So we began to teach in the evenings, creating space in the tech companies for these kinds of conversations, and what professionals kept telling us was the extent to which they were not having those conversations and felt deeply disempowered in their own companies. In fact, we often heard: what happens when you're thinking about selling a product into a new market and you're concerned about unintended consequences, or you have to make a decision about who will benefit and who might be harmed in the design of a particular algorithm, and you need to weigh various values? We would ask: who makes those decisions? And the answer was: we call the general counsel's office, because ethics is a compliance function in most companies right now. It's about what is allowed by the law, and once we
3:52 pm
know what's allowed by the law, we understand our freedom to maneuver. I think it's no longer tenable for ethics to be a compliance function, because we know the limits of what our political system is going to be able to provide in terms of the guardrails and the guidance. The book is an argument that alongside the change that needs to happen in Washington, there is change that needs to happen in the field of computer science, in terms of cultivating an ethic of responsibility, and in the structure of business. I think we're already seeing a preview in business of what this pressure looks like: the workers, the talent that companies need to attract, are saying it's not okay if you sell this product to just anybody; I'm not going to dedicate my time and energy here if I'm concerned about the way this algorithm harms some populations and not others, or if I'm deeply concerned about the impact on the health of our
3:53 pm
democracy of the choices we are making with respect to political ads and content moderation. That push and pull, that give-and-take, is the moment we are at, and there's a role for everyone in that conversation. No one should walk away from our discussion today thinking that somehow regulation is going to come in and solve this problem. Ultimately the big argument of the book is that there is agency for everyone: for you, if you're a technologist, to ask questions and challenge your colleagues about potential harms; for you, if you lead a company, to think about what your bottom line is and what metrics you're optimizing; and of course there's agency for citizens and politicians as well.
>> Thank you, that's a great transition and a preview of the question I wanted to ask Rob, who of course is an ethics expert. We do often hear, as you also referenced, Jeremy, the notion of
3:54 pm
ethics being a solution: appointing an ethics officer or an ethics council, adopting AI ethics principles, and so on and so forth. So I wanted to ask Rob: what do you see as the pitfalls of how ethics is being invoked and used as part of the whole discussion around the problems big tech presents to us?
>> I want to first say two things that ethics is not, or at least should not be, and then one or two things about how, in the book and in the classes that we teach, we try to frame a set of questions about ethics. The first, just as Jeremy said: we don't want ethics to be something where you ask the general counsel, just a compliance function, and in that spirit, when a company appoints a chief ethics officer, we don't want it to be that you
3:55 pm
can offload your own responsibility in the company as long as you run things up the chain to the ethics officer, who then decides for everybody. We need a collective ethical orientation, as Jeremy said, or if you wish, to push it all the way down into the design and product development lines of companies, and to ensure that everyone has permission to raise ethical questions and a framework for thinking about the kinds of concerns that come up. The second thing ethics is not, or at least is not very interesting as: the idea that ethics amounts to having a personal moral compass, that it all comes down to whether we have virtuous people in positions of power who let the best in us as human beings shine through, and then we'll be okay. Personal virtue, in my view, is of course essential, but it's
3:56 pm
not very interesting. Elizabeth Holmes is going on trial in San Jose; she is of course the dropout founder of Theranos, the company exposed by the media to have been, essentially, a fraud: a technology that was supposed to be able to test blood in a simple way but never worked. You should not lie, cheat, or steal when you're in a position of power; we know what happens when you do. There's no real argument about that. You probably learned it as a five-year-old; you don't need an ethics class at the university to tell you not to lie, cheat, or steal. So don't focus only on personal ethics either. Just as democracy is designed as
3:57 pm
a system that does not presume virtuous people, so too should we think of a tech ecosystem that does not depend, or shouldn't have to depend, on people being virtuous all of the time. So what, then, do we need? It goes back to this idea of professional ethics, and at the center of our framework for thinking about social and political ethics are these ideas of value trade-offs, where by political ethics I mean the values that are refereed by our political system. Let me say one thing about professional ethics. In the book we look at how professional ethics has evolved, in particular in the biomedical fields: healthcare provision, pharmaceutical research. It evolved partly in response to a series of scandals, whether it be World War II and the experiments on the Jews in the
3:58 pm
concentration camps, or, in the United States, the Tuskegee experiments that exposed so much about medical experimentation on Black men with respect to syphilis. We have now developed a deep institutional footprint of medical ethics that goes from classes you encounter as a student in medical school, to ethics committees in hospitals, to what is called an institutional review board, an ethics pass-through you must clear if you ever do research on human subjects, to the regulatory oversight of the development of new drugs coming onto the market. There is no equivalent of that type of professional ethics footprint within computer science or AI. We are at an early stage of the maturation of the field, and in the book we point the way forward to a
3:59 pm
number of ways to accelerate that, so that computer scientists begin to develop a much stronger professional ethic whose institutional footprint is as widely developed as it is in biomedicine.
>> Thank you. I was looking at the last question that came in, and I will ask it now because it goes well with what you just said, and there are a ton of questions, so I want to make sure we get to them. This one is for you, Rob: would it make sense for all computer science programs to include a mandatory ethics course?
>> That sounds a lot like: ethics is a shot, I got my ethics injection, and now I'm good to go. It has to be something embedded in the entire curriculum rather than a single class.
>> Thank you. The first question came in before anybody said a word: should the companies be
4:00 pm
broken up?
>> I don't think breaking up the companies is actually a solution, if you think about some of the problems you want to try to solve. Let me give you a simple example. If we were, say, to break up Facebook, would there suddenly be less misinformation on the platform? Probably not. It would probably just make a bunch of smaller companies all trying to deal with the same problem, and the solution is not to allow some platforms to have more disinformation and some to have less; we really want an ecosystem with a lot less misinformation overall. Now, at the same time, there's a question around competition, so I think one of the places antitrust plays a role is in terms of greater scrutiny. When a Facebook wants to acquire an Instagram or a WhatsApp, it would be great to have greater scrutiny: is
4:01 pm
this actually anticompetitive? In that way we can think about limiting the concentration of power in a particular company, so that there are more consumer choices. The other thing about consumer choice: the classic framework for antitrust, centered on consumer pricing, doesn't work well here, so we need to take a different framework, one that really thinks about the harms we see happening. They may not be about pricing; they may be about things like what happens to the information ecosystem and the concentration of power in a company. And, you probably know this history much better than I would, but the thing that came out of the antitrust action against Microsoft was not that it was broken up; it was that, over time, they were more cautious about future acquisitions and about what they did in terms of concentration of power and competing with other companies, because they had seen the antitrust action, even though it did not result in
4:02 pm
a substantial breaking up of the company. So I think that if we just show there's that greater scrutiny, we will get greater competition and less concentration of power, and we might also see companies taking steps, although this one is more debatable, so that the products they produce are better aligned with consumers' interests.
>> Thank you. I have a question from the audience reminding us that Microsoft is still one of the biggest companies in the world, and asking whether the antitrust action against Microsoft had any significant effect on its market power, and if not, what went wrong.
>> The way I would think about that: if you look at the timeframe from when the antitrust action happened against Microsoft to the current day, one could argue, and people have made this argument and I actually believe it's true, that you got the emergence
4:03 pm
of other companies in part because of it. Why did Google become such a powerhouse, why did Apple really take off? Because there was probably, I would imagine, a lot more scrutiny within Microsoft internally about competition in the marketplace and what kind of stance it took in being aggressive toward competitors. So I think what we actually saw was a blunting of the aggressiveness in the marketplace, regardless of whether or not the company was broken up, and I don't think a breakup was the key; it was the change of attitude that happened.
>> Thank you. This one is for Jeremy: why have both Republican and Democratic administrations refrained from stopping big tech from acquiring small companies that could have become competitors? And I will combine it with another question: any thoughts on how the U.S. being a republic, not a direct democracy, modifies your perspective on the policy? So basically, how should
4:04 pm
we interpret the differences between the parties, or how should we look at the system and understand why we are where we are?
>> One of the things we do in the book is trace what we call the long-running race between disruption and democracy, and put the moment we are in into a deeply historical context; there is a kind of ahistorical outlook in Silicon Valley. So the book goes back 150 years to think about the way technological innovations over time transform our economy and society. We go back to the telegraph, and we think about the origins of the interstate telecommunications infrastructure in the 1890s, the 1900s, and into the 1910s. When you look at this in historical perspective, you understand that in a system that combines a free-market orientation with democratic institutions that provide oversight, you get
4:05 pm
extraordinary innovation and growth in the private sector, often with government support and investment, to be clear. Then you get all of these benefits in society, and then you begin to get some awareness of the harms, and often those harms have to do with market power as it unfolds. So the classic example is the telegraph system: given the need to lay down extensive infrastructure, you end up with something that enables us to communicate across the country with a single company in control. The same thing plays out with the railroads, the ability to use that infrastructure to maintain control. How did they maintain it? They exercised power over our political system. And so you have democracy struggling to keep up with disruption, and then you get these policy moments, these windows, when the government tries to solve the problems of market concentration and the harmful
4:06 pm
effects of technology. The problem is that you then get fixed into some new regulatory framework that is five or seven years behind where the technology will be down the line. Where are we now, and why have Republicans and Democrats not taken action on big tech? We are just at the beginning of the opening of a policy window. Starting in the 1990s, and following through the Republican administration and then into the Obama administration, big tech was basically celebrated as innovating in ways that benefited everyone. Politicians wanted to be part of the technology story, to have Google on one side and Apple on the other, to talk about the potential of technology and its economic growth potential, and they looked at the regulatory momentum that was building in
4:07 pm
Europe and wrote it off: they're only doing that because they don't have tech companies of their own. We are in a new moment now. You can find people on the Democratic side and on the Republican side who want to put a check on big tech, on both sides of the aisle, although they diagnose the problem in slightly different ways. We find ourselves in a policy window where you can often see bipartisan momentum to tackle the problems. The question is where we can come to agreement on which problems should be resolved, how we should solve them, and what the low-hanging fruit is. One side will continue to bang the drum on its concerns, the Republicans are going to continue to raise concerns about censorship on the platforms, and you will continue to have a partisan element, but the challenge in the current moment is to take advantage of the policy window of the
4:08 pm
moment, working to address the harms that are visible today, but also to think about how we put ourselves in a better position as a society. We generally have moments of regulation every 30 years or so, once all of the harms are evident; instead, we should actually reengineer our democratic systems to be much more flexible and adaptable. On the comment about our being a republic, not a direct democracy, and what the implications are: the power vested in the hands of citizens in a republic is to throw out your elected officials if you don't like what they are doing. Part of the story in the book is that if you are unhappy with the effects of technology on society, yes, you can feel angry at the leadership of Google or Apple or Microsoft or Facebook, but not only at them, because ultimately the failure to mitigate the harms of technology is a failure of our political system and our elected
4:09 pm
officials. So there is a need for all of us, in the context of the republic, to get informed, to elevate these issues, and to hold the politicians accountable, demanding leadership: they need to mitigate these harms.
>> Thank you for that inspirational thought that everybody can and should take on their role and responsibility. This one is for Rob. There are quite a few parallels being drawn with past lessons learned: for over 30 years governments have failed to take action on climate change, and now, with our backs against the wall, the action is still insufficient. Given that, why do you think governments will be able to take appropriate action on AI and big tech? And the other question, which I think is related: are there parallels between big tech today and the financial markets in 2007? There was regulation in
4:10 pm
financial services, but the result was defaulting mortgage-backed securities, and eventually the markets collapsed. Is an equivalent crash coming? So: climate regulation and the lack of action, the big challenges and lessons learned from the banking crisis, and why will it be different now?
>> These are important questions, and hardly unfamiliar concerns for any of us, given what is going on at the global scale with the climate, with big tech, and with a great variety of other things. So, two quick responses on my part. Number one: we have been living, in the United States but more broadly in many democratic societies, through an era of skepticism about the capacity of the state itself to carry out grand projects. We used to go to the moon, and now
4:11 pm
we let billionaires build private space exploration. We want a great infrastructure plan, but the last grand thing we built in the United States was the interstate highway system. I think we are possibly at the early moment of exiting from a 30-year period of skepticism that the state is capable of grand projects, and part of that grand project, in the United States especially, will be to rise to the challenge of tackling some of the most obvious problems we are all aware of: climate change, the concentrated power of the big tech firms. Perhaps that is more of a hope than a prediction, but I would say we don't really have many other options to hang onto; this is where we as ordinary people can invest our time and invest our aspirations. The second and final observation here is to say: this is a design
4:12 pm
feature, not a design flaw, of democracy that it tends to be short-term. We look to the next election, a small number of years away, to do what Jeremy said: either reward the people in office for what they have done or throw them out. Because democracy works this way, it doesn't give our elected officials the easiest incentive structure for thinking about a longer time horizon. Climate in particular is one of those issues where it's easy to imagine short-term costs for which a politician will pay the consequences at the next election in two years, while the gains might not be seen for ten or 20 or 30 years. So part of the task for democratic institutions now is to come to terms with a longer time-horizon orientation, for planning, for thinking about infrastructure, and some of the things we talk about in the book are efforts that point to very concrete
4:13 pm
proposals that give democratic institutions the capacity to think long-term rather than always being reactive. That is where I think we need creative and innovative minds at work, to install that type of operating system within democracy so that it can handle these long-term challenges.
>> Coincidentally, there was a speech yesterday, and it almost sounds like you read it and were inspired by it; if not, I would recommend you read it, because he is trying to do exactly that: framing big societal challenges as missions with a longer horizon, which excites people across the bureaucracy, so that leaders and people on different sides of the political aisle come together and are more innovative in solving a problem, instead of only doing the more mundane parts of policymaking. It is the thought that we could reinvigorate democracy through
4:14 pm
greater achievements and innovation. I'm going to pick up on another audience question: I'm struck, while listening to this discussion, that the issues are still largely framed in terms of societal responsibility and consequences and side effects, in other words, the collateral damage of building out technology services. But that's different from building for the purpose of achieving societal benefits in the first place. Can we envision a society where that is possible?
>> That is a great question, and I think that building technology for a great society is possible, and it's aspirational. Right now we have a lot of technology for which we have identified harms, and so the first thing we can do, and what we need to do because it's more pressing, is to actually find ways to mitigate some of
4:15 pm
those harms. As Jeremy alluded to, there's lots of low-hanging fruit in that area, and if we can get that done, it shows the ability of democracy to rise to the challenge. I feel the same way a lot of the time watching the news; there's just so much skepticism, but things actually can be done, and once people understand there actually are things we can do now, that builds optimism. In the longer term, though, I do think there's a notion of how we build technology for good, and you see signs of it on college campuses now. We have a healthy and vibrant organization at Stanford called CS + Social Good, where the students look for ways in which technology can be used for societal benefit. They do things like thinking about what kinds of technology would help particular groups, building prototypes to show them what is possible, and then in some cases following through to see what could be done.
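The kind of harm measurement discussed here, checking whether a screening system treats groups differently, as in the resume example earlier in the conversation, can be made concrete with a simple audit. The following is a minimal sketch; the groups, numbers, and threshold are invented for illustration, and a real audit would run over a model's actual decision log:

```python
# Toy audit of a screening model's decisions by group, in the spirit of the
# biased resume-screening example discussed earlier. All numbers here are
# invented for illustration; a real audit would use the actual decision log.

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs -> selection rate per group."""
    totals, selected = {}, {}
    for group, chosen in decisions:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + (1 if chosen else 0)
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Lowest selection rate divided by highest (1.0 = parity).
    The 'four-fifths rule' used in US employment screening flags ratios < 0.8."""
    return min(rates.values()) / max(rates.values())

# Invented example: group A is selected 40% of the time, group B only 20%.
decisions = ([("A", True)] * 40 + [("A", False)] * 60 +
             [("B", True)] * 20 + [("B", False)] * 80)
rates = selection_rates(decisions)
print(rates)                          # {'A': 0.4, 'B': 0.2}
print(disparate_impact_ratio(rates))  # 0.5, well below the 0.8 threshold
```

A ratio well below 0.8 on data like this is the kind of signal that would prompt the mitigation, or scrapping, described in the resume example.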
4:16 pm
I think that is a great area, but I don't think we should only look there; there are pressing problems now that we have to address. At the same time, in our educational systems and other structures, there are lots of ways technology companies have gotten involved, with programs where people from big tech companies help teach the next generation about computer science. What we need to infuse there is more examples of the ways you can build for good, as part of the evolution of the education process.
>> Thank you so much. The questions keep coming in, and this one is for Jeremy: with politics as it obviously is, will our democracy be able to handle this complexity?
>> I spoke to this challenge earlier; maybe I'll say a few more words about it now. Politicians are generalists who need to cover a broad array of issues,
4:17 pm
and while they might have some expert staff who work for them, the staffing in government is thin, and even thinner in state and local institutions in comparison to the federal government. So we ask our politicians, those who represent our interests and are meant to do the hard work of refereeing the different things we might value and defining the way forward for our society, to do the impossible: to educate themselves enough to understand what you care about and what the effects of a particular regulatory change might be, and to take a position and build a consensus or a compromise with others, all with a really limited staff at their disposal. Organized corporate interests have used the structure of this democracy to make sure their voices are heard, and they do an extraordinary job of that. We call that lobbying, and it is the
4:18 pm
payment of expert staff to show up on Capitol Hill and in the executive branch, to offer frameworks for legislation, and to provide facts and information to those making the decisions. So where are our collective interests being accounted for in that system? When you think about complex domains that involve and require technical knowledge and expertise, for example issues of bias in algorithms, the structure of platforms and their business models, whether we can have successful platforms without requiring the monetization of personal data, or what is even possible with respect to content moderation in a world where you cannot rely on human judgment alone and may need AI to undertake the task: who is providing a different set of views for the politicians, to counter or challenge the view offered up by the
4:19 pm
lobbyists paid by the tech companies? That is where our system has failed, and I want to be clear that the failure of our system is a failure by choice; it is not that we couldn't have the expertise. To give just one example, there was an institution on the congressional side called the Office of Technology Assessment, but it was literally shut down during Newt Gingrich's Contract with America, on the argument that we don't need technical expertise in Congress, people can just get technical expertise from the market. So we have made systematic decisions in the federal government over time to erode our technical expertise and knowledge, and what that means is we have put our politicians in a position of primarily receiving input and advice from those who are paid to represent the interests of the companies. So while a lot of us are focused on AI regulation or upskilling for the future of work, I want people to see the fundamental
4:20 pm
challenge of building a democratic system that is capable of governing technology. Governing technology means thinking about the future of the federal service and what expertise we recruit for; it means thinking about what expertise we look for when we elect politicians; it means thinking about the creation and support of independent institutions that represent the public interest rather than a private or commercial interest. Our system of representation is the best technology we have come up with to deal peacefully with conflict, in a way that tries to reach a consensus or an agreement among conflicting views, and we need all of those elements working. We have underinvested in them up to this point.
>> Another, similar question: you say a small, powerful group of companies is making these decisions for us, but isn't the decision actually being made
4:21 pm
by the collective, by the free choice to adopt these things, by which I think you mean technologies and services?
>> Sure. Well, in particular, Facebook is a global platform with more than 3 billion active users, and there is exactly one person who has complete control over the information environment on Facebook, and that is Mark Zuckerberg. The question is asking us to consider that all of us either choose to use or not use Facebook, so doesn't it also come down to the choices of the 3 billion of us who sign up and use the service? I see the attraction of the idea there, but I think it presents us with a false choice. First of all, our contention in the book is that 3 billion
4:22 pm
people with one person in control is too much power for one person or one company, and when you reach great scale, as these big tech companies have, the kinds of private governance they started with rise to the fore and call out for scrutiny and criticism. Then there is the idea that it all comes down to us as individual users: that we as individuals can simply delete Facebook. But that goes back to the idea that started the conversation, namely, that if you didn't like the cars that were available on the roadways, there would be only modest satisfaction with the idea that the choice was in your hands and you could decide just not to drive. If there were other
4:23 pm
options, perhaps like public transportation or rideshare services, maybe there would be viable alternatives. But the idea that we should just basically tell people you can take the cars on the road as they are, or you can just not drive, is a bad argument, and for what it's worth, Facebook would love it if all of the action on the platform were just for every person to handle on his or her own. What is really needed is collective pressure, not from individuals but from democratic institutions and civil society organizations. That is the path forward, not describing these massive systemic problems as choices of consumers who can use or not use a product.
>> The next question: was it too early for the tech companies to foresee the
4:24 pm
detrimental effects of personalization, or could they have avoided such problems in the future, before they scale?
>> Well, in hindsight there's a lot more clarity than in foresight, right? Part of the issue in trying to go forward is understanding what can potentially happen, what you think would happen with a system if you just kind of let people loose and information starts to spread, and then how responsive you are. When people were beginning to call out the fact that there was election misinformation, even before the election happened, the response from many of the big platforms was: this really can't be that big of an effect, it can't make that much of a difference. There was a slow acceptance that it was even having this effect. And I think part of the issue there is not that we would expect
4:25 pm
companies to be able to figure out every possible negative impact, but that they should think about these impacts and try to run the evaluations themselves to see what is actually happening. If you look at a lot of the studies done on some of the larger platforms, around, for example, whether or not there's some sort of political bias in the kinds of information shown, they are done by academics and journalists. Why aren't they being done by the companies themselves, who actually have better access to the data? They could either start to understand what is going on on their own, or we could have regulation that says: these are the things you actually need to monitor to be able to understand what is going on. The lessons that have been learned from the big tech platforms now could be put in place, and they also make sense for new companies entering the space, to
be able to learn from those experiences to inform what we do in the future. So this notion of the free-for-all, that you can just choose whatever app you want to use, which rob was just talking about, completely undermines the idea that we have lessons we have learned and should put into practice. I think that is the place to get to: learn from the past, don't ignore it and just have the same problems over and over. Companies will not always get it right, but they can be responsive early on and take a view of being self-critical as opposed to self-aggrandizing, recognizing that what is actually being accomplished is not always just changing the world for the better and solving other people's problems, but that you are creating problems as well, and owning up to that.
>> Again, thank you, that was really great, and a brief summary of some of the ways the book matters; there's much more, and
i think this discussion was really rich. Thank you for answering all the questions and for jumping back and forth. Before we close, because we're almost at the end: the computer history museum has an initiative we call the one word initiative. The idea is that you each write down one word of advice, the kind of advice that you would give to a young person who is starting out in their career, and once you share this one word, please speak briefly about why it is the advice you want to share, and then we will wrap with that.
>> My word for the young: an unexamined life is not worth living, and of course the lesser-known parallel, an unlived life is not worth examining.
Don't be an ethical sleepwalker. Inquire within and examine; that is a pathway, I think, for the entirety of life.
>> Wonderful. Jeremy?
>> So my word is go deep, and for me it is motivated by the idea that, both when I was looking at my own career path and as a professor and someone who has served at the senior levels of government, everywhere I look I see people who have invested in checking boxes, trying out everything, and presenting themselves as capable of learning things quickly. When I was hiring people, what I wanted to see in a young person was someone who had gone deep, and the reason is that if you really want to solve
a problem in the world, you have to get beyond the top layer of understanding. You have to go deep on the problem, exercise your muscles in terms of thinking about how to solve it, and care as much about the ideas as you do about the implementation. So my advice to young people is to go deep: pick something that you care about, work on it in a real setting, and try to actually move the needle on the outcome before you move on to your next thing, so that you bring into every subsequent job the experience of actually having seen something through from start to finish.
>> That's powerful.
>> The word that I would use is kindness, and the reason why is twofold. One is, for someone who is young and starting out a career, I would tell them a little bit of kindness goes a long way. It can have a large impact in life, and it
doesn't cost you anything to be more kind, but it will pay off for a long time, and for other people it is really tangible and meaningful. At the beginning of the pandemic, it was interesting, because you saw a lot of kindness going on in the world: people watching out for other people, people who were conscious of trying to help others who may not have been in a situation as fortunate as their own. That is one thing we can take away from the pandemic, with all the problems it caused and all the stress it created in our relationships: just to think about a little more kindness towards other people and towards ourselves, because the world could really use more kindness, and it does not cost you anything.
>> Thank you, that is a great note to end on. Thank you, everybody, for your attention and for your questions; I'm sorry that we cannot get to all of them. Let me briefly give the screen back.
>> All the best with the
launch of the book, and I hope many people will be encouraged to read it after this session.
>> I agree. Thank you all so much for your time, the energy, and the thought that you put into the creation of this wonderful piece. There is a clear alignment with the computer history museum's belief that life as we know it does not exist without computing. What you've done is look at a very complicated set of issues in technology, where anything that can be deployed at scale can be weaponized, and inevitably will be. The automobile analogy was particularly interesting for me as we think a little bit about how history and the interactions of the present provide indications for the future. It turns out that 100 years ago or less, the court system pierced the corporate veil in terms of liability, shifting it from auto dealerships
to the manufacturers for liability arising from automobile accidents. There are similarly important issues associated with life as we know it: regulatory and social codes, the values that were discussed, but equally important, software code and the algorithms that are driving things in the future. I believe that many of the points that were made are particularly relevant to the problems we are trying to solve and the types of achievable outcomes, in particular the effects on the workforce, workforce development, and the human condition. I encourage all of you to read the book and share it, to help facilitate a process whereby people have a sense of where they are in the world of computing and that they actually can have the agency to make a difference. It was a wonderful program, and I appreciate all of
the authors' work and a wonderfully moderated program. Thank you all for joining, and I encourage you to contribute: go to our website if you love these programs. Thank you, everyone, and best wishes.
>> Book TV, every sunday on "c-span2", features leading authors discussing their latest nonfiction books. At 2:00 p.m. eastern, mystery writers discuss their work. At 6:30 p.m., a university of illinois journalism professor offers her thoughts on the challenges facing american journalism. At 7:30 p.m. on About Books, a democratic congressman's thoughts on opening a new bookstore, plus bestseller lists, new releases, and other news from the publishing world. At 10:00 p.m.
on Afterwords, the latest look inside social justice activism in corporate america, with the author interviewed by a harvard university professor and former chair of the council of economic advisers during the george w. bush administration. Watch Book TV every sunday on "c-span2"; find the full schedule in your program guide, or watch online anytime at
>> We believe that being an american means striving to provide equal opportunity for all citizens.
>> C-span's video documentary competition, studentcam 2022: students across the country are giving us behind-the-scenes looks as they work on their entries using the hashtag studentcam. If you are a high school or middle school student, enter c-span's studentcam competition: create a five- to six-minute documentary using c-span video clips that answers the question, how does the federal government impact your life?
>> Be passionate about what you are discussing, no matter how large or small you perceive the audience to be, and know that your view matters.
>> To all of the filmmakers out there: content is king, and remember to be as neutral and impartial as possible in your portrayals of both sides of an issue.
>> C-span awards $100,000 in total cash prizes, and you get a shot at winning the grand prize of $5,000. Entries must be received before january 20th, 2022. For competition rules, tutorials, or how to get started, visit our website at student
>> C-span on the go: watch today's biggest political events live or on-demand, anytime, anywhere, on our new mobile video app, c-span now. Access top highlights, listen to c-span radio, and discover new podcasts, all for free; download c-span now today.
>> Did you ever wonder what it would be like to go back in time, to read history and benefit from its lessons to shape the future? Today we have just such an opportunity. In 1985, one of the first reporters covering the tech industry chronicled the decades-long story of its people and companies from when it began. The book recounting that history, "the big score", starts at the beginning, at a time when it was really

