
tv   Going Underground  RT  October 18, 2021 2:30pm-3:01pm EDT

2:30 pm
The goal of ethical AI is presumably different to the Squid Game. I haven't seen all of that series, but everybody's talking about it. What is ethical AI?

I consider it to be a field that tries to ensure that while we work on a technology, we're working on it with foresight, trying to understand what the potential negative societal impacts are, minimize those, and work on something that's actually beneficial for humanity.

You started by saying: what is unethical AI? There's lots of it, I think.

Right. So, for example, part of my work, my collaboration with Joy Buolamwini, showed that a lot of APIs that sell automated facial analysis tools had much higher error rates for darker-skinned women
2:31 pm
than for lighter-skinned men.

This is for face recognition?

Yes. And that spurred a lot of movement, because a lot of people had been worried about surveillance-related technology anyway. So that spurred a movement to ban law enforcement's use of some of these technologies, because they're mostly used to suppress dissent and to surveil marginalized communities. So for me, that's not ethical AI; that is unethical AI. But even in capitalist terms it's not that useful, and for law enforcement it's not useful, to have AI that wrongly identifies individuals on the basis of their colour.

And I think Amazon, the company, did initially fight back against criticism of its use by US law enforcement, and yet they then changed their position. You know,
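The audit described here reports error rates disaggregated by demographic subgroup rather than a single aggregate accuracy figure. The sketch below illustrates that general idea only; the function name, group labels, and toy data are invented for this example and are not from the actual study.

```python
# Minimal sketch of a disaggregated audit: compute error rates per
# subgroup instead of one overall number. All data below is made up.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, predicted, actual) tuples.
    Returns {group: error_rate} computed separately per subgroup."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy data: a classifier can look acceptable in aggregate (50% overall
# error here is exaggerated for clarity) while failing one group far
# more often than another.
records = [
    ("group_a", "match", "match"), ("group_a", "match", "match"),
    ("group_a", "match", "match"), ("group_a", "no_match", "match"),
    ("group_b", "no_match", "match"), ("group_b", "no_match", "match"),
    ("group_b", "match", "match"), ("group_b", "no_match", "match"),
]
rates = error_rates_by_group(records)
print(rates)  # group_a: 0.25, group_b: 0.75
```

The point of reporting per-group rates is exactly the one made above: an aggregate metric hides the fact that the errors fall disproportionately on one subgroup.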
2:32 pm
no one wants that.

I don't think they saw the light. I think what happened is that there were worldwide Black Lives Matter protests, and it looked bad for them to continue selling it amid those worldwide protests. So they said they were going to have a moratorium on the use of this technology until there was a federal law.

I have a statement here. Amazon said, quote: this research paper, and I'm assuming you co-wrote that article, is misleading and draws false conclusions; the researchers used an outdated version of Amazon Rekognition, and we have made a significant set of improvements. What do you think? Why did they fight back so strongly against disinterested, scholarly research? I mean, this isn't you being an activist for Black Lives Matter; this is you analysing the algorithms.

Yes. So this was actually my colleagues. Joy Buolamwini and I had written a paper that preceded this one, and she and another Black woman, Deborah Raji,
2:33 pm
wrote this paper called Actionable Auditing. And when they put it out there, the very first thing that Amazon wanted to do was push back. So VP after VP wrote blog posts trying to discredit them. And my colleague Meg Mitchell and I, you know, we were both fired from Google now, but we were at Google at the time, and we spent a whole lot of time writing a point-by-point rebuttal, and we galvanized the academic community to come to their support. So we had a petition that more than 70 academics signed, including Turing Award winners; that's kind of like the equivalent of the Nobel Prize for computer science. We came to their defense; we debunked, point by point, what they were trying to say; and then we all asked them to stop selling Rekognition to law enforcement. So then that was all over the news, and they weren't able to go any further in trying to discredit them. But what you just saw
2:34 pm
there is very similar to what Google was trying to do to me right after I came out with my paper about large language models, because we were trying to show the harms of such technology.

Well, we'll get on to the large language models. I have to say that these are massive companies that, some people say, own all our lives, so I'd better read their responses, because they have a lot of money. Before we get to the actions they took: Google says they didn't fire you at all. Timnit wrote that if we didn't meet her demands, she would leave Google and work on an end date; we accept and respect her decision to resign from Google. So one of the gods of our societies is saying: no, you weren't fired, you went of your own accord.

I think, you know, they had a kind of non-apology of an apology after the backlash, because they were just doubling
2:35 pm
down.

OK, it's so clear, this claim that I resigned; I can't even believe it. I don't know why they thought they could get away with saying that. Maybe they thought I'm so stupid that I would just kind of feel like, oh, I guess I resigned. That's not how resignations work, right? A resignation you have to submit; you have to have paperwork; you have to say you're going to resign. There is a whole process.

OK, well then that's a dispute. As I say, according to Google you resigned; according to you, you were fired.

I say that I was fired.

Well, leaving that aside, in the context of all of this, what exactly happened? Because again, you were working on scholarly research, as far as I understand it, about large language models. People who use Google Translate love these Google translations, and that uses large language models. What could possibly be wrong with large language models?

So, in all of these incidents that you see, whenever you show a problem and it's inconvenient,
2:36 pm
it looks like the problem maybe is too big for what they want to admit. Or maybe it's not even that: maybe they really do think that the issues I'm raising are not that serious. One of the things they said is that my paper made it look like there were too many issues with large language models; they said it painted too stark a picture. And so they wanted me to retract this academic, peer-reviewed paper that was being published at an academic scientific conference, with no process whatsoever. It's one thing if your employer comes to you and says: we followed this particular process, which you know about, to come to the conclusion that your paper should be retracted, and here's why; we can discuss it, let's have a conversation about what kind of process was used and why they want you to retract the paper.
2:37 pm
But absolutely not, right? This was way late, a long time after we submitted our paper, and we had gone through the internal approval processes. My manager resigned from Google, right, because he was the approver.

OK, well, is it fine if we go to the specifics of the large language model problems that you were investigating? Maybe we can figure out what the issue was. I know you give an example of a Palestinian using words and then being wrongly interpreted, just to take one.

Yeah. So what we did was go through prior work; we surveyed a lot of prior works and added some initial analysis of our own of these large language models: what has been going wrong, and what could go wrong, if we focus on larger and larger language models. The first point we started with was environmental and financial cost, right?
2:38 pm
These large language models consume a lot of compute power. So if you are working on larger and larger language models, only the people with these kinds of huge compute resources are going to be able to use them. And that leads to what we talk about as environmental racism: the people who benefit from large language models are not the people who are paying their environmental and financial cost. We give a bunch of examples there about different languages. In other words, it's always people in the dominant groups, whether between countries or within a specific country, who benefit from these large language models.

The second point we make is that there is this weird assumption that if you use the data on the internet, you'll somehow incorporate everyone's point of view, right?
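The compute-cost point above can be made concrete with a back-of-envelope sketch. Every number below is a placeholder assumption for illustration, not a measurement of any real training run; actual hardware power draw, training duration, and grid carbon intensity vary enormously.

```python
# Back-of-envelope sketch: energy = power x time, emissions = energy x
# grid carbon intensity. All inputs are illustrative placeholders.

def training_footprint(num_accelerators, watts_per_accelerator,
                       training_days, kg_co2_per_kwh):
    """Return (energy_kwh, co2_kg) for a hypothetical training run."""
    hours = training_days * 24
    energy_kwh = num_accelerators * watts_per_accelerator * hours / 1000.0
    return energy_kwh, energy_kwh * kg_co2_per_kwh

# Assumed scenario: 512 accelerators drawing 300 W each for 30 days,
# on a grid emitting 0.4 kg of CO2 per kWh.
energy, co2 = training_footprint(512, 300, 30, 0.4)
print(f"{energy:.0f} kWh, {co2:.0f} kg CO2")  # 110592 kWh, 44237 kg CO2
```

Even with these modest made-up figures the run consumes on the order of a hundred megawatt-hours, which is the scale of resource only large, well-funded organizations can routinely spend.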
2:39 pm
Those of us who've been harassed on social media know that that's not the case. But these large language models are trained with data from the internet, most of the time something like the whole internet's worth. So we were talking about the dangers of doing that: the kinds of biases that you would encode, the kinds of hateful content that you encode. We were saying that just because you have a large dataset, it doesn't mean that you are now incorporating everyone's point of view, and we give many examples of that. And then we give examples of what happens when these large language models, trained on data with this kind of encoded bias, are deployed. One of those examples was the one you mentioned about a Palestinian man: he wrote good morning on Facebook, and it was translated as attack them. And people didn't even see the
2:40 pm
original; they didn't check to see what he had initially written. They saw the translation and they arrested him.

So this was a Google Translate error?

This was Facebook. But the underlying technology is large language models; they all use large language models. What we were saying was: with older machine translation, you get cues when there are errors, right? You can see that the grammar is not quite right; you can see that something is wrong. With these large language models, you can have something that sounds so fluent and coherent, and it's completely wrong.

But why would these companies do this deliberately? There's no malice; there's a mistake in the algorithm and in the software engineering. Why would they seek to minimize the publicity given to papers that showed
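The claim that a model trained on skewed internet data reproduces the skew can be illustrated with a deliberately tiny stand-in for a language model: raw word co-occurrence counts over an invented corpus. Nothing here is from the paper under discussion; it is a minimal sketch of the general mechanism, with made-up sentences.

```python
# Toy sketch of "encoded bias": a model fit to a skewed corpus learns
# the skew. Here the "model" is just co-occurrence counts.
from collections import Counter
from itertools import combinations

def cooccurrence(corpus):
    """Count how often each pair of words appears in the same sentence."""
    counts = Counter()
    for sentence in corpus:
        for a, b in combinations(sorted(set(sentence.split())), 2):
            counts[(a, b)] += 1
    return counts

# An invented corpus in which one framing dominates. The learned
# associations reflect that dominance, not any ground truth.
corpus = [
    "engineers are brilliant", "engineers are brilliant",
    "engineers are brilliant", "engineers are careless",
]
counts = cooccurrence(corpus)
print(counts[("brilliant", "engineers")])  # 3
print(counts[("careless", "engineers")])   # 1
```

Real language models use far richer statistics than pair counts, but the failure mode is the same shape: whatever viewpoint dominates the training data dominates the learned associations, and a bigger dataset does not by itself balance them.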
2:41 pm
these errors, so that they could fix them?

I would disagree that there is no malice, because when you look at what happened with Joy and Deb, for example, it is two Black women talking very much about the impacts of Amazon's technology on the Black population. So you see a pattern here: it's me and other Black women and a bunch of other people on our team who are very much concerned with the impact of these large language models on marginalized communities. And if you're talking about something that maybe should not be used right now, that's directly going to impact their money. It's a money-making machine.

I see what you mean there, but how can good morning translate into attack them? Why would,
2:42 pm
why would the algorithm even work that out, based on the dataset?

Because of the domain the user came from, perhaps. If you look at some languages... so, on Twitter, Twitter uses Google Translate, and I had an interview on the BBC in Tigrinya, which is my mother tongue. Google Translate doesn't even have Tigrinya; it's not among the languages that they offer translation for, but it uses the same alphabet as Amharic. And when people were sharing my interview, it just went haywire. It just said, you know, let's talk about the greedy people: greedy, greedy, greedy, greedy. Really. There is nothing about greed in my interview.

Given that the US
2:43 pm
military-industrial complex is particularly interested in the Horn of Africa and what's happening in Eritrea...

Yeah, that's actually more disturbing than it at first appears; at first it just sounds funny.

I'll stop you there. More from the former co-lead of Google's ethical AI team and co-founder of Black in AI after this break.

There are growing indications that Washington isn't finished with Afghanistan just yet: military involvement has come to an end, but not engagement. Also, we are told there is an energy crisis; maybe this is part of the great...
2:44 pm
2:45 pm
The whole world needs to take action and be ready. We can do better; we should be better. Everyone is contributing, each in their own way, but we also know that this crisis will not go on forever. The challenge is great; the response has been massive. So many good people are helping us. It makes us feel very proud that we are in it together.

Welcome back. I'm still with Dr. Timnit Gebru,
2:46 pm
former co-lead of Google's ethical AI team and co-founder of Black in AI. Is it ineffectual translation, or does that process violate the First Amendment of the US Constitution, because it is in effect distorting your speech?

Oh, I've never thought of it that way at all. I don't know; I guess I'm not a legal scholar. But back to the question you asked me about ethical AI and what ethical AI is: if you read the works of some critical scholars, they would say that the tech industry's strategy is to make it look like a purely technical issue, a purely algorithmic issue that needs to be fixed. As you said, a purely mathematical issue. It has nothing to do with antitrust laws, it has nothing to do with monopoly, it has nothing to do with labor issues or power dynamics; it is
2:47 pm
just purely this technical thing that we need to work out. And part of what we say in our paper, and what many others say, is that it is not necessarily this purely technical, algorithmic tweak that you want, right? We need regulation. I think that's part of why all of these organizations and companies wanted to come down hard on me and a few other people: because what we're advocating for is not a simple algorithmic tweak but larger structural changes.

So it is for legislators to realize that the way to really answer the problem is, as you say, to deal with the nature of monopoly power?

Yes. I think now, with all the whistleblowers, there are so many since I first came forward, one after another after another, the public is starting to really understand that we need some sort of regulation. But what the companies would argue is exactly what you were saying earlier: oh,
2:48 pm
there is no malice; it's just that the algorithm is not right, and why would we want algorithms that don't work? That's not good for our business, so of course we will work really hard to fix our algorithms. So, for example, Facebook's Mark Zuckerberg, questioned on safety, said something like: I don't know any tech company that sets out to build products that make people angry; our advertisers don't want to advertise next to angry people or bad content, and they constantly tell us that, so why would we want that? That's the kind of argument they make, right? But what I'm saying is that a lot of the work in this space, interdisciplinary work, has shown that we need to look into more structural solutions to this problem.

Now, of course, you were at Google for quite a while. Julian Assange, when he met Eric Schmidt, or rather when Eric Schmidt wanted to meet him, Julian Assange said, you know,
2:49 pm
Google may not be acting illegally in any of this; presumably he was referring to the scientific element as opposed to the antitrust element, who knows, which is why the FBI won't investigate. When you were at Google writing papers, I would assume, did you come across a guy called Jared Cohen, CEO of Jigsaw? Back then it was called Google Ideas.

I do remember. I didn't come across him, but I heard about him. I was at Google for, actually, two years, two very long years. Every month I was thinking: can I survive here? Should I leave next month?

A lot of people watching will be thinking it must be amazing to work there, the amazing offices and all of that, that it's the future; there are graduates watching this, desperately trying to get through a complicated process to get in. What happened? And I should say Jared Cohen is backed by Joe Biden, Blinken the Secretary of State, and the National Security Advisor, Jake Sullivan; they love Google Ideas, and Google is very much associated with Democrat royalty, as it were. Why,
2:50 pm
why was it no fun to work there?

Well, again, it comes down to what your views are and the demographic that you are in. I was in a role called research scientist, and I was the first Black woman to be a research scientist at Google, and I saw why, right? I had so many issues right off the bat. I was under-leveled: people with much less experience than me were leveled way above me. There was a lot of just kind of disrespect; there was a lot of harassment and all sorts of things. So it was very, very difficult for me to even concentrate on my job. And actually, when I talk about being fired, I'm pretty certain that it wasn't just that paper that got me fired. There were so many things they didn't like that I was doing; I was speaking up about workplace issues, discrimination issues, and each time I spoke up,
2:51 pm
they weren't happy that I spoke up about it.

So you didn't buy the idea that there is something philosophically Ayn Rand-ish, this kind of right-wing supremacy, about it? Because they deny it: obviously they say you resigned, and they deny any element of the workplace being a bad place to work, let alone harassment or anything like that.

Yeah. I mean, I don't know if you're aware of the Google walkout. Two months after I joined Google there was a large walkout: 20,000 people demonstrated, walked out, because they were protesting against, you know, Andy Rubin leaving with $90 million or something like that after, you know, he had harassed people, right? So there were 20,000 people walking out in protest, and they pushed out two of the organizers, Claire and Meredith, after like a year or so. So they were purging. And then, a year
2:52 pm
before I got fired, they fired five people; there's an NLRB trial, a National Labor Relations Board trial, going on right now about these fired five. And then, one year later, they fired me. And I had spoken up about all of those firings.

Google denies wrongdoing in all of those cases. They're so powerful; when you mention these cases, our viewers will obviously have to look them up using, well, Google. But as for diversity and identity politics: on the 9th of December 2020, the millionaire head of Google said, quote, it's incredibly important to me that our Black women and underrepresented Googlers, the people who work at Google, know that we value you and you do belong at Google. We started a conversation together early this year when we announced a broad set of racial equity commitments, to take a fresh look at all our systems, from hiring and leveling to promotion and retention, and to address the need for leadership accountability. We are committed to continuing to make diversity, equity and
2:53 pm
inclusion part of everything we do, from how we build our products, and I presume some of the hardware is assembled by over-represented people of colour, who knows, to how we build our workforce. You know, the statements coming from Google are completely against what you're saying.

Yes. So maybe this was what I called the non-apology. You know, he had to apologize, because the other road they were going down, which was doubling down, saying that my work was subpar, saying that I told people to stop working on diversity, and all the other things they were saying, was creating more and more backlash each time they went down it. So I'm sure he realized at some point that he had to make some sort of apology. And if you read the apology closely, it was the kind of apology that says: I'm sorry for how you feel. There was a long-time, what is it called, a person who draws these cartoons, a cartoonist, there was
2:54 pm
a long-time cartoonist at Google who left after 14 or 15 years because of what they did.

OK. Well, you had thousands of people on your side. I just want to ask you, I mean, we were talking about Jared Cohen, and I don't know what you think about Jigsaw, but your AI is kind of a competitor now. You're the co-founder of Black in AI; you're going to have to tell me what Black in AI is. Why is your AI going to be better than their AI, effectively the innovation department of the Google conglomerate? Why is it going to be better? Are you going to make it more profitable, actually, because you're going to be more accurate, and are you going to get the CIA, the National Security Agency?

So, the group Black in AI that I founded, co-founded, it's
2:55 pm
a non-profit, and it's a group for practitioners and researchers in AI, Black practitioners and researchers in AI from all over the world. So this is not a group that builds AI or anything like that; it's a group that builds community: networking, mentorship programs for various things, like graduate mentorship programs and an entrepreneurship program, and then we have workshops to raise the visibility of Black people in AI, et cetera. So this is different from what I'm doing right now. What I'm hoping to build now is an interdisciplinary AI research team, and the goal is not to be an extremely profitable AI research institute, because I believe that if your number one goal is to maximize profit,
2:56 pm
then you're going to cut corners, and you're going to do things downstream that end up making you build AI that is...

I think you'll find Section 172 of the 2006 Companies Act, the fiduciary duty of a company to maximize profit. You're not happy with maximizing profit. Very quickly: what would you say to a whistleblower right now who's terrified? They work at one of these organizations, and seeing what you've gone through they feel more terrified than ever, and never want to come clean with journalists or anyone else about the lack of ethics they see in their daily workplace.

Well, I would ask them to look at the Tech Worker Handbook, which Ifeoma Ozoma, who is another whistleblower, just launched, like, last week. That handbook is meant to help people decide whether they want to come forward and what to expect. And I think
2:57 pm
it's a very personal thing for each person, because it's not a joke: you're going to have a lot of backlash. So you have to determine whether this is the right avenue for yourself. And I understand the fear, because you get a lot of harassment being thrust into the public space. But I do want to say, you were mentioning Eric Schmidt and Julian Assange et cetera before, that Eric Schmidt in particular has a lot of influence within the US government right now. He created this NSCAI, this national AI commission or something like that, and he has this view that we are in a cold war with China and there's an AI race, et cetera. So I think that where we're going with this can be very, very dangerous if our voices, or the whistleblowers' voices, are not heard; it's these people at the top who have a lot of influence right now. And I think that it would make
2:58 pm
a huge difference for more people to speak up; at the same time, we have to protect them. I can't just ask people to speak up without making sure that society at large will also protect them.

Well, we did invite Alphabet Google board members, people who left, people in the government. No comment. Dr. Timnit Gebru, thank you.

Thank you.

That's it for the show. We'll be back on Wednesday, ten years to the day since the leader of Africa's richest per-capita country, Libya, was killed with the backing of the UK, USA and France. Until then, keep in touch via social media, and let us know if you think new protections are needed for whistleblowers.
2:59 pm
3:00 pm
Russia is to suspend its permanent mission to NATO from next month; the move is a direct response to the military alliance recently kicking out eight Russian diplomats. Ready for action: Russia fills one section of the Nord Stream 2 pipeline with natural gas and awaits the green light from regulators to start supplying Europe, as the commissioner warns energy poverty across the continent is on the rise. Tortured and jailed for 17 years without trial: we explore the case of a Pakistani national still in Guantanamo Bay despite being cleared for release, after it emerged he had been mistaken for a terrorist. And a former researcher at Google accuses the tech giant of using racist algorithms.
3:01 pm
She revealed it all to RT's Going Underground.


