
tv   Amy Peikoff  CSPAN  July 14, 2021 2:50pm-3:16pm EDT

2:50 pm
jersey. good morning. >> caller: i'll tell you what. you need to ensure an honest vote. if your vote is not honest, then it means nothing. and also, anybody who stops somebody from voting should face heavy fines and prison time. that's all i have to say. >> our last caller in this segment. a conversation now on social media and free speech with amy peikoff, of parler. explain what parler is. >> an up-and-coming social media network. we differentiate ourselves from other social media platforms around two prongs. first of all, we promote free speech. and by that i mean with respect
2:51 pm
to all legal speech, speech that is protected by the first amendment of the constitution. we believe that the user should be the one who decides what they are going to see or say, read, hear, et cetera. the second prong is around user privacy. we, unlike other social networks, do not track you across websites. and we collect and use the minimal data possible to provide good service and to also have some sort of a monetization model, because of course we do need to make money. >> when did parler get started? and how many people use it today? >> it started back in 2018. i don't have exact user stats today. as you know, we were deplatformed unjustly by amazon and others back in january of this year. at that point, we had somewhere between 15 and 20 million registered users. and at that point, we then had
2:52 pm
to rebuild everything, the stack, and we've also rebuilt the entire code base from the ground up. we are in the ramp-up phase again of recovering from that blow. >> why were you deplatformed? >> we were deplatformed allegedly because we had content on our site that contributed to the events of january 6th at the capitol. but, as we have seen in the months since the deplatforming, we know that the content at issue, this violent and inciting content, was all over the internet, on other platforms. you know, there were stories in newspapers of all ideological leanings about facebook and twitter, particularly facebook. so the content was everywhere. the other thing that's come out in the months since is that parler had referred many examples, dozens of examples, of this violent and inciting content in the weeks leading up to january 6th. so the argument that we were
2:53 pm
somehow irresponsible with respect to this content, and that it had something to do with what happened on the 6th, that we somehow aided in it, is disingenuous. i can understand back then maybe they believed that because of the trolls on twitter and that sort of thing. but it has come to light in the months since. >> how much content moderation do you do? >> our guidelines have to do with illegal speech, speech that's a rights violation. otherwise we have something called a trolling filter, which will filter out what some people call hate speech, but what i have categorized, i think more objectively, as attacks on the basis of race and sex and sexual orientation, et cetera. that trolling filter, on the web and also on our android app, is something that is in the individual user's control as
2:54 pm
to whether to turn on or turn off. apple still requires -- i hope to change their mind about this, but apple requires that all so-called hate speech be removed from all ios apps. so the trolling filter is turned on automatically on the ios app. with respect to all legal speech, anything that does not pose a threat or a violation of the rights of other people, that speech, we believe, should be controlled by the individual user; the individual user should decide whether it is in his or her feed. that goes for hate speech, misinformation, any of that. >> for folks who might be more familiar with twitter or facebook, how do those guidelines that you just described compare to the amount of moderation and the rules on facebook and twitter? >> facebook and twitter have taken it upon themselves -- you know, this is what is at issue in the trump lawsuit, right? they have taken it upon themselves to remove or to maybe reduce the spread of so-called
2:55 pm
hate speech and also what they deem to be misinformation. it is going to be interesting to see going forward whether that practice is going to be scrutinized in the law, or maybe it is going to be further encouraged in the law if some lawmakers get their way with respect to augmenting section 230. >> amy peikoff, i want to get to section 230. let me invite the viewers to join us, though, before we get too far into this conversation. if you want to talk about big tech, about social media, about parler, parler users, call in, lines split as usual. as folks are calling in, amy peikoff, dig in a little bit more into the trump lawsuit, and explain what section 230 is as you do that. >> okay. that's a big question.
2:56 pm
here i am, a never trumper, and yet i am actually in support of this lawsuit. the reason i support it is, first of all, i believe it is correctly framing the problem. there may be some improvements that could be made in the way that the case is made. that is what ramaswamy and rubenfeld did in the "wall street journal" recently: just explain how the case could be made a bit stronger. but the fundamentals of the case are good in the sense of saying that they are acting essentially as state actors. you know, we have got section 230 immunity, which i will explain in a minute. there have been threats to punish these actors in the case that they have not exercised that immunity, if they haven't censored the content. there has even been collusion
2:57 pm
for a choice of targets for censorship, for either the removal or extension of the reach of the content. the lawsuit, i believe, is correctly framed. the other thing about it is that i think it has the chance to give us the proper remedy to this problem. because although i do think some government action to address this problem is necessary, i think the problem itself comes from an entanglement between tech and government. if we aren't careful, we are going to propose as a solution to this problem something that will make it even worse and in fact give us big brother out of orwell's 1984, i think. i can explain more about that as well. you asked also about section 230. what is it about section 230 that potentially poses a problem? section 230 is a code section that grants legal immunity to platforms, social media platforms like
2:58 pm
facebook, like us. they are immune from liability for any content that is generated by users, that is put onto the platform by users. you could not have such a thing as a platform, really, unless you have this type of immunity. i want to be clear, the core principle of section 230 is something i agree with. but i agree with justice thomas and others who have recently said that this immunity has been interpreted in an overbroad fashion, and so has left these platforms largely unaccountable for their contribution, whatever they are doing as distributors to distribute this content. and the trump lawsuit has a chance of holding them accountable for their conduct with respect to the restriction of the content. that's what he's concerned with. and to do so in a way that's in
2:59 pm
accordance with justice thomas's recommendations as to the proper interpretation of section 230. again, i think that's the way to go. if you have a piece of legislation, if you start to classify them as common carriers, you are going to create further entanglement between big tech and government. and i think that is very dangerous if we are concerned about the ability to think freely, express ourselves freely and maintain our privacy. >> a lot of focus on section 230 on capitol hill. 26 words long, it states: no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. those are the words that have caused so much focus, so much attention. we will talk about it here in this half hour with amy peikoff.
3:00 pm
plenty of calls already. we will get to them. joyce is up first in marlton, new jersey, a republican. good morning. >> caller: good morning. thank you for taking my call. i did want to tell you that i feel that you should have identification when you go vote, so everybody knows that you are a participant in voting. and i also want to add one thing: i wish that the vice president was picked by voting. >> joyce, we are talking about social media now. i know you are focused on that topic from our last segment on voting rights in this country. but do you use any social media platforms? >> caller: no. >> do you have any concern about the entanglement between big tech and government? >> caller: yes. >> what specifically, if you have thought much about it? >> caller: no.
3:01 pm
i think everything is wrong. i'm a republican, and i voted for donald trump. and i think he was the best president we ever had. >> that's joyce out of new jersey. anthony out of miller place, new york, is next. good morning. >> caller: thank you for the opportunity, c-span. my query is for both the guest and the moderator. perhaps you can follow up on my concerns. there was an at&t executive named mark klein who had a standing lawsuit, with i think about 500 americans, who had been unconstitutionally spied upon by big tech. and barack obama, when he came into office, his first signing statement was to dismiss that lawsuit, thereby disallowing american citizenry to have redress against what is unconstitutional. the patriot acts one and two need to be rescinded, if not just reviewed, admonished. and i would ask that you perhaps have kara frederick from the
3:02 pm
heritage foundation, their tech policy research fellow, to speak on this topic. the woman is brilliant and she knows so much about what is going on. i do believe this is orwellian beyond what george orwell would have imagined, and i think america needs to fight back against big tech by either getting rid of their social accounts or turning off their phones for a couple of months. whatever it takes. but we are in deep trouble here. like i said, mark klein, the at&t executive, he had been on c-span before. i would ask that you bring him back and ask him his thoughts on what's going on, as well as mr. snowden or julian assange, as you see there has been a lot of things framed around trying to frame those people who were really whistle-blowers on this whole cabal. i appreciate c-span and all the work that you are doing. all the callers that are critical of you guys, it's because they don't understand the deluge of information you have to deal with every day.
3:03 pm
i am blessed to have you in my life. thank you, c-span. >> anthony, we always appreciate topic suggestions for this program, happy to take your suggestions. amy, i'll let you respond to the issues the caller brings up. >> sure. we can tie the privacy concern into this, because my concern about social media and solving this problem correctly is the fact that social media, first, is a disseminator of crucial information -- and it's one that is widely used. a lot of surveys say most people get their news from social media. but the second thing, at least that our competitors do, is collect tons of personal information about individual users. and so, again, if we solve this censorship problem -- and it is a censorship problem, i believe. again, i agree with ramaswamy and rubenfeld, who have written
3:04 pm
about this in the "wall street journal" extensively. it is a censorship problem. but if we don't handle it in a way that backs off that relationship between big tech and government, if we instead -- suppose we amend section 230 to give government more power to regulate these platforms, or we give government more power in the many ways that are before congress right now to break them up and do other things to supervise them and control whether they can have mergers and all of these things -- if we create more entanglement, then you have government overseeing an industry that both disseminates and collects information. that is a recipe for big brother. i don't agree with the caller in the sense that we are not in orwell's 1984 yet, but we are moving in that direction. and this is a crucial juncture where we have to make that decision. again, i believe a lawsuit framed the way that donald trump's is has the potential to
3:05 pm
craft a narrowly tailored solution to the problem that will solve it without creating the further entanglement that the caller is worried about. one caveat is that to the extent that the government regulates a business, it can demand records, ordinary business records of the business. and that's through something called the third-party doctrine. i have written about the third-party doctrine. i believe i have a common law solution to the issue of the third-party doctrine. it is near and dear to my heart. this doctrine allows government to obtain information from the third party, facebook or whoever, without a warrant. no probable cause, no particularized suspicion. if we have more regulation of these platforms, we will have more government access to that information. and it creates a further risk. i think there is plenty now, already, but it creates a further risk in terms of government acting as big brother, having access to personal information about you
3:06 pm
24/7. >> kurt on twitter has questions about parler, your business, asking: does your website require users to provide social security numbers? who do you sell that information to? who are your financial backers, foreign and domestic? >> donors, foreign and domestic, i can't answer. rebekah mercer was an initial seed investor and has been involved a lot more recently in helping to get parler rebuilt after all of this. but i'm not aware of the others, so i can't answer those questions. we do not ask for social security numbers. there are times, if someone is going to be part of an influencer network -- you know, one of our aspects on parler is that when we monetize by targeting ads to a particular influencer's feed, we give that
3:07 pm
influencer a cut of the proceeds. that's been part of our business model. so unlike on facebook, where if i was to choose to place an ad that was going to target ben shapiro's audience, for example, poor ben wouldn't get any cut of what i was giving in ad revenue. i was a small spender when i used to spend on ads there. nonetheless, he should get a cut. and on parler, we would give people a cut. as part of that, we would have to collect -- i forget what the form is, the w-9 or something, that you would have to have for tax purposes, and then someone would have to give a social security number, or a taxpayer id for an entity, on a form like that. but, no, we don't collect social security numbers. that's a myth. >> what makes somebody an influencer? >> what makes somebody an influencer? i think that's an open-ended question anywhere out there. i think as we are rebuilding, another thing we are doing at
3:08 pm
parler is rethinking what is an influencer. in some sense, of course, it has to be an agreement between two people. both parties want to be involved in this arrangement. but it has to do with having a certain amount of a following, such that it would be worthwhile to engage in a business arrangement so you would be targeting their following, et cetera. the criteria are still in the works right now as we are rebuilding. >> is the website. parler and amy peikoff are on twitter as well. we are taking your phone calls. julia is in goldendale, washington, a republican, your comments are next. >> caller: i would like to see free speech on all social media. we need that. >> julia, define free speech a little bit more. any limits that you would place? >> caller: to be able to get on
3:09 pm
social media and speak your mind as a free american. >> amy peikoff? >> on parler, the way we think about it is embracing the entire first amendment of the constitution. you know, the first amendment is at issue in trump's lawsuit. but traditionally, we don't think of private businesses as, you know, being governed by the first amendment. nonetheless, parler's mission has always been to operate our platform in the spirit of the first amendment. of course, what that means is, with respect to legal speech, all legal speech that does not pose a threat to or violate the rights of other people, then the control of that speech should be in the hands of the individual user. this is where the other aspect of the first amendment comes in, which is freedom of association. we also want to respect the ability of users who come on
3:10 pm
parler to curate their own experience, to curate their own feeds. so a lot of people don't want to see the trolls. they do not want to see, necessarily, what a lot of people call hate speech. they don't want to see attacks on the basis of race, or sex, or sexual orientation or religion. so what we like to do is work with a vendor who is skilled at this and, you know, give the tool to the individual user to decide whether to turn on a filter for this or not. and they can turn it on in a way that puts a splash screen over the content, and then click on the content if they want to go ahead and check our work and see if we are being objective about this. but we want to let the user, the individual user, control whether they see legal speech or not on parler. as for all platforms having free speech, you know, some of that kind of discussion can lead towards, again, what i am worried about, which is that you are going to come in with a broad government regulation that is
3:11 pm
going to have government supervising what these platforms do. and i would rather leave it to the free market. we at parler think in the long term we are going to win with the free speech model. but let the market decide that. >> we asked what makes somebody an influencer. what makes somebody a troll? >> trolling, again, we have a narrowly defined category for that. and it is attacks, where somebody is just making attacks on the basis of race, sex, sexual orientation, religion. these are ad hominem attacks, if you have taken your introductory logic class. they do not add anything of substance to productive discussion. many people complain you go out on the internet and it's the wild west. it is horrible and unpleasant. people particularly complain about twitter with respect to that. but there are surely other places as well. people realize it doesn't help if your goal is to improve civil
3:12 pm
discourse among people with competing viewpoints. it is logically irrelevant. yeah, it would be nice to have a tool to filter that out of your feed. that's what we try to provide for you. >> steven of edgewater, florida, is next. democrat, good morning. >> caller: yes. my concern is, like, i can't go into walmart and do something dangerous, go into the sporting goods department and get a gun and ask for bullets, put bullets in it and do something dangerous. both myself and walmart would end up in a legal situation. i think the same thing should apply to social media. if a group uses parler or facebook to organize something like january 6th, i think you guys should be subject to legal
3:13 pm
responsibility. so that's why i agree with facebook's attitude on censoring or, you know, cutting off donald trump's facebook account, to prevent this kind of dangerous stuff from going on. >> ms. peikoff. >> that's packed with a lot. first of all, donald trump himself is not responsible for all of this content; i don't believe that. he would be welcome on parler. second of all, with respect to facebook: if you read, again, the news that has come out in the months since january 6th, facebook had tons of this type of content on their platform. if you look at the indictments for the people who have been charged with crimes in connection with january 6th, facebook is all over those documents. i am not going to do a numbers comparison, because we are a smaller platform. but there was plenty of that content over there, and facebook
3:14 pm
has plenty of resources and has actually bragged -- i am going to use the word -- they have bragged about their ability to identify and filter so-called hate speech. and they seem to be just asking congress to regulate it, which, again, i think is wrong. justice clarence thomas made a very interesting statement in connection with the denial of certiorari in malwarebytes. in that statement he argued for a narrower interpretation of section 230. this is what i hope is going to come out of the trump lawsuit, that it will be put into practice, with other lawsuits, too. i hope they are going to listen to this. he talks about distributor liability: that a platform like parler or like facebook could be held liable for their actions as a distributor of content. and the standard that he articulated in there is whether the platform knew or
3:15 pm
constructively knew of this rights-violating content. you have to know that, as a social media platform, it is impossible to be perfect at this. right? i think that's another thing that congress knows. it is impossible to be perfect at this. if they do put an impossible standard in front of the platforms, that gives them -- >> you can see the rest of this washington journal segment on our website right now, we will take you live to a senate judiciary subcommittee hearing on voting rights. >> i do want to thank all of you for being here today. yesterday, president biden issued, in effect, a call to action, an urgent plea to protect democracy amidst an onslaught of state laws restricting voting rights. he declared that,


