
Washington Journal: Amy Peikoff, CSPAN, July 15, 2021, 2:52am-3:36am EDT

2:52 am
>> "washington journal" continues.
2:53 am
host: our conversation now on social media and free speech with amy peikoff, chief policy officer for parler. explain what parler is for those who may not know. guest: parler is an up-and-coming social media network, and we differentiate ourselves from other social media platforms around two prongs. first of all, we promote free speech, and by that, i mean with respect to all legal speech, speech protected by the first amendment of the constitution, we believe that the users should be the ones to decide what they're going to see or say, etc. the second prong is around user privacy, and we, unlike other social networks, do not track you across websites, and we collect and use the minimal data possible to provide service and to also have some sort of a
2:54 am
monetization model. host: when did parler get started and how many people use it today? guest: it started in 2018. i do not have exact user stats today. as you know, we were deplatformed unjustly by amazon and others back in january of this year. at that point, we had between 15 million and 20 million registered users, and we had to rebuild everything. we rebuilt the code base from the ground up, so we are in the ramp-up phase, recovering from that blow. host: why were you deplatformed? guest: allegedly because we had content on our site that contributed to the events of january 6 at the capitol, but as we have seen in the months since, the inciting content was all over the internet on other
2:55 am
platforms. there were stories in newspapers of all ideological leanings about facebook, twitter, particularly facebook, so the content was everywhere. the other thing that has come out in the months since is that parler had referred many of the examples, dozens of examples of this content, in the weeks leading up to january 6, so the argument that we were somehow irresponsible with respect to this content, that it had something to do with the events of the sixth, that we somehow aided it, that was a disingenuous argument, and people still hold it. i could understand if maybe they thought that when they were reading the trolls on twitter about us or something, but i think the record has shown since then that it isn't a valid argument. host: what about guidelines for the parler user today, and how much content moderation do you do? guest: so, the guidelines just have to do with illegal speech, speech that is a rights
2:56 am
violation. otherwise, we have something called a trolling filter, which will filter out what some people call hate speech, but what i have categorized more objectively as attacks on the basis of race, sex, etc. that trolling filter, on the web and android app, is something that is in the individual user's control to turn on or turn off. apple still requires, and i hope to change their mind, but apple requires that hate speech be removed from all ios apps, so the trolling filter is turned on automatically on the ios app. with respect to all legal speech, anything that doesn't pose a threat to or violation of the rights of other people, that speech, we believe, should be controlled by the individual user, and the individual user
2:57 am
should decide if it is in his or her feed, and whether they look at that information or any of that. host: for folks more familiar with twitter or facebook, how do those guidelines compare to the amount of moderation and the rules on facebook and twitter? guest: facebook and twitter have taken it upon themselves, this is what is at issue in the trump lawsuit, right, they have taken it upon themselves to remove or maybe reduce the spread of so-called hate speech and also what they deem to be misinformation, and it is going to be very interesting to see going forward whether that practice is going to be scrutinized in the law, or maybe it is going to be further encouraged in the law if lawmakers get their way with respect to augmenting section 230. host: amy peikoff, let me invite the viewers to join us before we get too far into the conversation, if you would like to talk about big tech,
2:58 am
parler, democrats, (202)-748-8000. republicans, (202)-748-8001. independents, (202)-748-8002. as folks are calling in, amy peikoff, dig a little bit more into that trump lawsuit, and explain what section 230 is as you do that. guest: sure, ok, a big question. to be clear, i am a never-trumper, so here i am, in support of this lawsuit, and the reason i support it is, first of all, i believe it is correctly framing the problem. there may be some improvements that can be made in the way the case is made, and that is what was argued in "the wall street journal" recently, explaining how the case could be made stronger, but the fundamentals of the case are good in this sense: they are saying the platforms are acting as state actors. we have section 230 immunity, which i will explain in a minute.
2:59 am
there have been threats to punish these actors in the case that they have not exercised that immunity, that they have not censored the content in accordance with their wishes, and there has even been collusion with certain bureaucrats in the federal government with respect to their choice of targets for censorship, for either the removal or restriction of reach of content. so, the lawsuit, i believe, is correctly framed. the other thing about it is i think it has a chance to give us the proper remedy to this problem, because even though i do think some government action is necessary to address the problem, i think the problem itself comes from an entanglement between tech and government, and if we are not
3:00 am
careful, we are going to propose as a solution to the problem something that will make it worse. in fact, it gives us big brother out of george orwell's "1984," and i can explain more about that. what is it about section 230 that potentially poses a problem? it is a code section that grants legal immunity to platforms, social media platforms like facebook, like us. they are immune from liability for any content that is generated by users, that is put onto the platform by users. you cannot have such a thing as a platform, really, unless you have this type of immunity. i want to be clear that the core principle of section 230 is something i agree with, but i agree with justice thomas and others who have recently said that this immunity has been interpreted in an overbroad
3:01 am
fashion, and it has left the platforms largely unaccountable for their contributions, whatever they are doing as distributors to distribute this content, and the trump lawsuit has a chance of holding them accountable for their conduct with respect to the restriction of the content. that is what he is concerned with, and to do so in a way that is in accordance with justice thomas' interpretation of section 230. that is the way to go. if you have a piece of legislation, if you start to classify them as common carriers, you are going to create further entanglement between big tech and government, and that is dangerous if we are concerned about maintaining our privacy. host: a lot of section 230 focus
3:02 am
in the big tech industry, and it states that no provider or user of an interactive computer service shall be treated as a publisher or speaker of any information provided by another information content provider. those are the words that have caused so much focus and contention. we will talk about it for an hour with amy peikoff, chief policy officer of parler, and plenty of calls already. joyce is up first in new jersey, a republican. good morning. caller: good morning. thank you for taking my call. i did want to tell you that i feel like you should have identification when you go vote so everybody knows that you are a participant in voting, and i also want to add one thing that i wish vice president is --
3:03 am
host: we are talking about social media now, and i know you are talking about voting rights in this country, but do you use social media platforms? caller: no. host: do you have concern about the entanglement between big tech and government? caller: yes. host: what specifically, if you have thought much about it? caller: no, i think everything is wrong. i am a republican and i voted for donald trump, and i think he was the best president we ever had. host: joyce out of new jersey. anthony out of new york is next. caller: thank you for the opportunity. my query is for the guest and the moderator, perhaps you can follow up on my concerns. there was an at&t executive who had a standing lawsuit, with about 500 americans who had been
3:04 am
unconstitutionally spied upon by big tech, and barack obama, when he came into office, his first signing statement was to dismiss that lawsuit, disallowing the american citizen to have redress against what is unconstitutional. the patriot acts one and two need to be rescinded, if not reviewed and admonished, and i would ask that you perhaps have kara frederick from the heritage foundation to speak on this topic. the woman is brilliant, and she knows so much about what is going on. i do believe that this is orwellian beyond what george orwell could have imagined, and i think america needs to fight back against big tech by getting rid of the social accounts, whatever it takes, but we are in deep trouble. like i said, mark klein, at&t executive, he had been on c-span before, and i would ask you
3:05 am
bring him back and ask his thoughts on what is going on, as well as mr. snowden or julian assange. there have been a lot of things done to try to frame those people, who are whistleblowers. i appreciate c-span, and you guys keep up the great work you are doing, and all the callers who are critical of you guys, it is because they don't understand the deluge of information you have to deal with each and every day. i am blessed to have you in my life. host: anthony, always appreciate topic suggestions for this program, happy to take your suggestions. amy peikoff, i will let you respond to the several issues the caller brings up. guest: we can tie the privacy concern into this, because my concern about social media and solving this problem correctly is the fact that social media is both a disseminator of crucial information and one widely used. a lot of surveys say most people
3:06 am
get their news from social media, but the second thing that our competitors do is collect all kinds of personal information about individual users. again, if we solve this censorship problem, and it is a censorship problem, i agree with "the wall street journal" on that, if we do not handle it in a way that backs off that relationship between big tech and government, if we, instead, you know, amend section 230 to give government more power to regulate these platforms, or we give government more power in the many ways proposed in congress right now to break them up and do other things, and to oversee mergers and all of these things, if we create more entanglements, then you have
3:07 am
government overseeing an industry that both disseminates and collects information, and that is a recipe for big brother. i do not agree with the caller in the sense that we are not at orwell's "1984" yet, but we are moving in that direction. this is a crucial juncture where we have to make the right decision. i believe a lawsuit, framed the way donald trump's is, has the potential to craft a narrowly tailored solution to the problem that will solve it without creating the further entanglement the caller is worried about. one issue with respect to privacy and government regulation is that to the extent that a government regulates a business, it can demand ordinary business records, and that is through something called the third-party doctrine. i have written about it extensively. i believe i have a common-law solution that is near and dear to my heart. this doctrine allows government
3:08 am
to obtain information from the third party, facebook or whoever, without a warrant, so no cause for suspicion. if we have more regulation of the platforms, we will have more government access to that information, and it creates a further risk. i think the seeds are planted already, but it creates a further risk in terms of government hacking. host: on twitter, they are asking, does your for-profit website require users to provide social security numbers, who are you selling that to? who are your financial backers, foreign or domestic? guest: financial backers, foreign and domestic owners, i cannot answer. rebekah mercer was the initial seed investor and has been involved more recently in
3:09 am
helping to get parler rebuilt after all of this, but i am not aware of the others, so i cannot answer those questions. we do not ask for social security numbers. sometimes someone is going to be part of an influencer network. one of the aspects on parler is that when we monetize by targeting ads to particular users' feeds, we give them a cut of the business model. on facebook, if i were to choose to place an ad that would target ben shapiro's audience, he would not get any cut of what i was giving in ad revenue. i was a small spender when i spent on ads there, but nonetheless he should get a cut. but as part of that, we would have to collect, i forget what the form is, the w-9 or
3:10 am
something for tax purposes, and then someone would have to give a social security or taxpayer id for entities on a form like that. but, no, we do not collect social security numbers. host: what makes somebody an influencer? guest: what makes somebody an influencer? that is the question everywhere out there. as we are rebuilding, another thing we are doing at parler is rethinking what an influencer is, and, you know, in some sense, it has to be an agreement between both parties who want to be involved in the arrangement. it has to do with having a certain number of followers, such that it would be worthwhile for a business arrangement where you would be targeting their following, and the criteria are still in the works right now. host: parler.com is the website,
3:11 am
@parler_app on twitter, and amy peikoff is taking your calls to talk about parler. julia is in washington, a republican. good morning. you are next. caller: good morning, i would like to see free speech on all social media. we need that. host: define free speech a little bit more, any limits you would place? caller: to be able to get on social media and speak your mind as a free american. host: amy peikoff? guest: on parler, the way we think about it is embracing the entire first amendment of the constitution. the first amendment is an issue in trump's lawsuit, but traditionally, we do not think of private businesses as being governed by the first amendment. nevertheless, parler's mission
3:12 am
has been to operate our platform in the spirit of the first amendment, and that is, with respect to legal speech, all legal speech that doesn't pose a threat to or violate the rights of other people, the control over that speech should be in the hands of the individual user. this is where the other aspect of the first amendment comes in, freedom of association. we also want to respect the ability of users who come on parler to curate their own experience, curating their own feeds, so a lot of people do not want to see the trolls. they do not necessarily want to see what people call hate speech. they do not want to see attacks on the basis of race, sex, sexual orientation, or religion. we work with a vendor skilled at this, and give their tool to the individual user to decide whether to turn on a filter for this or not. they can turn it on in a way
3:13 am
that puts a splash screen over the content, and then they can click on the content if they want to see it, checking our work and whether we are being objective, but we want to let the individual user control whether they see that speech on parler. as to all platforms having free speech, some of that discussion can lead to what i am worried about, which is that you are going to come in with a broad government regulation that is going to have government supervising what these platforms do. i would rather leave it to the free markets. at parler, we think in the long term we are going to win with our free-speech model, but let the market decide that. host: what makes someone a troll? guest: trolling, again, we have a narrowly defined category for that, and it is attacks, or someone making attacks, on the basis of race, sex, sexual
3:14 am
orientation, religion. these are ad hominem attacks, if you have taken your introductory logic class, and they don't add substance to productive discussion. many people complain, you go out there on the internet and it is the wild west, and people particularly complain about twitter with respect to that, but there are other places as well, and people realize it does not help if your goal is to improve civil discourse, and it is just logically irrelevant. yeah, it would be nice to have a tool to filter that out of your feed, and that is what we try to provide you with. host: stephen in edgewater, florida, is next. a democrat. good morning. caller: yes, my concern is, like, i cannot go into walmart and do something dangerous, go into the sporting department, get a gun, ask for bullets, put bullets in
3:15 am
it, and do something dangerous. both myself and walmart would end up in a legal situation. i think it is basically the same thing that should apply to social media. if a group uses parler or facebook to organize something like january 6, i think you guys should be subject to legal responsibility, so that is why i agree with facebook's attitude on censoring or cutting off donald trump's facebook account to prevent this kind of dangerous stuff from going on. host: amy peikoff. guest: that is a lot. donald trump himself is not responsible for all of this content. first of all, i do not believe that. he would be welcome on parler.
3:16 am
second of all, with respect to facebook, if you read the news that has come out in the months since january 6, facebook had tons of this content on their platform. if you look at the indictments for people charged with crimes in connection with january 6, facebook is all over those documents. i am not going to do a number comparison because we are a smaller platform, but there was plenty of that content over there, and facebook has plenty of resources and has bragged, and has used that word, bragged, about their ability to identify and filter hate speech. they seem to be asking congress to regulate it, which i think is wrong. justice clarence thomas made an interesting statement, and in the statement, he argued for a narrower interpretation of
3:17 am
section 230. this is part of what i hope will come out of the trump lawsuit and be put into practice, with all the lawsuits that are at issue right now, i hope they listen to this. he talks about distributor liability, that a platform like parler or like facebook could be held liable for their actions as a distributor of content. and the standard that he articulated there is whether the platform knew or constructively knew of this rights-violating content. you have to know that as a social media platform, it is impossible to be perfect at this. that is another thing congress knows, it is impossible to be perfect at this. if they do put an impossible standard onto the platforms, that gives the government unlimited control, and that is part of the danger here. nonetheless, if you can show that facebook knew that content was on there, or constructively knew,
3:18 am
or that they should have known given the resources available to them, that they were being negligent somehow, then you should be able to hold them liable, and that is part of justice thomas's recommendations with respect to section 230. host: with respect to the hearings where jack dorsey and mark zuckerberg testified, were you invited to the hearings? how much access have you had to members of congress to make these recommendations that you have made this morning? guest: not at this point, we have not been invited. we would welcome that, of course. that is part of the argument, right, you have heard in hearing after hearing that the platforms are not doing enough to remove hate speech from the platform or doing enough to handle so-called misinformation. remember, with respect to the
3:19 am
misinformation, most of this is legal speech protected by the first amendment. the only people who should have control over that are purely private actors. when you have this immunity and you have congress saying, hey, you better act using the benefit of that immunity in the way we see fit, otherwise we might take it away completely, you have a state action problem. host: belleville, new jersey, harry, republican. good morning. caller: good morning -- hi, i don't know if i am mixing up apples and oranges, but i used to work for at&t. i remember when the government broke up at&t as a monopoly.
3:20 am
facebook and twitter threw trump off the air, and they are private companies, so i guess they have the right to do whatever they want, but at what point does facebook have this much power? as far as if you go into a restaurant and you are an annoyance, they have the right to throw you out. it is a private restaurant. but my point is, can facebook ever be broken up, like at&t was, with a monopoly on speech, in other words? host: harry, thanks for the question. guest: there are a couple of different questions in there. so, at what point is facebook not acting like a typical private entity anymore? that is at the root of this state action problem, and a case has been laid out where they talk about section 230 immunity,
3:21 am
which is, in our view, overbroad, and in connection with it, we have this pressure coming from government bureaucrats telling them to exercise their powers in ways they like and that they approve of, otherwise the immunity may be taken away or they might be broken up. they might be forcibly broken up, and when the platforms go ahead and start to remove types of speech or specific targets in coordination with government bureaucrats, in accordance with government wishes, we have a hard time saying that is purely private action anymore. that is what is hopefully going to be fleshed out completely in the trump lawsuit. if the lawyers do a good job presenting that case, we can learn a lot and potentially solve the problem. host: charles in strasburg, pennsylvania,
3:22 am
democrat, good morning. caller: yeah, i have a question about free speech. by free speech, do you mean that you can go on any venue like facebook, parler, and knowingly write speech about stuff that is not true, like joe biden is a pedophile? what do you say about that? if you are just lying or making up things, how can that be free speech when you are just damaging people by saying stuff that is not true? is that good for our country? it is like standing up in a theater and yelling fire. if you are basically lying -- host: got your point, charles. amy peikoff. guest: charles, you are right in the sense that this is not speech that should be protected by the first amendment, and it would be called libel or
3:23 am
defamation, and then the issue is, who would incur legal liability, who would be -- excuse me, i am mixing up libel and liability, and it is early. who would be legally liable for the libel in that case? it is the individual user who should be held liable for any defamation or libel they post on the platform. that is not legal speech; it is an infringement of the rights of others, and we have always supported civil liability for those cases. section 230 talks about the fact that if all a business is doing is providing a platform, and they themselves are not creating the libelous content, then they should not incur liability for it. host: about 10 minutes left with amy peikoff, parler's chief policy officer, parler.com if you would like to check them out. democrats, (202)-748-8000.
3:24 am
republicans, (202)-748-8001. independents, (202)-748-8002. i want to come to this article from cnbc, news from last week, house republicans laying out their antitrust agenda for tech giants, is the headline. this is the statement from house minority leader kevin mccarthy in a letter to republican colleagues about the framework they laid out, saying it would rein in big tech and their abusive practices, including changing the law so americans can challenge big tech directly for their infringement of free speech rights. it starts by taking away the liability shield big tech has hidden behind for too long. section 230 of the communications decency act would be changed to limit liability protection to moderation of speech not protected by the first amendment, and it would preclude big tech from discriminating against americans based on their political affiliation. it would also require regular reauthorization of section 230,
3:25 am
so congress may update regulations of the constantly evolving internet landscape. a lot there. anything you agree with? guest: i do agree that this is a real problem, and i do agree it is a censorship issue, so some sort of government response would be necessary, but i do believe the response they are crafting there, that you just read, has that danger of creating a further entanglement between big tech and government, which is the problem we had in the first place. we have big tech censoring content at the behest of government bureaucrats. the problem is only going to get worse with the types of provisions you have outlined. you may think, well, these bureaucrats who are in charge of
3:26 am
reauthorization every so often, that those people will not abuse their power, and they are just going to go ahead and do it in the spirit of the first amendment, like at parler, but government power seems to have this way of creating the potential for corruption. a lot of times, the bureaucrats charged with this don't get a lot of checks and balances against what they do. moreover, it is just wrong. i am a capitalist. i voted libertarian the last couple of elections, and i believe a truly private business, insofar as it is acting in a private way, should be left free to make decisions about running the business themselves. and the principle at the root of section 230 is a proper principle. it is something they have a right to. someone providing a mere platform for speech, who is not
3:27 am
knowledgeable about, and has no duty to have known, the exact content of the particular post or whatever, that person should not be legally liable. so the core principle of section 230 is correct, and the idea that government has the power to give it and take it away, and to pick winners and losers accordingly, that is one of the problems; it makes government one of the actors doing evil. ted cruz has gotten up a number of times and said they are given this immunity. the thing they are given, potentially, is an overbroad immunity, depending on how the section is worded or interpreted, and that overbroad immunity has done some wrong, but that can be corrected. justice thomas seems to believe it can be corrected merely by reinterpreting section 230 properly. i would go that route first. host: are we going to hear more
3:28 am
from the supreme court on this in the upcoming supreme court session? guest: i do not know if there is a particular case coming up on the docket. i am sorry about that. i am very busy with my stuff at parler, but i just heard of a case recently, i think it was only at the district court level, in which plaintiffs are allowed to bring a case against facebook for its role in allowing trafficking content on its platforms. this is a case that has the potential to call into question the interpretation of section 230. we will see, but it is a matter of when. host: in michigan, republican, good morning. caller: good morning. thank you for letting me have a phone call. right now, the government and the platforms are arguing in congress and trying to divide up the lines of power in their struggle, like you talk about.
3:29 am
my question is this: if you look forward to the next 20 years, do you see the possibility of the day where the government comes in and shuts down the platforms like they did in cuba? i will take your answer offline. guest: you know, sure, shutting them down entirely is a possibility. as it is right now, i think the government is very happy to have them open so they have the ability to apply leverage with respect to the published content, and also, as i said, under the third-party doctrine, which has not yet been addressed, we have not had the carpenter ruling expanded to address the third-party doctrine as such. the third-party doctrine gives them access to user data without a warrant. they do not have to have probable cause in order to access user data, so i do not know that they want to shut them
3:30 am
down. they want to keep them on, like the screens in "1984," and have that flow of information back and forth. host: do you have parler users in cuba? guest: i do not know that we have parler users in cuba. i know we have some in brazil, the u.k., and other places. i am not sure if that is legal there. host: have you ever experienced a foreign government trying to dictate what can be on parler, what cannot be? guest: not at this point. interesting question. host: have you thought about it? guest: i am trying to cross each bridge as i come to it. host: connie in new jersey, democrat. good morning. connie, you with us? medford, new york, then. paul, an independent. good morning. caller: good morning. amy, you are a breath of fresh air. i want to say, from january on to
3:31 am
the current time, i have been suspended from facebook for telling the truth. they do not like the truth. they accuse me of being a bully, and i would like to hear comments about that. guest: ok, so bullying is another category some platforms try to do something about. we on parler will leave it to the individual user, so we do have a different policy in that regard. i would assume that many of the users on parler, if you are a bully, they would block you, mute you, put on the troll filter, and maybe you would end up behind that filter and they would never see it, but again, what we believe is all legal speech should either be in the
3:32 am
feed or not in the feed according to the decision of the individual user. the individual user should not be outsourcing their thinking to anyone, no matter how many layers of oversight boards facebook would create. it is still people outsourcing their critical thinking to other people, and at parler, we think that is wrong. host: we have about three minutes left in the program. i will let you respond to this comment on twitter, dog canyon writing, "social networks should not be designed as arbiters of truth." guest: and i agree. that is the point i just made, so maybe my ear was burning or something, but, yeah, no, we do not take it upon ourselves to think for you. we encourage you to think for yourselves. we offer you tools that we think people who are thinking critically would use, but the decision is up to you. host: it is parler.com, amy peikoff, chief policy officer of parler.
3:33 am
3:34 am
3:35 am
legislation to protect the right to vote.
