Readers' responses on Parler
Last week, I wrote a summary of the events leading to Parler's shutdown and received a lot of thoughtful responses about what happened. Here are some of those thoughts from DL readers.
Readers were responding to these questions:
- Do you think the tech companies were right to shut down Parler?
- Do you think this is a “special case,” or is this setting a precedent for how tech companies will operate in the future?
Yes, they were right to do what they did. It was an emergency.
It is setting a precedent, but it can become a "special case" if government resumes its role. Angela Merkel is right; it is government's role to make the rules but effectively that responsibility has been abdicated to business just like in a lot of other areas. I think this article summarizes it really well - Reimagining Capitalism in the Shadow of the Pandemic (hbr.org)
It is a very slippery slope. How much is too much? How do we know when something crosses a line? Was Parler given a chance? Will others be given a chance? Will someone protesting abortion rights, or protesting against abortion rights protestors, be able to do so? Gun rights? Black Lives Matter?
I think the tech companies have to operate this way, and will. First, nearly all of management is anti-Trump, anti-conservative, anti-traditional culture. They do not interact with people who have a different world view, set of experiences, and ideas of what's "natural." 49% of America voted for Trump. Is there any prestigious college where even 10% of the faculty share that point of view? Or 20% of the students? Second, they will operate this way because they have a lot of support, and the people who don't like it can't do anything.
I don’t know what Facebook, Twitter, etc. did this summer when BLM protestors, or others who claimed to be associated with them, were turning violent. But my gut instinct is not much. This is not to excuse President Trump’s actions. They really were very bad. But people find a way anyway, for example podcasts, to share nasty stuff. I’d rather have nasty stuff visible where it can be rebutted, for example with ads.
I don't think it was ethical for big tech to take down Parler. I haven't used the app before, but if I am to go by your description it doesn't sound like Parler is any worse than Facebook in terms of extremist content. Still, I think these cloud/storefront providers have every right to turn customers away. If tech companies couldn't decide who they do and do not serve, or what they do or do not allow on their platforms, for me that would mess with a bunch of other IRL precedents as well. It's a new legal frontier for sure.
I don't think this is a one-time event. Twitter, for example, has been temporarily suspending other politicians' accounts as well. Who knows when we'll see another permanent ban. On the other hand, it could be a test that they will regret and change course in the future, being the agile enterprise that they are.
I've seen a lot of web content of people expressing their opinions about the limits of free speech, but rarely do people reference the long history of legal precedent on the subject. A concept I remember from elementary school is "clear and present danger." In practice, courts tend to weigh the evil of censorship against the harm of the speech being punished. I think this is roughly what major tech companies are doing in response to increasing instability in the world.
I agree with David Sacks and Naval. It was a mistake to shut down Parler, and the big tech companies are going to regret it as the government cracks down harder on technology platforms.
Were tech companies right to shut down Parler? This is a slippery slope. How do we know when something crosses a line? Every company with a comments section potentially could have “hate speech” in the view of someone present on their site. One wonders how much is sufficient to merit a shut-down. One person’s hate speech is another person’s cry from the heart.
While we’d all like clear-cut guidelines, I think the tech platform companies have to operate with a degree of vagueness. They can’t claim to have perfect definitions, because there’s always something new that comes up. Buyer beware: you need to read the fine print from your providers and choose the one you’re most comfortable with. But also, my impression is that many managers and employees at tech companies share a homogeneous cultural view that is different in many ways from the America of 30 years ago. There is likely some physical separation as well: the techies do not vacation in Branson, and may not be interacting much at all with people from middle America. 48% of America voted for Trump. That 48% seems to be rarely present when there are discussions about closing tech company platforms.
It is very hard to keep up with what the various platforms have done over the years. This summer, there was a lot of unrest in various cities. I did not get the sense Facebook was closing down groups or stopping people from using Messenger when they were participating in violent acts. One move I liked was Facebook posting public service and factual announcements about the election. I’d rather have nasty stuff visible where it can be rebutted than offshore on a Russian platform.
Yes, I think if someone violates the TOS, then the company has a right to shut them off. This one was just highly visible, but it happens pretty often, especially in the app stores and marketplaces. Companies need to be able to regulate who they are doing business with. Although personally I am a little torn, because I believe in free speech; but just as you cannot yell "Fire" in a crowded building, you should not be able to incite violence online.
I think it is business as usual. If you violate Apple's terms for the app store, they remove you or refuse to publish your update. This area needs to be regulated, but if the government refuses to regulate "news" channels, then I see this happening more often. People will speak with their wallets or the time they spend using the application.