On the vital importance of Section 230
Section 230 of the Communications Decency Act is what Trump’s recent executive order targets regarding censorship by Facebook, Twitter, Google, etc. I offer this VERY long post as a way of introducing Section 230 and explaining why it’s important. This was originally going to be a comment on another thread, but it was too long to post there.
0) Is there a difference regarding content censorship depending on whether the service is classified as a “platform” or a “publisher”?
1) You can read the entire text of 230 here: https://www.law.cornell.edu/uscode/text/47/230. It’s VERY short. In fact, the salient portion of the law is 230(c), which you can read in about 8 seconds.
2) Platform versus publisher. Do a search in Section 230 for “platform.” You will find the term doesn’t show up. Many people seem to have the notion that a service gets classified as either a platform or a publisher. Classified by whom? To the best of my knowledge, there is no government agency set up to decree from on high, “You, business XYZ, are hereby classified as a platform. You, ABC, are a publisher.” Indeed, as far as I am aware, the notion of a “platform” and what that classification would entail does not exist anywhere in the U.S. Code, although many people seem to believe that it does. I am happy to be proven wrong on this, of course.
3) What does Section 230 actually say? 230(c)(1) says “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” OK. So this is very straightforward. If my service carries information provided by someone else, I am not treated as the publisher of that information.
4) We need to pause here and ask: “Why does this law exist?” Many laws are “lessons learned in blood,” as they say, and this is one of them. If you look at the history of Section 230, its entire reason for existing is what happened in two famous cases involving Prodigy and CompuServe. Remember those? Let’s go back even further, actually, to bookstores. The law recognized that a bookstore should not be held liable for libel because it would be impossible for a bookstore to know and vet the content of all the books it carries. The publisher of a book can be held liable, but not the bookstore. CompuServe was taken to court because someone posted something libelous on their service. The court said (basically), “CompuServe doesn’t do ANY kind of filtering on their service. Just like the bookstore, they don’t know what all the content is, so they can’t be held liable.” Yay. Then what happened? Prodigy got sued for libel. Odd? Why would someone sue them when CompuServe was found not liable? Ah, because CompuServe was let off precisely because the court said they didn’t moderate any content. Prodigy did. In that case, the court found that yes, by moderating content and removing posts that Prodigy deemed offensive or in bad taste, Prodigy should be considered a publisher and could be held liable.
4a) I want to pause here to emphasize this point and note what the world would look like today without 230(c). If you had a blog and you allowed comments on it, you would have two options: (1) allow EVERY comment, including blasphemy, pornography, and, yes, ironically enough, libel; or (2) moderate your blog and remove the porn, the blasphemy, the personal attacks against your wife and family, etc. As soon as you did that, though, you would immediately become responsible for any potentially illegal content on your blog, e.g., libel. Again, this is not some hypothetical bad scenario; it is quite literally the finding of the courts in those two cases.
4b) So this is the reason 230 exists. The lawmakers said, “Wait a minute. This is crazy. We don’t want to disincentivize people from filtering out content that they believe is objectionable and thereby leave them open to libel claims. That’s NUTS.”
5) So what was their solution? A very straightforward law which says: computer services which show content from other people can’t be considered publishers of that content. Period. There’s no legalese here or emanating penumbras. It’s VERY straightforward.
6) The second paragraph, 230(c)(2), says “No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected…”
6a) Note that a service can block material that is constitutionally protected speech. Why? Because the blocking is NOT being done by the government.
6b) Note that “otherwise objectionable” things can be blocked. This is VERY broad on purpose.
6c) “Good faith.” Some in Congress are focusing these days on those words and saying, “Ah ha! Big Tech is blocking things in bad faith,” or else, “They are blocking things that aren’t objectionable,” and therefore, “they must be held liable for this blocking!” or, “therefore they are a publisher.” A couple of thoughts on that.
6c.1) As to the latter “by doing this kind of blocking in bad faith you become a publisher” argument, that’s certainly not the case. 230(c)(1) is crystal clear, as we have already seen, that providers are not publishers of other providers’ content. Full stop. There is nothing anywhere in the law that would prevent 230(c)(1) from always being in effect.
6c.2) As to the charge that you can be held liable for blocking content in bad faith, or for blocking content that simply is not “objectionable,” I would offer the following points. First, once again, it is of paramount importance that we remember the conditions under which the law was passed. The Prodigy case looms large here. Congress intended to say, “Hey, courts. We are telling you flat out that providers are NOT publishers. Further, to make it VERY clear, you can’t be held liable for censoring objectionable content the way Prodigy did.” Indeed, we can read the words of the author of this amendment from the floor of Congress: “Mr. Chairman, our amendment will do two basic things: First, it will protect computer Good Samaritans, online service providers, anyone who provides a front end to the Internet, let us say, who takes steps to screen indecency and offensive material for their customers. It will protect them from taking on liability such as occurred in the Prodigy case in New York that they should not face for helping us and for helping us solve this problem. Second, it will establish as the policy of the United States that we do not wish to have content regulation by the Federal Government of what is on the Internet…”

The second thing that should be pointed out is that it is a logical fallacy to claim that you are liable for blocking non-objectionable content or for blocking in bad faith. The most you can say is that the law is silent on this topic. It is fallacious to argue that because A implies NOT B, NOT A therefore implies B. In this case: good-faith blocking = not liable, therefore bad-faith blocking = liable. Of course not. If I pass a law that says playing in a public park is NOT trespassing, that does not imply that NOT playing in a public park IS trespassing. Or suppose I own a farm and have fenced in a bull. A neighbor climbs the fence and gets gored by the bull, and I am sued because my neighbor says, “The existence of the fence implies you knew the bull was dangerous!” A law might be passed that says, “You will not be held liable for putting up a fence to try to protect your neighbors.” That law in no way implies that I will automatically be held liable for NOT putting up a fence.
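For those who want the logic spelled out symbolically, here is a minimal sketch of the fallacy. The letters A and B are just my own labels (A = “the blocking was done in good faith,” B = “the provider is liable”); they appear nowhere in the statute.

% A minimal sketch of the "denying the antecedent" fallacy described above.
% A = "the blocking was done in good faith", B = "the provider is liable".
% These letters are my own labels; they appear nowhere in the statute.
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}

What 230(c)(2) actually states:
\[ A \implies \lnot B \qquad \text{(good-faith blocking means not liable)} \]

The fallacious inference (denying the antecedent):
\[ \text{from } A \implies \lnot B, \text{ conclude } \lnot A \implies B \qquad \text{(invalid)} \]

Counterexample: take $A$ false and $B$ false (the blocking was not in good
faith, and the provider is nevertheless not liable). Then $A \implies \lnot B$
is true while $\lnot A \implies B$ is false, so the conclusion does not follow;
the statute is simply silent on that case.

\end{document}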
On another thread on this topic, someone said, “I regard the issue of Google (which is used by so many) controlling which site/pages users see in a search in order to wield political influence extremely troubling and problematic…” I agree 100%. But I think we can all agree we need to be VERY careful about calling for the government to have MORE power to punish corporations (and ultimately individuals), which is what weakening 230 would do. Remember what the authors of the amendment said: “…it will establish as the policy of the United States that we do not wish to have content regulation by the Federal Government of what is on the Internet.” Well, if the government can punish a company for not regulating content in the way the federal government wants, isn’t that exactly what we end up with?
I understand the frustration with the Googles/Twitters/Facebooks, but let’s be careful we aren’t like William Roper in A Man for All Seasons, declaring we would cut down every law to go after “Big Tech.” Because More’s reply was right: “Oh? And when the last law was down, and the Devil turned ‘round on you, where would you hide, Roper, the laws all being flat? This country is planted thick with laws, from coast to coast, Man’s laws, not God’s! And if you cut them down, and you’re just the man to do it, do you really think you could stand upright in the winds that would blow then? Yes, I’d give the Devil benefit of law, for my own safety’s sake!”