The Constitution, like most laws and policies, needs updating from time to time. Why? Because few lawmakers, contrary to what they claim, can see decades into the future. Legislation, regardless of how high or low it sits on the governmental totem pole, is largely written to deal with current or near-future issues. And many, if not most, of our laws, policies, decrees, and statutes are 20, 30, 40, even 230 years out of date.
I am far from advocating pitching the Constitution. However, I think the time has come to take a good, hard look at what it has spawned and how it has been interpreted. The same goes for many laws across the country.
In particular, I would like to address the Communications Decency Act (CDA) of 1996 (Title V of the Telecommunications Act of 1996), and specifically the provision now known as Section 230. While 1996 was not that long ago, this section is a classic example of legislators' inability to see into the future. It was written long before social media reached its current state and before the present battleground between the platforms and lawmakers took shape.
What is going on with social media today presents a real conundrum for the First Amendment: should social media platforms be responsible for controlling (censoring) content? There is a strong argument that freedom of speech should be respected, regardless of what the content is. That debate has raged for decades, if not centuries.
Then, of course, there is the argument around illegal or harmful content, lies, and speech that creates panic. Social media platforms should be able to censor content that would be illegal anywhere else. In other words, things such as fighting words, obscenity, libel and slander, crime-related speech, threats, and the like are not protected under the First Amendment. Therefore, social media platforms have a responsibility to respect what is protected and to censor what is not.
Lately, there has been much discussion about what social media companies are responsible for. That is where Section 230 comes into play. What seems to have pushed this over the edge was Twitter's suppression of a NY Post story about Hunter Biden. Spotify was also in the spotlight for removing content, including a Joe Rogan episode featuring Alex Jones. There are other cases as well.
The major discussion is whether these platforms should be considered "editorial" because of their role in controlling and censoring content, and whether they exhibit bias in how they apply that censorship. In other words, are they publishers or platforms?
Section 230 makes the legal distinction between Internet publishers and platforms and protects the latter from liability for the content they host. Called into question is the degree of this protection.
Social media is a digital environment where people can freely interact in a variety of ways: discussions, images, video, and more. Twitter, Facebook, Google, et al. are simply platforms that accommodate this exchange or allow the dissemination of data. In effect, they are nothing more than enablers, or so they say.
That may have been their intent early on. Lately, however, they have been acting more like censors than neutral hosts; hence the fracas over Section 230.
On the table is whether Section 230 should be amended to reflect how social media platforms have evolved. It is becoming increasingly evident that the section needs a rework, with added or revised wording that applies directly to social media platforms.
However, there is an argument that social media platforms can censor whatever they want, just as businesses can refuse service to anyone as long as the refusal is not based on illegal grounds, such as race or sex.
If, in my position as an editor of this eDigest, or in my university or IEEE lectures, I present an editorial damning white supremacists, and a member of such a group, the Proud Boys for example, wants some counter-visibility, I am under no obligation to accommodate them. That is because a) I am working for a private company or university, and b) they are not a legally protected class. As far as I know, Facebook and Twitter are private companies, so they can do the same thing.
In the end, there is no doubt that social media companies should bear some responsibility for content. However, unless the content originates with them, they are not publishers. Editing content is OK, even if the originators are not happy about it; material placed on a platform willingly is subject to editorial scrutiny.
The real slippery slope is allowing Congress to make decisions about how social media operates. The Internet Society recently cited YouGov research reporting that nearly 70 percent of Americans do not believe politicians have the knowledge or technical understanding needed to regulate the Internet effectively. That lack of understanding has already given rise to proposed legislation that could have serious implications for the Internet we all rely on.
Business Insider reports that while both Senate Republicans and Democrats want Section 230 reform, it is for vastly different reasons.
Generally, Republicans believe the current moderation has gone too far, and Democrats want more content throttled, according to the publication.
The past few months have been harrowing. After today's election, either much will change or nothing will. In any event, we may have a clearer picture of the fate of Section 230.