December 26, 2024
The U.S. Supreme Court will hear a case this year on Florida Senate Bill 7072, the 2021 state law regulating social media platforms. It’s yet another example of government reaching too far to squelch free speech.
At the time the bill passed, Gov. Ron DeSantis and legislative leadership lauded it as taking back the public square and standing up to big tech censorship. DeSantis said when he signed the bill, “Florida is taking back the virtual public square as a place where information and ideas can flow freely.”
The law prohibits privately owned social media platforms from removing content based on viewpoint. Texas’ Legislature passed a similar bill, which the Supreme Court is also reviewing.
The Supreme Court’s opinion in the cases — Moody v. NetChoice and NetChoice v. Paxton — is expected to be highly consequential for the future of online speech, freedom of association and content moderation.
Reason Foundation and several co-signers submitted an amicus brief to the Supreme Court, agreeing with NetChoice that the Florida and Texas laws violate the First Amendment.
We argue that the editorial decisions and content moderation policies of social media platforms, large or small, are exercises of the freedoms of speech, press and association protected by the First Amendment. Efforts to force such companies to convey, or associate with, viewpoints or persons from which they would otherwise choose to disassociate clearly violate the First Amendment.
The Texas and Florida laws passed when state officials were concerned that Facebook and Twitter (now X) were removing content that expressed conservative views, particularly on the controversial issues of the COVID-19 pandemic responses and the validity of the 2020 election results. But the critical First Amendment issues upon which the court will rule are not related to the nature of these specific views or controversies or anyone’s right to speak about them.
Instead, the substance of the court cases focuses on the rights of private organizations to build audiences by facilitating and moderating (many would also say censoring) content in ways their users find beneficial. Supporters of the state laws and lower court rulings repeatedly lose track of this distinction.
The decision of whom and what content to include or exclude on a social media platform is fundamental to the First Amendment rights of those organizing the platform and the individuals freely choosing to associate or not with that platform.
The rise of social media in the 21st century has combined the ways we communicate with the ways we acquire information from media sources, often in novel and disorienting fashion. But the vast body of precedent from previous cases shows that this disruptive process and its propensity to raise legal questions are far from new.
Courts have decided related questions in cases involving, but not limited to, print outlets, broadcast and cable television media, public utilities, political advocacy groups, religious organizations and shopping malls. They have found in favor of these private organizations’ rights to include and exclude content and speakers with striking reliability.
Moreover, decisions by social media platforms to selectively include some speech or speakers and selectively exclude other speech or speakers fall under the rubric of protected speech of the editor or owner of the platform. That’s what news editors do every day — selectively decide what to print or air and what not to.
Critics of tech companies often present social media as having little diversity and being susceptible to the whims of a few corporations. But this is a restrictive and misleading view of the still-expanding array of online choices and the different ways social media communities are and can be arranged.
Some platforms are more pluralistic and diverse than others, but that does not mean they must tolerate any and all persons or speech. And they are free to organize and prioritize how content is presented on their platform in ways they see fit.
Keep in mind they are competing in a market and are trying to please their customers, who ultimately decide the platform’s fate. That is the freedom of association that is protected by the First Amendment.
What’s more, the so-called common carrier doctrine, used at times to require government-regulated utilities and broadcasters to accommodate minority viewpoints, makes no sense in a dynamic technological environment that features firms with temporarily large market shares but an unprecedented array of outside options where individuals can speak.
Critics’ overly homogenous and static view of social media may have contributed to the misguided idea that the largest platforms should be declared common carriers. But such a policy would create perverse incentives by punishing success. What’s more, any practical implementation of such rules strains common sense.
The temporary dominance of one or two platforms does not at all impede new competing platforms or other methods of exercising individual speech.
As we said in our brief: “(T)he shifting choices and politics of different platforms create a constant churn that makes arguments based on market dominance, essential facilities or other antitrust analogies largely absurd.
“Twitter has transformed into X, with different views on content moderation, and new social media platforms come and go, with something for everyone, whether it is the proliferation of Reddit subgroups, Truth Social, Instagram, Threads, Tumblr or even 4chan for those who prefer the Wild West of social media.”
Finally, governmental benefits provided to corporations and interactive platforms do not require these organizations to forfeit other First Amendment rights. The protections Congress afforded digital platforms from liability for some third-party speech, Section 230 of the Communications Decency Act, are a separate legal matter. Congress enacted Section 230 as a deliberate matter of policy to clarify the liability of social media platforms and their users for speech shared on those platforms. Those protections were not, and cannot be, conditions that require sacrificing First Amendment protections.
Again, in our brief we explain the practical user experience of social media platforms and make clear the difference between Section 230 protections and the right to moderate content as a matter of common sense.
“The fundamental point of Section 230 is that platforms do not adopt the speech of others merely by transmitting it to others.” … “Certainly, by excluding some content or users, the platform is saying something about what is allowed to remain. But what they are saying may be ‘this is interesting,’ ‘you might want to see this,’ or ‘this is acceptably tolerable’ rather than ‘this is right’ or ‘I agree.’”
The Supreme Court should find Florida SB 7072 unconstitutional. Such a ruling would not make vigilance or concern over the moderation decisions of social media platforms unwarranted; platforms could certainly still make bad or unfair moderation decisions. But that is their right. Our expanding social media environment provides private and individual solutions to these concerns.
Consider the plight of a group with strongly dissenting opinions — large or small, and irrespective of ideology. The country’s media have progressed through stages dominated by print, broadcast, cable television, online journalism and now social media. At almost every stage, the options open to such groups to speak to each other and make their views available to the world at large have grown in variety and shrunk in cost.
The opportunities for others to engage or exercise their right not to engage have expanded as well.
These positive trends are the result of an often messy and occasionally harrowing process that heavily favors the rights of individuals and groups to speak and decide to whom they will and will not speak. Trying to stop this process with politically motivated and short-sighted laws is counterproductive.
Adrian Moore is vice president of Reason Foundation and lives in Sarasota. Max Gulker is a senior policy analyst at Reason.