Opinion: How Supreme Court should rule on Texas and Florida social media laws

If the states' laws are upheld, barring platforms from removing content will make the internet enormously worse

FILE – This combination of photos shows logos of X, formerly known as Twitter, top left; Snapchat, top right; Facebook, bottom left; and TikTok, bottom right. (AP Photo, File)

The Supreme Court heard oral arguments recently in two cases that could have a profound effect on the future of the internet and social media.

The cases — NetChoice v. Paxton and Moody v. NetChoice — involve laws in Texas and Florida that prohibit social media companies from removing content from their platforms, clearly violating the 1st Amendment rights of private companies. If these laws are upheld, they will make the internet and social media enormously worse.

The Texas law bars social media platforms with at least 50 million active users — such as Facebook, X (formerly Twitter) and YouTube — from removing content based on the views expressed. The Florida law prohibits them from removing speech by political candidates and “journalistic enterprises”; it also requires them to notify users of any content moderation decisions and provide an explanation.

Texas and Florida adopted these laws based on a widely promoted but unfounded perception that social media platforms are more likely to remove conservative expression. Researchers have found no evidence to support this belief.

But even if there were a basis for concern, social media platforms — like all other media — have a 1st Amendment right to decide what speech to convey.

Half a century ago, in Miami Herald Publishing Co. v. Tornillo, the Supreme Court unanimously invalidated a Florida law that required newspapers to provide space to political candidates who had been attacked in print. The court emphasized that freedom of the press allows a newspaper to decide what to include and exclude.

The government can’t regulate speech on privately owned social media platforms any more than it can edit a newspaper. Several justices, including conservatives Amy Coney Barrett and Brett M. Kavanaugh, made similar points during the oral arguments.

The U.S. 11th Circuit Court of Appeals declared the Florida law unconstitutional on this basis. It also found that requiring platforms to justify every decision to remove material would make content moderation impossible. In considering the Texas law, however, the 5th Circuit Court of Appeals ruled that social media companies are, like phone companies, “common carriers” and can therefore be prevented from removing content.

The problem with this argument is that social media platforms are not and never have been common carriers that simply transmit everything that is posted. Nor would anyone want them to be.

Social media platforms constantly remove awful content. Facebook removes 3 million pieces of hate speech a month, an average of more than 4,000 per hour. And yet no reasonable person would accuse Facebook of being too effective at removing such speech.

Beyond hate speech, social media companies also remove a wide array of other awful expression, including violent and sexually explicit content, much of it protected by the 1st Amendment.

Underlying the two cases heard by the Supreme Court is the broader question of whether state governments should regulate the content of social media and other online platforms. Many states, including California, have in recent years adopted a plethora of laws trying to control these media. But the platforms are national and indeed international, making it undesirable to subject them to countless regulations by individual states.

The internet and social media have changed the very nature of speech by making it possible for anyone to speak immediately to a mass audience. The downside is that their speech can be hateful, harassing, false and harmful in other ways. One approach to this problem is extensive government regulation of what appears on social media. That would clearly violate the 1st Amendment, however, and we all should be concerned about giving government such power to regulate what we see and hear.

An alternative is to prohibit content moderation, requiring social media platforms to carry everything unless it falls into narrow categories of speech that is not protected by the Constitution. That is what Texas and to a lesser extent Florida are trying to do. But these laws also restrict the speech rights of private companies and promote even more hatred and violence on social media.

The best option is to leave content moderation to social media companies and encourage them to do a better job of it. This avoids the 1st Amendment problems of government regulation and the nightmare of unregulated social media. And that is the path the Supreme Court should take in the NetChoice cases by finding the laws in question unconstitutional.

Erwin Chemerinsky is a contributing writer and the dean of the UC Berkeley School of Law. ©2024 Los Angeles Times. Distributed by Tribune Content Agency.