Court cases test opposing views of online content moderation

Gopal Ratnam, CQ-Roll Call

Published in Political News

WASHINGTON — Four judicial cases involving social media companies and online platforms will test the Supreme Court’s views on two competing claims: The companies either engage in too much content moderation or they do too little.

The high court has agreed to hear arguments in two of the cases and is weighing whether to take up the other two. Whatever its decisions, the outcomes could dramatically change how social media companies and online platforms operate, especially in light of a stalemate in Congress, where lawmakers of both parties want to amend online content moderation laws but can’t agree on how to do so.

The court is hearing arguments this term in two cases asking whether social media companies can be held liable for third-party content on their websites. Those decisions could determine whether the companies can still count on a liability shield Congress provided in 1996 to nurture the fledgling internet. The cases test congressional intent in that law.

The court has yet to decide whether to take two other cases involving Florida and Texas laws instructing social media companies not to remove posts based on political views. The justices’ request in January that the Biden administration weigh in on the cases has many experts predicting the court will do so. The cases test the limits of the First Amendment.

“I think it’s relevant to note that one cluster of cases suggests that companies are doing too little, and another claiming they are doing too much,” said Matt Schruers, president of the Computer and Communications Industry Association.

The stakes in the cases are high: The media companies could have to change the functionality that makes them so easy to use; lawmakers could face new pressure to update legislation for an industry that is now entrenched, wealthy and divisive; and users could discover sites that no longer offer the content that underlies their popularity.


High court rulings that favor the companies in the content liability cases would affirm protections provided by Section 230 of the 1996 law against liability for content added by users, and do so for an industry that is many times larger and more powerful than it was more than 25 years ago. Both cases involve content related to acts of terrorism outside the U.S.

But rulings that the companies are liable could effectively mean they shut down discussions on some of the most hotly debated issues in U.S. politics, such as racism, abortion, gun ownership and the Holocaust, according to legal experts.

Rulings on the Texas and Florida laws will tell the social media companies how far they can rely on First Amendment protections to restrict content. Because two appellate courts have come to different conclusions over the state laws, even a Supreme Court decision not to take the cases would present the companies with a challenge.

“The last word on the constitutional question, on what the First Amendment protects, that’s going to come from the court because the court has the power to interpret the Constitution,” said Caitlin Vogus, deputy director of the free expression project at the Center for Democracy and Technology, a group in Washington and Brussels that uses technology policy to promote democracy. “But what Section 230 should protect and all of that … the last word does have to come from Congress.”



©2023 CQ-Roll Call, Inc., All Rights Reserved. Visit cqrollcall.com. Distributed by Tribune Content Agency, LLC.
