Today, the Supreme Court will hear a case that will determine whether the government can communicate with social media companies to flag misleading or harmful content, or talk to them at all. And much of the case revolves around Covid-19 conspiracy theories.
In Murthy v. Missouri, the attorneys general of Louisiana and Missouri, along with several individual plaintiffs, argue that government agencies, including the CDC and CISA, coerced social media platforms into censoring speech about Covid-19, election misinformation, and the Hunter Biden laptop story, among other topics.
In a statement released in May 2022, when the case was first filed, Missouri Attorney General Eric Schmitt alleged that members of the Biden administration “colluded with social media companies like Meta, Twitter, and YouTube to remove truthful information related to the lab-leak theory, the efficacy of masks, election integrity, and more.” (The lab-leak theory has largely been debunked, and most evidence points to Covid-19 originating from animals.)
While the government shouldn’t necessarily be putting its thumb on the scale of free speech, there are areas where government agencies do have access to important information that can, and should, help platforms make moderation decisions, says David Greene, civil liberties director at the Electronic Frontier Foundation (EFF), a nonprofit digital rights organization. The foundation filed an amicus brief in the case. “The CDC should be able to inform platforms when it thinks there is really hazardous public health information placed on those platforms,” he says. “The question they need to be thinking about is, how do we inform without coercing them?”
At the heart of Murthy v. Missouri is that question of coercion versus communication: whether any communication from the government at all amounts to coercion, or “jawboning.” The outcome of the case could radically change how platforms moderate their content and what kind of input or information they can use to do so, which in turn could have a big impact on the proliferation of conspiracy theories online.
In July 2023, a federal judge in Louisiana consolidated the initial Missouri v. Biden case with another case, Robert F. Kennedy Jr., Children’s Health Defense, et al v. Biden, forming what is now Murthy v. Missouri. The judge also issued an injunction barring the government from communicating with platforms. The 5th Circuit Court of Appeals later modified the injunction, carving out some exceptions, particularly for content flagged to platforms by third parties such as the Stanford Internet Observatory, a research lab at Stanford that studies the internet and social platforms.
Children’s Health Defense (CHD), an anti-vaccine nonprofit, was formerly chaired by presidential candidate Robert F. Kennedy Jr. The group was banned from Meta’s platforms in 2022 for spreading health misinformation, such as the claim that the tetanus vaccine causes infertility (it does not), in violation of the company’s policies. A spokesperson for CHD referred WIRED to a press release with a statement from the organization’s president, Mary Holland, saying, “As CHD’s chairman on leave, Robert F. Kennedy Jr. points out, our Founding Fathers put the right to free expression in the First Amendment because all the other rights depend on it. In his words, ‘A government that has the power to silence its critics has license for any kind of atrocity.’”