Elon Musk’s X Loses Bid to Block California Law Over Content Moderation Transparency


X Corp. has lost a bid to temporarily block a California law requiring social media companies to disclose their terms of service and submit semiannual reports to the state about how they moderate content.

U.S. District Judge William Shubb on Thursday denied X’s motion for a preliminary injunction, finding that the reporting requirements aren’t “unjustified or unduly burdensome within the context of First Amendment law.” While compliance may carry a substantial burden, he concluded that the mandated disclosures are “uncontroversial” and that the law “merely requires” identification of existing content moderation policies.

X in September sued California Attorney General Rob Bonta after the passage of AB 587, which requires large social media companies to post their terms of service and submit reports on how their content moderation policies address hate speech or racism, extremism or radicalization, disinformation or misinformation, harassment and foreign political interference. The company alleged that the law improperly compels speech in violation of the First Amendment and the state constitution, among other arguments concerning interference with editorial decisions.

“The legislative record is crystal clear that one of the main purposes of AB 587 — if not the main purpose — is to pressure social media companies to eliminate or minimize content that the government has deemed objectionable,” the complaint stated.

In a ruling denying X’s motion for a preliminary injunction, the court said that the reporting requirements don’t run afoul of the First Amendment since they only require “purely factual” disclosures.

“The required disclosures are also uncontroversial,” Shubb wrote. “The mere fact that the reports may be ‘tied in some way to a controversial issue’ does not make the reports themselves controversial.”

The judge sided with the state, finding that it met its burden of showing that the reporting requirements are “reasonably related to a substantial government interest” in requiring social media companies to be transparent about their content moderation policies and practices. He said that the law is meant to allow users to “make informed decisions about where they consume and disseminate news.”

The order also rejected arguments that the law is preempted by Section 230 of the Communications Decency Act — Big Tech’s favorite legal shield, which has historically afforded firms significant protection from liability for third-party content.

“AB 587 only contemplates liability for failing to make the required disclosures about a company’s terms of service and statistics about content moderation activities, or materially omitting or misrepresenting the required information,” Shubb wrote. “It does not provide for any potential liability stemming from a company’s content moderation activities per se.”
