New Mexico Attorney General Raúl Torrez filed an amended complaint in his lawsuit against Meta on Tuesday night, adding new allegations that the company misled advertisers into believing that their ads were not shown alongside sexually explicit or violent content.
Torrez sued Meta in December 2023, alleging Facebook and Instagram are “breeding grounds” for predators targeting children for human trafficking, grooming and solicitation. Meta responded by noting that it has been proactive in finding and removing accounts and content that violate its child safety policies.
In its Business Help Center, Meta states that the company will “create safe and positive experiences for people and businesses who are advertising, monetizing, buying, and selling on our platform.” The lawsuit argues that these claims are deceptive.
“Meta is going to have an issue maintaining the safety of those brands because of their inability and their unwillingness to police the content on their site,” Torrez told NBC News.
The new filing details emails and letters between Meta and two advertisers. One of them, Match Group, which owns dating apps Tinder and Hinge, asked Meta for assurance that its ads would not appear alongside content aimed at minors or posts that were “graphic, sexually explicit, pornographic, or otherwise illegal,” according to the lawsuit.
In response, Meta described its policies and efforts to remove “bad actors” from its platforms, but Match found in November that its ads had appeared among videos of young girls, including one that was “provocatively dressed, straddling and caressing a Harley Davidson-style motorcycle.” Match is not a party to the attorney general’s lawsuit.
Match also told Meta that its ads appeared next to videos of women being murdered. Those videos were from a Facebook account that was “reported twice and has not been taken down,” according to the lawsuit. Meta took down the account after it was reported again.
The lawsuit states that Match CEO Bernard Kim reached out to Mark Zuckerberg directly and said, “Our ads are being serviced to your users viewing violent and predatory content.” Zuckerberg did not respond to Kim, according to the complaint.
Meta also received complaints from Walmart in early 2023 following various news reports that Facebook had shown ads next to inappropriate content. Despite assurances from Meta that the prevalence of illicit sexual content on its platforms was “extremely low,” an investigation from The Wall Street Journal in November 2023 found that Walmart ads were shown next to a video of a woman exposing her crotch.
“It’s pretty clear from those communications that Meta, frankly, is just not willing to engage in the kind of transformative work that they need to do to not only make their sites safe for children, but also make it a place for their business partners to do business and to advertise,” Torrez said.
In an email to NBC News, a Meta spokesperson disputed the claims and said the company does not “want this kind of content on our platforms and brands don’t want their ads to appear next to it.”
“We continue to invest aggressively to stop it — and report every quarter on the prevalence of such content, which remains very low,” the spokesperson wrote. “Our systems are effective at reducing violating content, and we’ve invested billions in safety, security and brand suitability solutions.”
The complaint brought against Meta by Torrez says that the company serves underage users sexually explicit material, leads them to unmoderated Facebook groups that facilitate commercial sex and allows the distribution of child pornography on its platforms. Meta has maintained that it directs significant resources and works with outside organizations in order to address child abuse content.
Zuckerberg, alongside the CEOs of X, Snap, TikTok and Discord, is expected to testify before the Senate Judiciary Committee about online child sexual exploitation on Jan. 31.