Meta confronted by Walmart, Match Group over placing ads next to illicit sex content: lawsuit

Meta faced an uproar from two of its most prominent corporate advertisers – Walmart and Tinder parent Match Group – after they caught wind that Instagram and Facebook were running ads next to content that sexualized underage users, according to an amended lawsuit filed Tuesday.

In one exchange from early November, Match Group told Meta it had observed a series of Reels it described as “highly disturbing” running alongside one of its ads – including six consecutive videos of young girls.

One video allegedly showed a “[y]oung girl provocatively dressed, straddling and caressing a Harley Davidson-style motorcycle” with the Judas Priest song “Turbo Lover” playing in the background, according to Match Group’s message.

The complaint said Match Group’s concerns about its ads were so elevated that its CEO, Bernard Kim, reached out to Meta CEO Mark Zuckerberg directly, noting in a letter that his company was spending millions of dollars even as “our ads are being serviced to your users viewing violent and predatory content.”

Zuckerberg did not respond to the message, according to the lawsuit.

The heated messages were unsealed as part of a bombshell amended complaint filed by the New Mexico attorney general’s office – which sued the Instagram parent last month for allegedly exposing children to adult sex content and contact from alleged child predators.

Prior to its CEO’s attempt to reach Zuckerberg, Match Group also had flagged concerns that one of its ads appeared on Facebook alongside “gruesome content” from a group titled “Only women are slaughtered.”

“We need to quickly figure out how we stop this from happening on your platforms,” Match Group said.

Meta responded by attempting to reassure Match about its brand safety practices and noting it had removed some of the disturbing content, including the Facebook group.

Advertiser alarm allegedly grew after the Wall Street Journal reported that Instagram’s recommendation algorithms were signal-boosting what it called a “vast pedophile network.”

Meta sought to soothe advertiser concerns about illicit sexual content, writing in a message to Walmart and other advertisers last June that it removes “98% of this violating content before it’s even reported.”

When Walmart asked Meta in October of last year why its ads were still running next to disturbing content, a Meta employee allegedly blamed it on searches by the Journal’s reporters and said his “gut feeling is that this was really bad luck.”

A “frustrated” Walmart employee replied that “bad luck isn’t going to cut it,” the suit said.

The dispute came to a head last November, after another Journal article revealed that a Walmart ad had run “after a video of a woman exposing her crotch.” Days later, a Walmart “marketing representative” blasted Meta for failing to adequately address the problem.

“It is extremely disappointing that this type of content exists on Meta, and it is unacceptable for Walmart’s brand to appear anywhere near it,” the Walmart employee said in the message. “As a longtime business partner, it was also very disappointing to learn about this from reporters, rather than from Meta.”

Walmart said in a statement that it “take[s] brand safety issues extremely seriously, and protecting our customers and communities will always be a top priority.”

Match declined to comment. Meta representatives did not immediately respond to requests for comment.

“New evidence indicates Meta officials are duping corporate advertisers and permitting sponsored content to appear alongside deeply disturbing images and videos that clearly violate Meta’s promised standards,” New Mexico Attorney General Raul Torrez said in a statement.

Torrez also accused Zuckerberg and the company itself of “refusing to be honest and transparent about what is taking place on Meta’s platforms.”

The amended lawsuit included other alarming new details from the New Mexico attorney general’s investigation – including dark web forums in which alleged sex predators have “open conversations about the role of Instagram’s and Facebook’s algorithms in delivering children to predators.”

“Predators discuss leveraging Instagram to search, like, and comment on images of children in order to get the algorithm to funnel similar images, videos, and accounts to their feeds, to identify groups of pedophiles and children, and to connect with potential child victims,” the suit alleges.

“Meta’s lack of accountability is so obvious that even child predators recognize how easy it is to prey upon children through Meta’s platforms,” a source within the New Mexico attorney general’s office said.

In its original filing from December, the attorney general’s office said its investigators had set up a series of test accounts to probe Meta’s ability to protect underage users from harmful content.

The test accounts, which depicted four fictional children using AI-generated photos meant to portray kids aged 14 or younger, were allegedly bombarded with disgusting material – including “pictures and videos of genitalia” and messages from alleged sex predators.

Aside from New Mexico’s lawsuit, Meta faces a sweeping lawsuit from dozens of state attorneys general who allege its social media platforms are addictive and exploit underage users for revenue – despite fueling negative outcomes such as depression, anxiety and body image issues.

That lawsuit also alleged that Meta has turned a blind eye to millions of underage users and collects their data without parental consent, in violation of federal law.

Meanwhile, Meta has repeatedly denied wrongdoing and touted dozens of safety features it has installed in recent years to protect youth.

Earlier this week, Meta said it was enacting new restrictions that would result in “safe, age-appropriate experiences” for young users.

The company said it is now “automatically placing teens into the most restrictive content control setting on Instagram and Facebook” and limiting search results for unsettling topics such as suicide, eating disorders and self-harm.
