Instagram Accused of Pushing Disturbing Content if Adult Users Follow Children
Tests showed that Instagram, the popular social media platform owned by Meta, served sexual content interspersed with advertisements from major corporations to users who followed young people’s accounts.
Test accounts that followed “young gymnasts, cheerleaders and other teen and preteen influencers” received what The Wall Street Journal (behind a paywall) called “jarring doses of salacious content,” including videos of children and “overtly sexual adult videos.”
The Journal noted that many young people were followed on Instagram by adult men and that some of those men’s accounts also followed accounts that offered blatantly sexual content.
Following some of those men prompted Instagram’s algorithm to serve up even “more-disturbing content,” the outlet reported, alongside advertisements from companies that say they do not allow their ads to run next to such content.
The Canadian Centre for Child Protection obtained comparable results when it ran its own tests, the Journal said.
The Journal offered two specific, and somewhat disturbing, examples of its findings.
“In a stream of videos recommended by Instagram, an ad for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff,” the Journal reported.
“In another, a Pizza Hut commercial followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl,” it added.
Meta claimed that the Journal’s tests were somehow not representative of real user experiences, although the outlet appeared to have set up its test accounts specifically to mimic the behavior of actual users.
A company spokesman also said that Instagram either suppresses the distribution of or outright removes as many as four million videos every month over suspected content violations, a figure that raises the question of how much such content remains on the platform.
Even if Instagram’s review process were 99 percent effective, the four million videos it catches each month would represent only 99 of every 100 violations, leaving roughly 40,000 more (4,000,000 ÷ 99) on the platform despite the company’s best efforts.
Samantha Stetson, a Meta vice president, nonetheless claimed that “the prevalence of inappropriate content on Instagram is low,” the Journal reported.
Companies whose advertising appeared near salacious content had mixed reactions when contacted for comment by the Journal.
Online dating company Match Group said that it was dissatisfied with Meta’s response to its complaints about the issue and had begun canceling some of its advertising last month. Bumble, which operates in the same industry, also suspended its advertising across Meta’s platforms.
Walmart declined to comment when asked by the Journal, and Pizza Hut didn’t respond to the request.
Disney, however, told the Journal that it was working with the “highest levels at Meta” to address the problem, and men’s product company Hims said it had found Meta’s efforts to address the problem “encouraging.”
“Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions,” Meta’s Stetson told the Journal.
In early afternoon trading Wednesday, Meta’s stock was down a little more than 5 percent for the day, even as the Dow Jones Industrial Average was up slightly.