Instagram Reels served ‘risqué footage of children’ next to ads for major companies: report
Instagram’s Reels video feed reportedly recommends risqué footage of children as well as overtly sexual adult videos to adult users who follow children, with some of the disturbing content placed next to ads from major companies.
In one instance, an ad promoting the dating app Bumble was sandwiched between a video of a person caressing a life-size latex doll and another clip of an underage girl exposing her midriff, according to the Wall Street Journal, which set up test accounts to probe Instagram’s algorithm.
In other cases, Mark Zuckerberg’s Meta-owned app showed a Pizza Hut commercial next to a video of a man lying in bed with a purported 10-year-old girl, while a Walmart ad was displayed next to a video of a woman exposing her crotch.
The shocking results were revealed as Meta faces a sweeping legal challenge from dozens of states alleging the company has failed to prevent underage users from joining Instagram or to shield them from harmful content.
It also comes on the heels of dozens of blue-chip firms pulling their advertising from Elon Musk’s X platform after their promos appeared next to posts touting Adolf Hitler and the Nazi party. The exodus is reportedly expected to cost the site formerly known as Twitter as much as $75 million in revenue this year.
Meta now faces its own advertiser revolt after some companies cited in the study suspended ads on all its platforms, which include Facebook, following Monday’s report by the Journal.
The Journal’s test accounts followed only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.
The outlet found that the followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts that followed those children had also demonstrated interest in sexual content related to both children and adults.
The Reels feed presented to the test accounts became even more disturbing after the Journal’s reporters followed adult users who were already following such child-related content.
The algorithm purportedly displayed a mix of adult pornography and child-sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act.
When reached for comment, a Meta spokesperson argued the tests were “a manufactured experience that does not reflect the experience of most users.”
“We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it,” a Meta spokesperson said in a statement. “We continue to invest aggressively to stop it – and report every quarter on the prevalence of such content, which remains very low.”
“Our systems are effective at reducing harmful content and we’ve invested billions in safety, security and brand suitability solutions,” the spokesperson added. “We tested Reels for nearly a year before releasing it widely – with a robust set of safety controls and measures.”
Meta noted that it has approximately 40,000 employees globally dedicated to ensuring the safety and integrity of its platforms.
The company asserted that the spread of such content is relatively small, with just three to four views of posts that violate its policies for every 10,000 views on Instagram.
However, current and former Meta employees reportedly told the Journal that the tendency of the company’s algorithms to serve child-sexualizing content to users was known internally to be a problem even before Reels was released in 2020 to compete with the popular video app TikTok.
The Journal’s findings followed a June report by the publication that revealed Instagram’s recommendation algorithms fueled what it described as a “vast pedophile network” that advertised the sale of child-sex material on the platform.
That report prompted Meta to block access to thousands of additional search terms on Instagram and to set up an internal task force to crack down on the illegal content.
Nonetheless, several major companies expressed outrage or disappointment over the company’s handling of their ads – including Match Group, the parent company of Tinder, which has reportedly pulled all ads for its major brands from Meta-owned apps.
Most companies sign deals stipulating that their ads should not appear next to sexually charged or explicit content.
“We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” Match spokeswoman Justine Sacco said in a statement.
Bumble spokesman Robbie McKay said the dating app would never intentionally advertise adjacent to inappropriate content and has since suspended advertising on Meta platforms.
A Disney representative said the company had brought the problem to the highest levels at Meta to be addressed, while Hinge said it would push Meta to take more action.
The Canadian Center for Child Protection, a nonprofit dedicated to child safety, reportedly obtained similar results after conducting its own tests. The Post has reached out to the group for comment.
“Time and time again, we’ve seen recommendation algorithms drive users to discover and then spiral inside of these online child exploitation communities,” the center’s executive director, Lianna McDonald, told the Journal.