Instagram is recommending Reels with sexual content to teenagers as young as 13 even when they aren’t specifically looking for racy videos, according to separate tests conducted by The Wall Street Journal and Northeastern University professor Laura Edelson. Both created new accounts registered as 13-year-olds for the tests, which mostly took place between January and April of this year. Instagram reportedly served moderately racy videos from the start, including clips of women dancing sensually or focusing on their bodies. Accounts that watched those videos and skipped other Reels then started receiving recommendations for more explicit content.
Some of the recommended Reels showed women pantomiming sex acts, while others featured creators promising to send nudes to users who commented on their accounts. The test accounts were also reportedly served videos of people flashing their genitalia, and in one instance, the supposed teen user was shown “video after video about anal sex.” Sexual Reels began appearing as little as three minutes after the accounts were created, and within 20 minutes of watching them, the accounts’ recommended Reels were dominated by creators producing sexual content.
Notably, The Journal and Edelson ran the same test on TikTok and Snapchat and found that neither platform recommended sexual videos to the teen accounts they created. Those accounts weren’t shown age-inappropriate videos even after actively searching for them and following creators who produce them.
The Journal says Meta’s employees have identified similar problems in the past, citing unreleased internal documents it reviewed detailing the company’s research into harmful experiences on Instagram for young teenagers. Meta’s safety staff previously conducted the same kind of test and obtained similar results, the publication reports. Company spokesperson Andy Stone brushed off the report, however, telling The Journal: “This was an artificial experiment that doesn’t match the reality of how teens use Instagram.” He added that the company “established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months.”
Back in January, Meta introduced significant privacy updates aimed at protecting teen users, automatically placing them into its most restrictive content control settings, which they can’t opt out of. The Journal’s tests were conducted after those updates rolled out, and it was able to replicate the results as recently as June. Meta released the updates shortly after The Journal published the results of an earlier experiment, which found that Instagram’s Reels would serve “risqué footage of children as well as overtly sexual adult videos” to test accounts that exclusively followed teen and preteen influencers.