Instagram’s Shocking Discovery: Sexual Content Targeting Teen Accounts!

Late in 2024, Meta launched Instagram Teen accounts to create a safer online space for young users. This initiative aims to shield teens from inappropriate content and promote secure interactions, using technology to verify ages. These accounts are set to private by default, block messages from unknown users, and hide offensive language.
However, a recent study by the youth advocacy groups Design It For Us and Accountable Tech reveals that these safety measures may not be effective. Over two weeks, five teen test accounts were monitored, and all encountered sexual content despite the platform's assurances.
Inappropriate Content Flood
The test accounts received unsuitable material even with the sensitive content filter activated. The report noted that four out of five teen accounts were recommended posts related to body image issues and disordered eating habits.
Additionally, 80% of the testers reported feeling upset while using their Instagram Teen accounts. Amid the flood of inappropriate material, only one account was shown any educational images or videos.
One tester shared their experience: “About 80% of my feed was filled with relationship advice or crude jokes about sex. While it wasn’t explicit or graphic in nature, it left little to the imagination.”
According to the detailed report spanning 26 pages, an alarming 55% of flagged posts depicted sexual acts or imagery. Some videos gained immense popularity on the platform; one even amassed over 3.3 million likes.
Harmful Messages Abound
With millions of teenagers on Instagram automatically assigned Teen Accounts, the researchers sought clarity on whether these measures truly enhance safety online.
The algorithm also promoted harmful ideas about “ideal” body types, alongside body shaming and unhealthy eating practices. Disturbingly common were videos encouraging alcohol use, as well as others suggesting steroids for achieving a certain physique.
A Mix of Negative Media
Despite Meta’s claims about filtering harmful material for younger audiences, test accounts still encountered racist, homophobic, and misogynistic content—again receiving millions of likes collectively. Videos depicting gun violence and domestic abuse also surfaced in these teen feeds.
The report also highlighted that some test Teen Accounts lacked Meta’s standard protections entirely: none received controls against sensitive content, and some were missing safeguards against offensive comments altogether.
This isn’t an isolated incident; previous leaks have shown how Meta was aware of Instagram's negative effects on young girls struggling with mental health issues related to body image back in 2021.
In response to this latest investigation, shared with The Washington Post, Meta dismissed its findings as flawed and minimized concerns about the flagged material. The company also recently expanded teen protections to Facebook and Messenger.
A spokesperson stated: “A manufactured report does not change the fact that tens of millions of teens now have a safer experience thanks to Instagram Teen Accounts.” The company did, however, say it would further investigate the concerning recommendations made by its algorithms.