NAIROBI, Kenya - In a new move aimed at protecting teens from unhealthy body standards, YouTube is tightening the reins on its recommendation system for users aged 13 to 17.
The platform will limit automatic suggestions for certain health and fitness videos, especially those that idealize specific body types or promote extreme fitness levels.
While teens will still be able to search for and watch these videos on their own, YouTube won’t push them toward similar content — a proactive step the platform says is meant to prevent young viewers from developing “negative beliefs” about their self-image.
YouTube’s algorithm is known for keeping viewers hooked by suggesting similar videos once they finish watching one.
For teens, however, this system has raised concerns about exposure to content that promotes unrealistic beauty standards and aggressive fitness goals.
The change follows guidance from YouTube’s Youth and Families Advisory Committee, which found that repeated exposure to such videos may encourage teens to adopt harmful perceptions about their bodies.
Moving forward, YouTube will stop recommending videos that:
- Compare physical features and place one body type above others
- Glorify extreme fitness levels or certain body weights
- Display social aggression through non-contact fights or intimidation
While these changes are promising, they rely heavily on teens being logged into YouTube with an accurate date of birth. The platform currently lacks a robust way to verify users’ stated ages, making enforcement tricky.
Dr. Petya Eckler, a senior lecturer at the University of Strathclyde, applauded YouTube’s decision but emphasized that more needs to be done.
“There’s a well-documented link between social media use and how young people view their bodies,” said Dr. Eckler.
She believes YouTube’s actions should spark broader conversations around fitness, health, and self-esteem, especially within families. “Exercise should be about overall well-being, not just about appearance,” she said.
This sentiment reflects growing concerns across social media platforms. Just last May, UK regulator Ofcom called on tech companies to adjust their algorithms to steer children away from “toxic” material online.
In addition to algorithm changes, YouTube is rolling out new tools for parents to keep an eye on their kids’ online activity.
Parents will soon be able to link their accounts with their teenagers’ accounts, allowing them to monitor uploads, subscriptions, and comments.
They’ll also receive email notifications whenever their teen uploads a new video or starts a livestream.
These updates mark a significant shift for YouTube as it tries to balance its role as a content hub with the responsibility of safeguarding younger users.
With teens spending more time on the platform than ever, these changes arrive at a critical moment.
As the conversation around social media’s impact on teen mental health continues, YouTube’s latest steps offer a glimpse into how platforms might evolve to better protect vulnerable users.