In an attempt to keep teenagers safer online, Meta announced this week that it will further restrict teens' exposure to content that includes suicide, self-harm, and eating disorders.
Meta, the parent company of Instagram and Facebook, shared in a blog post that content about these topics will be hidden from teenagers even when it is posted by family or friends.
"We’re automatically placing all teens into the most restrictive content control settings on Instagram and Facebook and restricting additional terms in Search on Instagram," says Meta.
Meta hopes the restrictions will avoid encouraging dangerous behaviour, and it will redirect searches for these keywords to expert resources instead.
"While we allow people to share content discussing their own struggles with suicide, self-harm, and eating disorders, our policy is not to recommend this content, and we have been focused on ways to make it harder to find," says Meta. "Now, when people search for terms related to suicide, self-harm, and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help. We already hide results for suicide and self-harm search terms that inherently break our rules, and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks."
Fairplay, a children's online advocacy group, has criticized the change, questioning whether it goes far enough.
"A desperate attempt to avoid regulation & a slap in the face to parents who have lost their kids to online harms on IG. Congress should toss Meta’s press release in the recycling bin and pass the Kids Online Safety Act and COPPA 2.0.” Our statement here: https://t.co/PEtmAJ5CgA https://t.co/p80gNbfv1h
— Fairplay (@fairplayforkids) January 9, 2024
These changes will only be effective, however, if everyone signing up for Facebook and Instagram is honest about their age.