Meta to hide posts about suicide, eating disorders from teens’ Instagram and Facebook feeds

Mark Zuckerberg’s company is taking action after a whistleblower and state officials accused Meta of doing too little to stop harms to minors.


Meta is going to hide content about self-harm and eating disorders from teen users on Instagram and Facebook, amid mounting criticism that Mark Zuckerberg’s company is doing too little to prevent harm toward minors.

Meta is now rolling out the changes, which will prevent such content from reaching teen users in Instagram Feeds and Stories, “even if it’s shared by someone they follow.”


Meta previously allowed self-harm and eating disorder posts to remain on Instagram and Facebook, although its recommendation algorithms did not promote them, saying that doing so could help “destigmatize” such issues. Now, though, Meta says such content “isn’t necessarily suitable for all young people.”

“Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help,” Meta added. “We already hide results for suicide and self harm search terms that inherently break our rules and we’re extending this protection to include more terms.”


The other major change is that Meta is “automatically placing teens into the most restrictive content control setting on Instagram and Facebook.” The company has already been applying the setting to new teen users who join the platform. Now it’s preparing to roll out the restriction to all existing young users.

The result should “make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore,” the company added. Teen users will still be able to modify their settings, but Meta plans to use notifications to nudge them toward a more private experience.


The changes arrive months after a company consultant-turned-whistleblower accused Meta’s top executives of ignoring the online harassment that can be directed at teens, citing in part his own daughter’s experiences on Instagram.

In October, a group of US states also filed lawsuits, alleging the company has been downplaying the harmful effects of social media on teens while also taking little action to delete accounts that belong to underage users who shouldn’t be on the platform.


Despite the legal battle and criticism, Meta pointed out in Tuesday’s announcement that the company has developed “more than 30 tools and resources to support teens and their parents.”

“We regularly consult with experts in adolescent development, psychology and mental health to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens,” the company added. Teen users can expect the changes to fully take effect in the coming months.

uk.pcmag.com

 

Related:

A judge has temporarily halted enforcement of an Ohio law limiting kids’ use of social media

