In late January, a volunteer at the help line for the National Alliance for Eating Disorders fielded a call from someone who had seen an alarming trend on TikTok. The hashtag #legginglegs had started taking off as users posted about the slim bodies seemingly deemed the most desirable for leggings.
The organization, which works directly with social media companies including TikTok, Meta and Pinterest, quickly flagged the trend to TikTok. Less than a day later, the platform banned the hashtag and began directing users who searched for it toward the organization’s help line and other resources.
Trends like “legging legs” are part of a long history of harmful body image content that has proliferated online since the early days of the internet. “The minute they ban one hashtag, another one will pop up,” said Amanda Raffoul, an instructor at Harvard Medical School and Boston Children’s Hospital who studies eating disorders. But she and other eating disorder experts said that the evolution of social media platforms has presented an even more vexing issue: how to approach algorithms that build on a user’s interests to curate a feed that can quickly turn dangerous for people who are particularly vulnerable.
For example, if a teenager searches for healthy snack ideas or interacts with certain cooking posts, a platform may then serve videos about low-calorie foods. Watching those videos can signal an interest in weight loss to the algorithm, and soon that teenager might see advice for restricting snacks or tips for crash diets. A user’s feed could then be filled with posts supporting unhealthy behaviors, or celebrating one body type over others.
“I don’t think we are in a space where we can ignore the harms that can be happening in the algorithm,” said Johanna Kandel, the chief executive officer of the National Alliance for Eating Disorders. “The fact is that individuals can start a journey of health and wellness and within a few minutes can be served content that is extremely unhealthy for them.”
For the companies tasked with policing these posts, experts said, the task presents challenges that go beyond concerns about freedom of speech. There isn’t a clear line between sharing a story about recovery from an eating disorder and posting content that could trigger disordered eating, or between posting healthy recipes and encouraging eating behaviors that could harm adolescents and others who may already be struggling with body image or eating disorders.
TikTok did not respond to a request for comment. Ms. Kandel, who works with social media companies including TikTok to address harmful content, said she could not recall another potentially harmful trend being identified and taken down as quickly as the legging legs hashtag.
For years, public health experts have worried about the role that social media can play in developing eating disorders and other mental health issues. More than 29 million Americans will have a clinically significant eating disorder in their lifetime, and people of any age, race, gender or body type can develop one, according to the National Alliance for Eating Disorders.
But with algorithms now shaping social media feeds more than perhaps ever before, it has become even more challenging to cultivate a healthy relationship with what you see online, said Jillian Lampert, the chief strategy officer for the Emily Program, which treats eating disorders. While people can choose whom they follow, their previous activity can determine what appears in other spots, like “for you” or “discover” pages.
As a teenager in the 1980s, Dr. Lampert said, she encountered messaging about the “ideal body” or extreme diets in magazines, movies and television shows. But, she said, it was nowhere near as inescapable as it is for young people today.
Dr. Lampert said that understanding how clients use social media — and what they’re seeing online — was a key part of the work at the Emily Program, which operates in-person treatment centers in four states and sees patients virtually. Because social media algorithms rely in part on whom users follow and what content they interact with, providers ask clients which accounts they might need to unfollow for their own well-being.
“We are re-establishing a different relationship with social media,” Dr. Lampert said.
The algorithms that can subtly serve up harmful content can also be powerful weapons to combat it. When Emily Pearl, a social media consultant, saw “#legginglegs” on TikTok, her own outraged response countering the narrative found an enormous audience.
“Social can be so toxic, it actually blows my mind,” Ms. Pearl said in a video that has now been viewed more than 11 million times. Speaking over the phone, Ms. Pearl, 26, said that while she knew how to identify problematic content and curate her feed to some extent, she worried about her nieces and nephews and other young people who might not realize how harmful the videos they’re watching can be.
“There needs to be a safer way,” she said.