A troubling report from The Wall Street Journal delves into how TikTok surfaces extreme weight loss challenges and purging techniques, and into the personal experiences of young girls sent down the rabbit hole of deadly dieting content, content that can contribute to the development of eating disorders or make existing ones worse. The WSJ ran its own experiment to see how TikTok’s algorithm can promote this kind of harmful content, and its findings may explain TikTok’s sudden decision to change how its video recommendation system works.
As detailed in the report, the WSJ created over 100 accounts that “viewed the app with little human intervention,” a dozen of which were bots registered as 13-year-olds that dwelled on videos about weight loss, alcohol, and gambling. A graph included in the report shows that as soon as one of the bots abruptly stopped watching gambling-related videos and started spending time on weight loss videos instead, TikTok’s algorithm adjusted accordingly, quickly ramping up the number of weight loss videos it served the bot to account for the change in behavior.
By the end of the experiment, the WSJ found that of the 255,000 videos the bots watched in total, 32,700 contained a description or metadata matching a list of hundreds of weight loss-related keywords. 11,615 videos had text descriptions matching keywords related to eating disorders, and 4,402 contained a combination of keywords indicating the normalization of eating disorders.
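The WSJ hasn’t published the code behind this analysis, but the approach it describes, checking each watched video’s description and metadata against curated keyword lists, is straightforward to picture. Here is a minimal illustrative sketch in Python; the keyword examples and field names are assumptions for demonstration, not the WSJ’s actual lists or TikTok’s real metadata schema.

```python
# Illustrative sketch of a keyword-based tally like the one the WSJ describes.
# The keywords and video fields below are hypothetical placeholders, not the
# WSJ's actual lists or TikTok's real metadata schema.

WEIGHT_LOSS_KEYWORDS = {"weight loss", "calorie deficit", "thinspo"}  # hypothetical examples

def matches_keywords(video: dict, keywords: set) -> bool:
    """Return True if the video's description or hashtags contain any keyword."""
    text = " ".join([
        video.get("description", ""),
        " ".join(video.get("hashtags", [])),
    ]).lower()
    return any(keyword in text for keyword in keywords)

def tally_matches(videos: list, keywords: set) -> int:
    """Count how many watched videos match the keyword list."""
    return sum(matches_keywords(video, keywords) for video in videos)

# Example usage with made-up data:
# videos = [{"description": "my calorie deficit routine", "hashtags": ["fitness"]}]
# tally_matches(videos, WEIGHT_LOSS_KEYWORDS)  # -> 1
```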
Many of these videos reportedly used alternate spellings of eating disorder-related keywords to avoid being flagged by TikTok. The WSJ later alerted the platform to a sample of 2,960 eating disorder-related videos, 1,778 of which were removed; the WSJ says it’s unclear whether they were taken down by TikTok or by the creators themselves.
Just one day before the WSJ’s report dropped, TikTok announced that it is working on new ways to prevent these dangerous rabbit holes from forming. The change comes after the WSJ contacted TikTok about its upcoming story, so it’s possible the platform preemptively published the update ahead of the report’s release.
In its post, TikTok acknowledges that being shown certain types of content over and over again, such as videos about extreme dieting and fitness, isn’t always healthy. The company says it is working on ways to recognize when its recommendation system is unintentionally serving videos that may not violate TikTok’s policies but could be harmful if viewed in excess. The platform also says it is testing tools that would let users keep videos containing specific words or hashtags from appearing on their For You page.
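TikTok hasn’t explained how that filtering tool works under the hood, but conceptually it amounts to dropping recommended videos whose captions or hashtags contain any of a user’s muted terms. A minimal sketch of that idea, with invented field names rather than TikTok’s actual API:

```python
# Minimal sketch of a "muted words/hashtags" feed filter. The video fields and
# function name are assumptions for illustration; TikTok has not published how
# its tool actually works.

def filter_for_you_feed(candidates: list, muted_terms: set) -> list:
    """Drop candidate videos whose caption or hashtags contain a muted term."""
    kept = []
    for video in candidates:
        text = " ".join([video.get("caption", ""), " ".join(video.get("hashtags", []))]).lower()
        if not any(term.lower() in text for term in muted_terms):
            kept.append(video)
    return kept
```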
“While this experiment does not reflect the experience most people have on TikTok, even one person having that experience is one too many,” TikTok spokesperson Jamie Favazza told The Verge. “We allow educational or recovery-oriented content because we understand it can help people see that there’s hope, but content that promotes, normalizes, or glorifies eating disorders is prohibited.” The spokesperson also pointed out that TikTok provides access to the National Eating Disorders Association hotline within the app.
If this situation sounds familiar, it’s because Instagram has already gone through it (and is still dealing with it). After whistleblower Frances Haugen leaked the Facebook Papers, a trove of the company’s internal documents, Instagram scrambled to patch the holes in its sinking ship.
According to the documents, Facebook conducted its own research into Instagram’s impact on teens and found that the app could take a serious toll on teens’ mental health and worsen body image issues among young girls. About a month later, Instagram announced plans to introduce features that steer teens away from potentially harmful content, along with a “take a break” feature that nudges users to close the app after they’ve spent a set amount of time on the platform, such as 10, 20, or 30 minutes.