TikTok Pushes Potentially Harmful Content To Users as Often as Every 39 Seconds, Study Says
TikTok recommends self-harm and eating disorder content to some users within minutes of joining the platform, according to a new report published Wednesday by the Center for Countering Digital Hate (CCDH). CBS News: For the study, researchers set up TikTok accounts posing as 13-year-old users interested in content about body image and mental health. They found that within as little as 2.6 minutes of joining the app, TikTok's algorithm recommended suicide-related content, and that eating disorder content was recommended within as little as 8 minutes. Over the course of the study, researchers found 56 TikTok hashtags hosting eating disorder videos with over 13.2 billion views.

"The new report by the Center for Countering Digital Hate underscores why it is way past time for TikTok to take steps to address the platform's dangerous algorithmic amplification," said James P. Steyer, Founder and CEO of Common Sense Media, which is unaffiliated with the study. "TikTok's algorithm is bombarding teens with harmful content that promote suicide, eating disorders, and body image issues that is fueling the teens' mental health crisis."

The CCDH report details how TikTok's algorithms refine the videos shown to users as the app gathers more information about their preferences and interests. The algorithmic suggestions on the "For You" feed are designed, as the app puts it, to be "central to the TikTok experience." But the new research shows that the video platform can push harmful content to vulnerable users as it seeks to keep them engaged. Further reading: For teen girls, TikTok is the 'social media equivalent of razor blades in candy,' new report claims