TikTok’s algorithms are promoting videos about self-harm and eating disorders to at-risk teens, according to a report released Wednesday that highlights growing concerns about social media and its impact on young people’s mental health.
Researchers at the Center for Countering Digital Hate created TikTok accounts for fictional teenage personas in the US, UK, Canada and Australia. The researchers operating the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.
Within minutes, the wildly popular platform was recommending videos about weight loss and self-harm, including ones featuring images of models and ideal body types, pictures of razors and discussions of suicide.
When researchers created accounts with usernames that suggested a particular vulnerability to eating disorders, such as names that included the phrase “lose weight,” the accounts were fed more harmful content.
“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told that you’re ugly, or that you’re not good enough, and maybe you should kill yourself,” the report’s authors said. “TikTok’s algorithms are literally injecting the most dangerous messages possible into young people.”
Social media algorithms work by identifying topics and content that interest a user, then serving more of the same to increase time spent on the site. But critics say those same algorithms, which can promote content about a favorite sports team, hobby or viral dance, can also steer users toward harmful content.
This is especially problematic for teenagers and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure, and harmful content about eating disorders or suicide, according to Josh Golin, executive director of Fairplay, a nonprofit that advocates for children’s online safety.
He added that TikTok isn’t the only platform that has failed to protect young users from harmful content and extensive data collection. “All of these harms are linked to the business model,” Golin said. “It makes no difference which social media platform it is.”
In a statement, a company spokesperson disputed the findings, saying the researchers didn’t use the platform the way typical users do and that the results were skewed as a result. The company also said an account’s username should not affect the kind of content the user receives.