TikTok recommends adding eating disorder and self-harm content to new teen accounts in minutes, study finds | UK News

A study of TikTok’s video recommendation algorithm found that it recommended content about eating disorders and self-harm to some new teen accounts within minutes.

Research by the Center for Countering Digital Hate (CCDH) found that one account was shown suicide content within 2.6 minutes, while another was shown eating disorder content within 8 minutes.

Further investigation by Sky News also found evidence of harmful eating disorder content being recommended through TikTok’s suggested search feature, despite not searching for explicitly harmful content.

British eating disorder charity BEAT said the findings were “extremely shocking” and called on TikTok to take “urgent action to protect vulnerable users”.

Content Warning: This article contains references to eating disorders and self-harm

TikTok’s For You page offers a collection of videos recommended to users based on the type of content they engage with on the app.

The social media company said recommendations are based on a variety of factors, including video likes, follows, shares and device settings such as language preference.


But some have raised concerns about the way the algorithm behaves when it recommends harmful content.

One of the videos suggested during the research. The on-screen text reads "she's thinner" and the music playing in the video says "I've been starving for you". Pic: Center for Countering Digital Hate via TikTok

CCDH researchers opened two new accounts in each of the UK, US, Canada and Australia. Each was given a traditionally female username and an age of 13.

The second account in each country also included the phrase "lose weight" in its username, a trait that separate research has shown is exhibited by accounts belonging to vulnerable users.

CCDH researchers analyzed the video content displayed on each new account’s For You page over a 30-minute period, interacting only with videos related to body image and mental health.

It found that, on average, the accounts were shown videos related to mental health and body image every 39 seconds.

Not all content recommended at this rate is harmful, and the study did not distinguish between positive and negative content.

However, it found that all users received eating disorder content and suicide content, sometimes at a rapid rate.

The CCDH research also found that the vulnerable accounts were shown three times as much of this content as the standard accounts, and that the content they were shown was more extreme.

The Center for Countering Digital Hate found 56 hashtags related to eating disorder content. Thirty-five of these contained a high concentration of content promoting eating disorders. Pic: TikTok

TikTok is host to an eating disorder content community that has amassed more than 13.2 billion views across 56 different hashtags, according to CCDH findings.

About 59.9 million of those views were on hashtags that contain a high concentration of videos in support of eating disorders.

However, TikTok said the activity and resulting experiences captured in the study “did not reflect the behavior of real people or the real viewing experience.”

TikTok bans eating disorder content and says it regularly removes content that violates its terms and conditions. Pic: REUTERS/Dado Ruvic/File Photo

Kelly Macarthur has had an eating disorder since she was 14 years old. She has now recovered, but as a content creator on TikTok, she worries that some of its content may have an impact on those who are suffering.

"When I'm not feeling well, I think of social media as a really healthy place where I can vent about my issues. But in reality, it's full of pro-anorexia material that gives you different cues and triggers," she told Sky News.

“I’m seeing the same thing happen to young people on TikTok.”

Further investigation by Sky News also found that TikTok suggested harmful eating disorder content in other areas of the app, despite not explicitly searching for it.

Sky News conducted its own research into TikTok’s recommendation algorithm using several different accounts. But instead of analyzing the For You page, we searched TikTok’s search bar for innocuous terms like “weight loss” and “diet.”

Sky News found that searching for terms such as "diet" returned suggested searches related to eating disorder content. Pic: TikTok

Searching for the term “diet” on one account turned up another suggestion, “pr0 a4a”.

This is the “pro ana” code associated with pro-anorexic content.

TikTok’s community guidelines prohibit content related to eating disorders on its platform, and this includes prohibiting searches for terms explicitly related to it.

But users often tweak the spelling of terms slightly, meaning they can continue to post about certain topics without being spotted by TikTok's moderators.

While TikTok has banned the term “pro ana,” variations of it still pop up.

The screenshot on the left shows the suggested results for the term "weight loss". The screenshot on the right shows the suggested results when the first suggestion is clicked. Pic: TikTok

Sky News also found that eating disorder content was easily accessible through TikTok's user search function, even though it was not explicitly searched for.

A search for the term “weight loss” returns at least one account among its top 10 results that appears to be an eating disorder account.

Sky News reported the account to TikTok, which has since removed it.

“It’s shocking that TikTok’s algorithm is actively pushing users towards damaging videos that can have devastating effects on vulnerable groups,” said Tom Quinn, director of external affairs at BEAT.

“TikTok and other social media platforms must act urgently to protect vulnerable users from harmful content.”

In response to the findings, a TikTok spokesperson said: “We regularly consult with health professionals to eliminate violations of our policies and provide support resources to anyone who needs them.

"We're aware that trigger content is unique to each individual and continue to focus on creating a safe and comfortable space for everyone, including those who choose to share their recovery journey or educate others on these important topics."

The Data and Forensics team is a multi-skilled unit dedicated to delivering transparent news coverage from Sky News. We collect, analyze and visualize data to tell data-driven stories. We combine traditional reporting techniques with advanced analysis of satellite imagery, social media and other open source information. Through multimedia storytelling, we aim to better explain the world while showing how our journalism is done.

