
TikTok may push potentially harmful content to teens within minutes, study finds



By Samantha Murphy Kelly, CNN Business

TikTok may surface potentially harmful content related to suicide and eating disorders to teenagers within minutes of them creating an account, a new study suggests, likely adding to growing scrutiny of the app’s impact on its youngest users.

In a report published Wednesday, the non-profit Center for Countering Digital Hate (CCDH) found that it can take less than three minutes after signing up for a TikTok account to see content related to suicide and about five more minutes to find a community promoting eating disorder content.

The researchers said they set up eight new accounts in the United States, the United Kingdom, Canada and Australia at TikTok’s minimum user age of 13. These accounts briefly paused on and liked content about body image and mental health. The CCDH said the app recommended videos about body image and mental health about every 39 seconds within a 30-minute period.

The report comes as state and federal lawmakers seek ways to crack down on TikTok over privacy and security concerns, and as they weigh whether the app is appropriate for teens. It also comes more than a year after executives from social media platforms, TikTok included, faced tough questions from lawmakers during a series of congressional hearings over how their platforms can direct younger users — particularly teenage girls — to harmful content, damaging their mental health and body image.

After those hearings, which followed disclosures from Facebook whistleblower Frances Haugen about Instagram’s impact on teens, the companies vowed to change. But the latest findings from the CCDH suggest more work may still need to be done.

“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” Imran Ahmed, CEO of the CCDH, said in the report.

A TikTok spokesperson pushed back on the study, saying it is an inaccurate depiction of the viewing experience on the platform for several reasons, including the small sample size, the limited 30-minute window for testing, and the way the accounts scrolled past a series of unrelated topics to look for other content.

“This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people,” the TikTok spokesperson told CNN. “We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”

The spokesperson said the CCDH does not distinguish between positive and negative videos on given topics, adding that people often share empowering stories about eating disorder recovery.

TikTok said it continues to roll out new safeguards for its users, including ways to filter out mature or “potentially problematic” videos. In July, it added a “maturity score” to videos detected as potentially containing mature or complex themes as well as a feature to help people decide how much time they want to spend on TikTok videos, set regular screen time breaks, and provide a dashboard that details the number of times they opened the app. TikTok also offers a handful of parental controls.

This isn’t the first time social media algorithms have been tested. In October 2021, US Sen. Richard Blumenthal’s staff registered an Instagram account as a 13-year-old girl and proceeded to follow some dieting and pro-eating disorder accounts (the latter of which are supposed to be banned by Instagram). Instagram’s algorithm soon began almost exclusively recommending that the teenage account follow more and more extreme dieting accounts, the senator told CNN at the time.

(After CNN sent a sample from this list of five accounts to Instagram for comment, the company removed them, saying they all broke Instagram’s policies against encouraging eating disorders.)

TikTok said it does not allow content depicting, promoting, normalizing, or glorifying activities that could lead to suicide or self-harm. Of the videos removed for violating its policies on suicide and self-harm content from April to June of this year, 93.4% were removed at zero views, 91.5% were removed within 24 hours of being posted and 97.1% were removed before any reports, according to the company.

The spokesperson told CNN when someone searches for banned words or phrases such as #selfharm, they will not see any results and will instead be redirected to local support resources.

Still, the CCDH says more needs to be done to restrict specific content on TikTok and bolster protections for young users.

“This report underscores the urgent need for reform of online spaces,” said the CCDH’s Ahmed. “Without oversight, TikTok’s opaque platform will continue to profit by serving its users — children as young as 13, remember — increasingly intense and distressing content without checks, resources or support.”



