TikTok searches pushed 13-year-olds toward pornographic content, report says
By Ana Nicolaci da Costa, CNN
(CNN) — TikTok has directed young users toward sexually explicit content through its suggested search terms, according to an investigation by UK not-for-profit watchdog Global Witness, as tech companies face mounting pressure to strengthen age verification.
As part of the investigation, published on Oct. 3, Global Witness said it had set up seven new TikTok accounts in the UK posing as 13-year-olds – the minimum age required to create an account – on factory-reset phones with no search histories.
Global Witness said TikTok’s search suggestions were “highly sexualized” for users who both reported being 13 years old and browsed the app using “restricted mode,” which “limits exposure to content that may not be comfortable for everyone,” including “sexually suggestive content,” according to TikTok’s support page.
The report comes amid a broader push both in the United Kingdom and the United States to better protect children online, and as TikTok faces allegations in lawsuits filed last year that it’s harmful to young users’ mental health.
When CNN reached out about the report, a TikTok spokesperson said the company was committed to keeping its users’ experience safe.
“As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies, and launch improvements to our search suggestion feature,” the spokesperson said in a statement, adding that TikTok has “more than 50 features and settings designed specifically to support the safety and well-being of teens.”
The statement also said TikTok is “fully committed to providing safe and age-appropriate experiences” and that it removes “9 in 10 violative videos before they are ever viewed.”
Sexualized searches, however, were recommended “the very first time that the user clicked into the search bar” for three of the test accounts Global Witness created, according to the report. TikTok surfaced pornographic content to all seven test users “just a small number of clicks after setting up the account.”
“Our point isn’t just that TikTok shows pornographic content to minors. It is that TikTok’s search algorithms actively push minors towards pornographic content,” Global Witness said in its report.
TikTok’s community guidelines prohibit content containing nudity, sexual activity, and sexual services as well as any content containing sexually suggestive acts and significant body exposure involving youth.
TikTok said in its transparency report covering January through March 2025 that roughly 30% of the content removed from the platform for policy violations was taken down over sensitive and mature themes.
TikTok removes around 6 million underage accounts globally every month by using various age detection methods, including technology to detect when an account may be used by a child under the age of 13, according to a spokesperson. It also trains moderation teams to spot signals that kids under 13 may be using TikTok.
The report comes after additional rules from the UK’s Online Safety Act pertaining to child safety went into effect in late July. Media lawyer Mark Stephens said in Global Witness’ report that the findings “represent a clear breach” of the act.
TikTok did not immediately respond when CNN asked about Stephens’ comment.
The Online Safety Act 2023 is a set of laws meant to improve internet safety by requiring tech companies to regulate certain types of content – for example, by implementing age checks to prevent children from accessing material deemed harmful, such as pornography and posts related to self-harm.
The act also applies to online platforms outside the UK that have a large UK user base or can be accessed by UK users. But critics, such as the Electronic Frontier Foundation, have said the act’s age verification rules could threaten the privacy of users of all ages.
Global Witness said it conducted the first few tests before the UK’s Online Safety Act child safety rules fully applied to TikTok and other online platforms on July 25 and ran additional tests after that date.
TikTok approaches Online Safety Act compliance “with a robust set of safeguards,” the spokesperson said, noting that the company has been regulated by UK communications regulator Ofcom since 2020 under Ofcom’s Video-Sharing Platform regime, which includes provisions to protect those under 18 from inappropriate content.
TikTok has launched safety measures for teens in recent years, such as a “guided meditation” feature to help young users cut back on scrolling and the disabling of late-night notifications.
TikTok is one of many tech giants facing additional pressure to better protect children online. YouTube, for example, introduced a system in August that uses artificial intelligence to estimate a user’s age and turn on age-specific protections if necessary. Instagram last year rolled out teen account settings that make teens’ accounts private by default.