Lawsuits claim Roblox endangers kids. New AI age verification aims to block them from chatting with adults

By Clare Duffy, CNN

New York (CNN) — If Roblox users want to chat on the online gaming platform, they’ll soon have to verify their age by providing a government ID or by letting an artificial intelligence age estimation tool photograph their face.

The move comes as the platform faces a string of lawsuits and other claims that it has enabled sexual predators to connect with and abuse children. Roblox said the updated policy will make it easier to prevent young users from connecting with adult strangers.

Roblox is a platform where users can create and play online games and interact with each other. Unlike many other online platforms, Roblox allows users under the age of 13; the platform has long advertised itself as a fun way for children to learn how to code. It has more than 150 million users globally, one third of whom are under 13.

But the company has come under a microscope following reports of children being groomed, abused and, in some cases, even kidnapped by adults they met on Roblox.

Earlier this year, the attorneys general of Kentucky and Louisiana filed separate lawsuits accusing Roblox of harming children. Florida’s attorney general also issued a criminal subpoena last month seeking information from the company, calling Roblox a “breeding ground for predators.”

Individual families have also sued the company; among the plaintiffs is Becca Dallas, who alleges her 15-year-old son, Ethan, died by suicide after being groomed on Roblox and the messaging platform Discord.

Roblox already has some policies and features in place meant to protect young users; for example, it offers parental controls, blocks the sharing of photos and personal information and uses human and AI text and voice moderation. It also requires users to verify their age in order to access certain “18+” experiences, such as games with heavy violence or crude humor. But the update announced Tuesday is expected to greatly increase the number of users verifying their age on the platform, executives said on a call with reporters.

“Our priority is safety and civility,” said Roblox Chief Safety Officer Matt Kaufman. “We want to make Roblox a safe, positive, age-appropriate experience for everybody. We set extremely high standards for ourselves, and we understand that the public expects the same from us.”

Tuesday’s announcement coincides with a “virtual protest” within Roblox organized by the nonprofit ParentsTogether Action to call for more youth safety features on the platform, including stronger default privacy protections for users under 13. It also comes amid a wider push by online platforms to determine users’ ages to protect young people; YouTube and Meta are among the other popular apps using AI for age estimates, although the tools vary across platforms.

Under the new policy, all users will be required to verify their age before accessing chat features on Roblox.

If users choose not to upload an ID, they’ll have to photograph their faces so AI can estimate their age, using technology from identity verification firm Persona. The tool uses the front camera on a user’s device and prompts them to move their face in specific directions, in an effort to ensure a real person is doing the verification. The AI will place users in an estimated age range — under 9, 9-12, 13-15, 16-17, 18-20 or 21+ — and they’ll only be allowed to chat with other users in or near that range.

A user estimated to be 12 years old, for example, would be able to chat with users ages 15 and younger but not 16 and older, according to Roblox.
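For readers who want a concrete picture of how that bucket rule works, here is a minimal sketch in Python. The bucket boundaries come from Roblox’s announcement as described above; the “in or near that range” logic is modeled here only as an assumption (same bucket or one adjacent bucket), and the function name can_chat is purely illustrative, not part of any Roblox system or API.

    # Illustrative sketch of the age-bucket chat rule described above.
    # The bucket labels come from the article; the adjacency logic
    # (same bucket or one bucket away) is an assumption for clarity.

    AGE_BUCKETS = ["under 9", "9-12", "13-15", "16-17", "18-20", "21+"]

    def can_chat(bucket_a: str, bucket_b: str, max_distance: int = 1) -> bool:
        """Return True if two estimated age buckets are close enough to chat."""
        i, j = AGE_BUCKETS.index(bucket_a), AGE_BUCKETS.index(bucket_b)
        return abs(i - j) <= max_distance

    # The article's example: a user estimated at 12 ("9-12") can chat
    # with users 15 and younger, but not with users 16 and older.
    assert can_chat("9-12", "13-15")
    assert not can_chat("9-12", "16-17")

Under this reading, the farther apart two users’ estimated age buckets are, the less likely they are to be allowed to message each other at all.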

If the AI places someone into the wrong age group, users over 13 can upload an ID to correct their age. Parents whose accounts are linked to their child’s via Roblox’s parental control tools can also correct their child’s age. (ID checks are available only to users older than 13, likely because online privacy regulations are more stringent for younger children.)

Roblox says the images of users’ faces will be used only for age estimation, and they’ll be deleted after users are placed in an age bucket.

“We see these changes as a way to help ensure users are able to socialize with others in age groups that are appropriate but also help limit contact between minors and adults that they do not know,” said Rajiv Bhatia, Roblox vice president and head of user and discovery product.

Roblox declined to share a specific accuracy rate for its AI age estimation technology. Kaufman said it is “typically pretty accurate within one or two years” for users between 5 and 25 years old.

Some users have found workarounds for other platforms’ face scan age estimation technology, such as by using images of video game characters or selfies of other people. But Kaufman said Roblox’s technology has “fairly robust fraud checks in place” to determine whether the person is “live” and “moving around and following instructions.” He added that the system also looks for “other anomalies, like people trying to repeat use the same face over and over.”

Roblox will roll out age verification on a voluntary basis starting Tuesday. It will become mandatory in Australia, New Zealand and the Netherlands in December, and globally early next year.

Correction: A previous version of this story misstated Roblox’s age requirements for certain content. Users must verify their age in order to access certain experiences marked “18+”.

The-CNN-Wire™ & © 2025 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
