Roblox defends expanded age‑checks after parents raise concerns over errors
Roblox has stood by its decision to broaden its child safety framework, despite criticism from parents about potential inaccuracies in the platform’s age-assessment tool. The company, which reports 144 million active users globally, is introducing age-specific account tiers—Roblox Kids and Roblox Select—to control access to features, content, and interaction levels based on user age. These changes aim to limit communication between younger users and adults, and to filter the types of experiences available to children under 16.
Parents told the BBC that some children have been misclassified as adults during the age-check process, which they say can weaken protective measures. Matt Kaufman, Roblox’s chief safety officer, said the age-estimation system, which incorporates facial recognition technology, now applies to over half of the platform’s daily users, tens of millions of people worldwide. He said the system typically estimates the age of users under 18 to within 1.4 years.
Kaufman emphasized that the technology is more dependable than relying on self-reported ages, stating,
“When you ask them that simple question, users are going to tell you whatever they want to tell you in order to get access.”
The updated system reinforces existing age checks for chat features, grouping users into age bands to minimize interactions between children, teens, and adults. It now extends this approach to account types, restricting unverified users to child-friendly content and blocking direct communication on the platform.
Parental concerns and system adjustments
While Roblox has not disclosed specific data on misclassifications, it outlined methods for correcting errors, such as age resets, appeals, and ID verification. Parents may also block games or manage messages until a child reaches 16. A developer mentioned to the BBC that continuous parental oversight is essential for safety, but Kaufman argued that the opinion of a single developer among over two million should not define the platform’s safety standards.
Recent changes follow a case in which the mother of a 14-year-old girl reported that her daughter had been targeted on the platform by an 18-year-old man, leading to the sharing of explicit images. To determine content suitability for under-16s, Roblox considers factors such as game duration, developer history, and usage patterns. Games with social or free-form elements will no longer be automatically available on Kids and Select accounts.
Expert perspective and global context
Sonia Livingstone, a professor at the London School of Economics, called Roblox’s response “encouraging,” but said there is
“mounting evidence its platform continues to pose real risks to children’s safety.”
She stressed the need for independent validation of moderation tools, effective help systems, and transparency in age checks to prevent commercial profiling. The updates come amid rising international pressure on tech companies to safeguard children online, with the UK enforcing new obligations under the Online Safety Act and several nations proposing limits on social media use for those under 16.