Australia targets gaming platforms in child safety push
Regulator issues transparency notices to major game services
Australia’s online safety regulator has ordered major gaming platforms, including Roblox, Minecraft, Fortnite, and Steam, to explain how they shield children from sexual predators and radicalization. The eSafety Commissioner issued legally enforceable transparency notices requesting details of their safety mechanisms, trust and safety staffing, and content moderation processes.
“Online games have become key social spaces for youth, with nine out of ten Australians aged eight to 17 engaging with them regularly,” said Julie Inman Grant, eSafety Commissioner. “Predators exploit these platforms to initiate contact with children within virtual environments, subsequently transitioning them to private messaging services.”
Inman Grant emphasized that “predatory adults” use gaming environments to groom children or embed extremist narratives into gameplay, raising the likelihood of contact offenses, radicalization, and other harms that extend beyond the platform. The move aligns with Australia’s broader campaign to protect minors from digital dangers, including a 2024 law banning under-16s from major social media platforms.
Despite the ban, the watchdog found that a significant share of Australian children were still using the restricted platforms three months later. Meanwhile, Roblox faces more than 140 lawsuits in the US alleging failures to prevent child sexual exploitation; the company recently settled with Alabama and West Virginia for over $23 million. Earlier this week, Roblox also launched age-targeted accounts for younger users.