Roblox Corporation, creator of the immensely popular online gaming platform Roblox, is introducing significant changes aimed at improving child safety on the platform. Under a recently announced policy, users under the age of 13 will be barred from sending direct messages to other players. The initiative is part of a broader effort to keep children's interactions in virtual spaces secure and monitored.
Under the new guidelines, children will not be able to use private messaging without explicit permission from a verified parent or guardian. Young users can still take part in public, in-game chat, but they cannot send private messages unless a parent approves. The changes also expand parental oversight: guardians can manage their child's account activity, including reviewing the list of friends their child interacts with and setting daily limits on time spent playing Roblox.
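To make the announced rules concrete, here is a minimal sketch of how such a permission model might be expressed in code. This is an illustration under stated assumptions, not Roblox's actual implementation; the class, field, and function names (ChildAccount, can_send_direct_message, and so on) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ChildAccount:
    """Illustrative model of the announced defaults; field names are assumptions."""
    age: int
    parent_verified: bool = False           # parent has confirmed their identity
    dm_approved_by_parent: bool = False     # explicit opt-in granted by the parent
    daily_limit_minutes: int | None = None  # optional screen-time cap set by the parent
    friends: list[str] = field(default_factory=list)

def can_send_direct_message(account: ChildAccount) -> bool:
    # Under-13s are blocked from direct messages unless a verified
    # parent has explicitly opted them in.
    if account.age < 13:
        return account.parent_verified and account.dm_approved_by_parent
    return True

def can_use_public_chat(account: ChildAccount) -> bool:
    # Public, in-game chat remains available to young users by default.
    return True

# Example: a 10-year-old cannot send DMs until a verified parent grants permission.
kid = ChildAccount(age=10)
assert not can_send_direct_message(kid)
kid.parent_verified = True
kid.dm_approved_by_parent = True
assert can_send_direct_message(kid)
```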
According to recent research by the Office of Communications (Ofcom), Roblox is now the most popular gaming platform among children aged eight to twelve in the United Kingdom. As its popularity has surged, however, pressure has grown for the platform to adopt stronger safety measures for its young audience, and Roblox has acknowledged that its safety protocols must evolve alongside its expanding user base. Matt Kaufman, Roblox's chief safety officer, said the platform is used by approximately 88 million people each day and that more than 10% of the company's employees work on safety features.
The rollout of these changes begins on Monday, with full implementation expected by March 2025. Although children will face some new limits, they will still be able to take part in communal conversations visible to all players within a game. Notably, the changes include a mechanism for parents to verify their identity and age using government-issued identification or credit card details, a step intended to ensure that only a responsible adult can manage a child's online interactions and activities.
Moreover, Roblox is overhauling how content is classified on its platform, replacing simple age recommendations with clearer, more descriptive "content labels." The new labels give parents and children explicit information about the nature of a game, ranging from "minimal" (which may include slight violence or fear-inducing moments) to "restricted" (which might feature more intense content such as strong violence or explicit language). Under the new system, users younger than nine can access only "minimal" or "mild" content by default; parental consent unlocks "moderate" experiences, while "restricted" games are limited to verified users aged 17 and older.
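The tiered rules described above can be sketched as a simple ordered access check. Again, this is a hedged illustration of the policy as reported, not Roblox's real logic; in particular, the default tier for users aged nine to sixteen is not specified in the announcement and is an assumption here.

```python
from enum import IntEnum

class ContentLabel(IntEnum):
    # Ordered tiers as described in the policy announcement.
    MINIMAL = 0     # e.g. slight violence or fear-inducing moments
    MILD = 1
    MODERATE = 2
    RESTRICTED = 3  # e.g. strong violence or explicit language

def max_allowed_label(age: int, parental_consent: bool, age_verified: bool) -> ContentLabel:
    """Illustrative access rule, not Roblox's actual implementation."""
    if age_verified and age >= 17:
        # "Restricted" games require a verified user aged 17 or older.
        return ContentLabel.RESTRICTED
    if age < 9:
        # Under-9s get "minimal"/"mild" by default; consent unlocks "moderate".
        return ContentLabel.MODERATE if parental_consent else ContentLabel.MILD
    # Assumption: users aged 9-16 default to "moderate" access.
    return ContentLabel.MODERATE

def can_play(age: int, parental_consent: bool, age_verified: bool,
             game_label: ContentLabel) -> bool:
    return game_label <= max_allowed_label(age, parental_consent, age_verified)

# Example: an eight-year-old needs parental consent to open a "moderate" game.
assert not can_play(8, parental_consent=False, age_verified=False,
                    game_label=ContentLabel.MODERATE)
assert can_play(8, parental_consent=True, age_verified=False,
                game_label=ContentLabel.MODERATE)
```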
This policy change continues Roblox's effort to maintain a safe, user-friendly environment, particularly in light of impending regulation under the UK's Online Safety Act. The legislation aims to protect minors from harmful and illegal material on digital platforms. Ofcom, the authority charged with enforcing it, has warned that companies that fail to adequately safeguard children may face penalties, and the watchdog plans to publish its codes of practice in December, setting clear standards for online safety.
Roblox’s proactive measures reflect a broader trend in the tech industry towards prioritizing child safety and ensuring that gaming environments are appropriate for young audiences. As digital spaces become increasingly intertwined with children’s daily lives, these policies are pivotal in fostering secure and enjoyable gaming experiences.