Roblox, a hugely popular online gaming platform with approximately 70 million daily users around the globe, has recently come under scrutiny over its safety measures for younger audiences. Following widespread criticism regarding children's exposure to inappropriate content, the company is set to introduce a series of new safety features aimed specifically at users under the age of 13. The initiative underscores its commitment to creating a safer gaming environment for children, who make up a substantial share of its user base.
Starting on December 3, 2024, Roblox will require game developers to explicitly classify whether their games are appropriate for children under 13. Any games that do not meet these standards will be blocked for players aged 12 and under. The move is part of a broader effort to reduce the risk of young users encountering harmful or upsetting material on the platform. In addition, from November 18, 2024, the same group of users will lose access to “social hangouts”—online spaces designed primarily for communication among players through text and voice messages. The intention is to restrict interactions that might lead to inappropriate conversations or content sharing among children in a less moderated environment.
Roblox defines these “hangout experiences” as games in which the focus is on players communicating as themselves rather than role-playing as fictional characters. In another significant change, under-13s will also be barred from using the “free-form 2D user creation” feature, which lets players draw or craft content in a two-dimensional space and share it with others without passing through Roblox’s moderation process. The restriction aims to prevent the creation or sharing of offensive images or messages that would be difficult to monitor adequately.
Acknowledging the concerns surrounding these changes, Roblox has publicly thanked its community of developers for their cooperation in making the platform a safe space for all ages. The statement, shared on the Roblox developer website, reflects the urgency the company places on these safety enhancements, and it aligns with recommendations from media watchdogs and regulators seeking to shield children from potentially harmful interactions and content online.
Roblox’s popularity is not incidental; according to the media regulator Ofcom, it ranks as the most popular game among children aged 8 to 12 in the UK. However, the platform has faced significant backlash over a range of safety issues. One notable incident involved a young player who reported being solicited for sexual images while using the platform, raising alarms about the vulnerabilities inherent in unfiltered online gaming environments. In response, Ofcom urged technology companies to shield children from “toxic” content and published draft codes of practice to improve online safety protocols.
The safety concerns became pronounced enough that Turkey blocked access to Roblox entirely in August 2024, illustrating the gravity of these issues on a global scale. While Roblox has continued to refine its safety policies—reportedly rolling out more than 30 enhancements this year alone—there remains a caveat regarding the new under-13 rules: despite announcing a swift rollout of the features, the company has said that enforcement of the usage requirements will not begin until 2025.
Overall, these actions mark a significant step toward greater safety for younger users on a platform that thrives on creativity and community interaction. The changes not only respond to user and regulator feedback but also signal an evolving landscape in online gaming in which child protection is becoming a priority. As Roblox continues to navigate the complexities of online interaction among its youthful demographic, these measures are likely to influence how similar platforms approach user safety in the future.