Roblox, one of the world’s most popular gaming platforms, has unveiled new safety measures allowing parents to block their children from accessing specific games and experiences.
The latest updates aim to give parents greater control over their children’s online activity and provide more transparency on the platform.
The new features, which apply exclusively to users under the age of 13 with parental controls enabled, will also allow parents to block or report their children’s friends.
Additionally, Roblox will offer more detailed insights into the games that young users are playing, strengthening parental oversight in an effort to curb exposure to inappropriate content.
The announcement follows comments made by Roblox CEO Dave Baszucki in an interview with the BBC, where he advised parents to keep their children off the platform if they felt uncomfortable with it.
His remarks came in response to growing concerns over the presence of explicit or harmful content within certain games on Roblox, a platform that has become a dominant force in online gaming for children.
Roblox is particularly popular among younger players, with statistics showing it is the most-used gaming site in the UK for children aged eight to twelve.
However, the platform has faced criticism over reports that some children have encountered inappropriate content while navigating the user-generated gaming environment.
Despite these concerns, Baszucki emphasized that Roblox remains committed to ensuring user safety, stating that “tens of millions” of people have had “amazing” experiences on the platform. He reaffirmed the company’s dedication to vigilance in monitoring and addressing safety issues.
Matt Kaufman, Roblox’s Chief Safety Officer, expressed confidence in the new safety measures, stating: “These tools, features, and innovations reflect our mission to make Roblox the safest and most civil online platform in the world.”
Kaufman underscored the company’s ongoing commitment to strengthening security measures, particularly for its younger audience.
The UK’s communications regulator, Ofcom, responded to the announcement, calling the new measures “encouraging.” However, a spokesperson cautioned that “tech companies will have to do a lot more in the coming months to protect children online.”
Ofcom has been increasingly scrutinizing online platforms to ensure they implement robust protections in compliance with the UK’s evolving digital safety regulations.
The implementation of these enhanced parental controls aligns with broader efforts by tech companies to address child safety concerns in the digital space.
As online gaming and social platforms continue to grow, industry leaders face mounting pressure from regulators and advocacy groups to prevent exposure to harmful content and interactions.
While the new Roblox measures represent a step forward in child safety, the debate over the platform’s ability to effectively moderate user-generated content remains ongoing.
Parents, regulators, and safety advocates will likely continue to monitor how the platform enforces these new safeguards and whether further improvements prove necessary.
For now, Roblox users under 13 with parental controls enabled will have access to an added layer of protection, giving parents greater oversight and control over their children’s digital experiences on the platform.