Following weeks of controversy and mounting public scrutiny, the popular social gaming platform Roblox has updated its parental controls and content rating system.

There have been several recent reports about the dangers of online gaming platforms. According to one study, such platforms have been used by extremists to meet and lure teen boys into far-right radicalization.

In July, investigative journalists published an exposé about sexual predators on Roblox. Another study advancing similar claims was released shortly after. Roblox has rejected the claims, saying safety and civility are “foundational” to the company. Child safety campaigners have called on regulators to act following the reports.

Review website Common Sense Media currently rates Roblox as “for age 13+ based on continuing challenges with problematic content.”

While some experts remain skeptical about Roblox’s updates, my research shows that many of the changes have the potential to tangibly improve the safety and well-being of the 34.5 million children who use Roblox every day. But like any minimally regulated social media platform used by millions of people with their own interests and agendas, Roblox is also unpredictable. And sometimes that’s dangerous.

Children and Age-Appropriate Game Design project

I’ve been analyzing Roblox for the past two years as part of an ongoing study called the Children and Age-Appropriate Game Design project, which looks into what children, developers and policymakers think is, and isn’t, age appropriate in digital games. We’ve been talking to the same 34 kids aged six to 12 once a year to find out their ideas about how games can be designed and regulated to better support children’s rights.

Roblox is one of the games our participants talk about the most because it’s both prominent among their peers and a multipurpose space. Roblox isn’t just a game — it’s a platform that invites players, companies and advertisers to create their own games and virtual items to share or sell to other users.

Tens of millions of user-created games (called “experiences”) have been made so far. This seemingly infinite “free” library draws almost 89 million daily active users — over 40 per cent of whom are aged 12 and under.

Consistent with previous research, the children in my study use Roblox in diverse ways: to hang out with friends after school, to build skills for future careers as game developers, and for fun and information.

Why an update was needed

Data from Statistics Canada shows that police-reported incidents of online child sexual exploitation in Canada are rare. Nonetheless, such incidents expose how the safeguards of online platforms fail to provide robust protections for some children, especially vulnerable and at-risk kids.

Research suggests that Roblox, like many other social media platforms, struggles to effectively moderate the incredible volume of chat messages (a reported 2.4 billion a day) and content generated by its users, some of whom are highly skilled at bypassing content moderation.

None of the kids in our study have described encountering anything on Roblox that they, or we, consider harmful. But several told us they play with the chat turned off to avoid “inappropriate” users like “creepy people” and “scammers,” or content like swear words.

Most of the kids have strategies for avoiding or managing what they see as the negative sides of online gaming. However, none of them believe they have the controls, filters or information needed to make the platform age appropriate. Parents play a prominent role, but not all of them use safety settings or discuss them with their kids.

The new updates make parental controls more visible and accessible, and replace age-based ratings with a maturity-based system. (Shutterstock)

How safety updates can support kids

There are two main ways that Roblox’s new updates have the potential to improve children’s safety and well-being.

First, they’ve made the parental controls more visible, nuanced and accessible. Parents will now create a parent account using automated identity verification, through which they can see and manage settings for the type of content, ads and communication their children have access to. This model has been applied on other platforms, such as Microsoft’s Xbox platform, and it has the potential to increase parental awareness and use of safety settings.

The new format is supported by previous research showing that Roblox’s old safety settings and policies were challenging for parents to navigate and failed to address some of the issues parents were most concerned about. The change could also encourage better communication between parents and children about safety settings and issues, which has been shown to build children’s resiliency and mitigate the risk of harm. Both parents and children will be informed about upcoming updates, and their accounts will be visibly and functionally linked.

Second, Roblox’s age-based ratings have been replaced by a maturity-based content labelling system. Maturity reflects the type and amount of violence, gore, crude humour and, notably, “fear” users might encounter in an experience. The labels are voluntary, but experiences published without one will only be accessible to users 13 and over.

Such labels are important. Encountering inappropriate content is a major concern for the kids in our study, and the example they keep coming back to is playing a game on Roblox that turns out to be too scary. While they agree that some scary games can be fun, they want clear warnings about scary content so they can make informed choices about it.

The shift to maturity instead of age also aligns with how children themselves talk about what’s appropriate and inappropriate. None of the kids in our study think that age alone is a reliable way to determine appropriateness. Instead, they say it depends on the individual child’s maturity, preferences and aversions.

There’s still work to be done, but safety updates such as these are a move in the right direction. They reflect some of children’s own concerns and promise greater support for children’s rights and safety moving forward.

Perhaps most importantly, this move plants the seeds of a rights-based response to the risks children face online: one embedded in design and in evidence-based policies.
