Researchers find risks to kids playing Roblox 'deeply disturbing'
Revealing Reality's research found that fake accounts registered as under-14 users still encountered highly inappropriate content on Roblox, despite its child-friendly reputation.
The gaming platform Roblox is displayed on a tablet, on October 30, 2021, in New York. (AP)
Recent research has revealed that children can easily come across inappropriate content and engage in unsupervised interactions with adults on the gaming platform Roblox, findings that have been described as "deeply disturbing."
In an investigation shared with The Guardian on Tuesday, digital behavior experts at Revealing Reality said they had found "something deeply disturbing": a troubling disconnect between Roblox's child-friendly appearance and what children actually encounter on the platform.
For the investigation, Revealing Reality set up several Roblox accounts registered to fictional users: children aged five, nine, 10 and 13, plus an adult over 40. The accounts interacted only with one another, not with outside users, so that no external player could influence the avatars' behavior.
Despite the introduction of new parental control tools last week designed to enhance oversight of children's accounts, the researchers found that existing safeguards remain insufficient, stating, "Safety controls that exist are limited in their effectiveness and there are still significant risks for children on the platform."
The report documented children as young as five communicating with adult users while playing games on the platform, and it found multiple examples of unfiltered interactions between adults and children taking place without age verification.
Policy update not enough
This finding comes despite Roblox's November policy update, which modified settings to prevent accounts registered to users under 13 from sending direct messages outside of games or experiences, restricting them instead to public broadcast messages.
The report further revealed that the avatar linked to the 10-year-old's account could enter what were described as "highly suggestive environments." These included a virtual hotel space containing mature-themed content, where avatars in revealing outfits performed dance animations and characters were positioned in intimate proximity, as well as a public bathroom area featuring bodily function animations and optional cosmetic accessories that could be read as adult-oriented.
During testing, the researchers' avatars encountered inappropriate audio through the platform's voice chat, including conversations on mature themes and repeated sound effects that could be interpreted as suggestive.
While Roblox states that all voice chat—available only to phone-verified accounts registered to users 13+—undergoes real-time AI moderation, these findings suggest gaps in the system's effectiveness.
The investigation also found that an adult-registered test account was able to solicit Snapchat contact details from the five-year-old test avatar using thinly veiled language, showing that Roblox's text chat filters and moderation systems, which the company says are designed to prevent such interactions, can be bypassed with minimal effort.
Roblox acknowledged the presence of "bad actors on the internet" while emphasizing that this challenge extends beyond their platform alone, stating the need for coordinated efforts with governments and a cross-industry commitment to implementing robust safety measures universally across all digital platforms.
Crossbench peer and internet safety advocate Beeban Kidron said the research exposed the platform's "systematic failure to keep children safe," adding: "This kind of user research should be routine for a product like Roblox."