Not even Disney wants to collab with Roblox

A new report by Variety alleges that Disney is very reluctant to collaborate with Roblox because it doesn't consider it a safe platform for children. In the whirlwind of collaborations and crossovers that has swept the gaming world since Fortnite opened the floodgates, turning down the millions of dollars in revenue that Roblox's 150 million weekly users could generate is a remarkably strong stance.

In this newsletter, we've often discussed the lawsuits Roblox is facing around the world over the tens of thousands of child abuse cases that occur on the platform each year. Despite the safety measures promised by its executives, the platform's reputation has deteriorated to the point that very little public trust remains.

The ace the company believes it has up its sleeve to solve all its problems is an age verification system based on AI analysis of a series of selfies. The system is currently being tested in the Netherlands, Australia, and New Zealand, and will roll out in the United States in January 2026.

It was clear from the first few weeks of testing that the system is not only ineffective but may actually help sexual predators. Schlep, the YouTuber banned from Roblox after he and his team got six sexual predators arrested this year alone, analyzed the existing features and rejected them outright.

"Verifying users' ages with a selfie and dividing them into age groups will do more harm than good," he said in a TikTok. The underlying philosophy of the update is to segment users so that those aged 9-12 can only talk to their peers, and the same goes for the 13-16 and 16-18 brackets.

The main problem is that, currently, it's quite easy to use other people's photos and similar tricks either to appear older (and bypass age restrictions) or to be registered as a minor. In practice, this allows a malicious user to register as a 12-year-old and interact only with other children, far from the eyes of moderators and the "good guys" who spend their days patrolling shared spaces on the lookout for child predators.

This system doesn't just apply to text chat: matchmaking for online experiences will use age brackets as a shortcut for pairing players. This means that if a predator manages to trick the system into registering them as a 13-year-old, Roblox will do the work for them by placing them in a lobby with other 13-year-olds.

The catch is that, currently, Roblox's AI decision is final, so if a child is misidentified as an adult by the system, there's nothing they or their parents can do. The same goes the other way around, and there are already posts on Reddit from people selling their under-14 accounts after the AI put them in the wrong bracket.

The conclusion is the same as in every in-depth article I've dedicated to Roblox: if you know minors who use the platform extensively, try to convince them to abandon it as quickly as possible. The company doesn't care about the safety of its users, so it's parents, friends, and playmates who must take responsibility.

Thank you for reading the second weekend edition of Letter to a Gamer. This is the new look for this newsletter: one in-depth article per month on an important and under-researched topic.

Happy holidays, and see you in January with the next letter.
Riccardo "tropic" Lichene
