TECHNOLOGY: Roblox introduces age-appropriate access — we nuked it

Monday, April 20, 2026

https://techcrunch.com/2026/04/13/roblox-introduces-kids-and-select-accounts-for-age-appropriate-access-to-games-and-chat/

Roblox introduces ‘Kids’ and ‘Select’ accounts for age-appropriate access to games and chat
Aisha Malik
5:15 AM PDT · April 13, 2026

*** begin quote ***

The move comes as Roblox implemented mandatory age checks in January for all users who want to access chats. The same age-check technology will be used to assign users to these new accounts. As part of the changes, users who haven’t completed age checks will only be able to play a selection of games rated for younger audiences.

*** end quote ***

We “solved” the Roblox problem by nuking it. Argh!

I don’t think we have to wait for app developers to fix their apps. Roblox was a “clear and present danger,” and its developers were too slow to fix it. So we did. No need for Gooferment diktats! Parents just need to take control.

Argh!

— 30 —


TECHNOLOGY: What’s the boundary between “backdoor for government surveillance” and crime prevention?

Friday, December 13, 2024

https://techcrunch.com/2024/12/08/apple-sued-over-abandoning-csam-detection-for-icloud/

Apple sued over abandoning CSAM detection for iCloud
Anthony Ha
10:26 AM PST · December 8, 2024

*** begin quote ***

Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).

The lawsuit argues that by not doing more to prevent the spread of this material, it’s forcing victims to relive their trauma, according to The New York Times. The suit describes Apple as announcing “a widely touted improved design aimed at protecting children,” then failing to “implement those designs or take any measures to detect and limit” this material.

Apple first announced the system in 2021, explaining that it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users’ iCloud libraries. However, it appeared to abandon those plans after security and privacy advocates suggested they could create a backdoor for government surveillance.

*** end quote ***

In this case, I think the risk of a “backdoor for government surveillance” is outweighed by the value of preventing child abuse. IMHO.

I’d be pushing the tech companies hard in the war against children’s exploitation.

Maybe even their devices (e.g., iPhones, Android phones, tablets) should use digital signatures from the National Center for Missing and Exploited Children.
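The basic idea of matching against a known-signature list can be sketched roughly like this. This is a minimal illustration under simplifying assumptions: it uses exact cryptographic hashes, whereas real deployments (e.g., Microsoft’s PhotoDNA, or the NeuralHash system Apple proposed) use perceptual hashes that survive re-encoding and resizing. The hash values and function names below are placeholders for illustration, not real NCMEC data or any vendor’s API.

```python
import hashlib

# Hypothetical set of known digest values (illustrative placeholders only;
# in a real system, hash lists would be supplied by NCMEC, not hardcoded).
KNOWN_HASHES = {
    # SHA-256 of an empty byte string, used here purely as a stand-in entry.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_hash(data: bytes) -> bool:
    """Return True if the exact SHA-256 digest of `data` is in the known list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
```

Note that exact hashing like this only catches byte-identical copies; the whole point of perceptual hashing in the real systems is to match content even after it has been recompressed or slightly altered.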

— 30 —