Google, OpenAI, Roblox, and Discord have banded together for some collective good: the four companies recently formed a new non-profit organization aimed at improving child safety online.
The initiative, known as Robust Open Online Safety Tools (ROOST), will work to make core safety technologies more accessible to companies while providing free, open-source AI tools for identifying, reviewing, and reporting child sexual abuse material (CSAM).
Child sexual exploitation is a worldwide epidemic that continues to thrive in online spaces despite efforts to curb it. In recent years, app users and even lawmakers have urged platform chiefs like Meta’s Mark Zuckerberg to take responsibility for the harm their products enable, especially where CSAM and sexual abuse are involved.
The joint initiative comes partly in response to the ways generative AI advancements have changed online environments. Founding ROOST partner and former Google CEO Eric Schmidt says the initiative wants to address “a critical need to accelerate innovation in online child safety.”
Full details about the CSAM detection tools are not yet available. What we do know is that they will use large language models and “unify” existing options for dealing with the content.
“Starting with a platform focused on child protection, ROOST’s collaborative, open-source approach will foster innovation and make essential infrastructure more transparent, accessible, and inclusive, with the goal of creating a safer internet for everyone,” Schmidt stated.
As noted, the ROOST announcement coincides with an ongoing regulatory push on child safety across social media and online platforms. The member companies appear to be hoping that self-regulation will appease lawmakers on the warpath.
According to the National Center for Missing and Exploited Children (NCMEC), reports of suspected child exploitation rose by 12% between 2022 and 2023. As of 2020, over half of US children under 16 were on Roblox, and the company has repeatedly drawn heavy criticism for failing to tackle child sexual exploitation and exposure to inappropriate content on its platform.
The founding members of ROOST will variously provide funding, tools, or expertise to the project as it takes shape. The initiative has also said it will partner with leading AI foundation model developers to build a “community of practice” for content safeguards, including providing vetted AI training datasets and identifying gaps in safety.
The initiative will also make “tools that already exist” more accessible, effectively combining various detection and reporting technology from its member organizations into a unified solution that other companies can implement more easily.
Together with Roblox, Discord was also singled out in a 2022 social media lawsuit alleging that the platforms did nothing to stop adults from messaging children without supervision.
Naren Koneru, Roblox’s vice president of engineering, trust, and safety, has said that ROOST may host AI moderation systems that companies can integrate through API calls, though it remains somewhat vague what ROOST’s AI moderation tools will cover.
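Since ROOST has not published an API, any integration details are speculative, but the hosted-moderation pattern Koneru describes typically looks something like the following Python sketch. The endpoint URL, request fields, and response schema here are all hypothetical placeholders, not anything ROOST has announced.

```python
# Hypothetical sketch of calling a hosted moderation API.
# The endpoint, payload shape, and response fields are assumptions
# for illustration only; ROOST has not published an API.
import requests

ROOST_ENDPOINT = "https://api.example-roost.org/v1/moderate"  # placeholder URL


def check_content(text: str, api_key: str) -> dict:
    """Submit a piece of text for moderation and return the verdict."""
    response = requests.post(
        ROOST_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"content": text, "content_type": "text"},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response shape: {"flagged": true, "categories": [...]}
    return response.json()


if __name__ == "__main__":
    verdict = check_content("example message", api_key="YOUR_KEY")
    if verdict.get("flagged"):
        print("Content flagged for review:", verdict.get("categories"))
```

The appeal of this model is that a small platform gets moderation without training or hosting its own classifiers; it simply forwards content and acts on the verdict.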
On that question, Discord says its contribution will build on Lantern, the cross-platform information-sharing project it joined in 2023 alongside Meta and Google. The suite could also include an updated version of Roblox’s AI model for detecting profanity, racism, bullying, sexting, and other inappropriate content in audio clips, which the company plans to open-source this year.
It remains to be seen how the tools will intersect with existing first-line CSAM detection systems like Microsoft’s PhotoDNA image analysis tool.
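PhotoDNA works by computing a perceptual hash of an image and comparing it against a database of hashes of known abuse material. PhotoDNA’s own algorithm is proprietary and not shown here, but the general hash-matching pattern can be sketched with the open imagehash library; the known-hash value and distance threshold below are placeholder assumptions.

```python
# Conceptual sketch of hash-based image matching, the general pattern
# behind first-line tools like PhotoDNA. PhotoDNA's actual algorithm is
# proprietary; this uses the open imagehash library instead, and the
# known-hash list and distance threshold are placeholders.
from PIL import Image
import imagehash

# In a real system this would be a vetted database of known-bad hashes
# maintained by an organization such as NCMEC.
KNOWN_HASHES = [imagehash.hex_to_hash("fedcba9876543210")]  # placeholder
MATCH_THRESHOLD = 8  # max Hamming distance to count as a match (assumed)


def is_known_image(path: str) -> bool:
    """Perceptually hash an image and compare it against the known list."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two hashes yields their Hamming distance, so near-duplicate
    # images (resized, recompressed) still match.
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)
```

Unlike an exact checksum, a perceptual hash tolerates small edits, which is why this family of tools catches re-uploads of known material even after cropping or recompression.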
Aside from being part of ROOST, Discord has also released a new “Ignore” feature that lets users hide messages and notifications from selected people without those people being notified.
“At Discord, we believe that safety is a common good,” Discord’s Chief Legal Officer Clint Smith shared in the ROOST announcement. “We’re committed to making the entire internet – not just Discord – a better and safer place, especially for young people.”
At this time, the initiative has raised more than $27 million to fund its first four years of operations, with backing from philanthropic organizations including the McGovern Foundation, the Future of Online Trust and Safety Fund, the Knight Foundation, and the AI Collaborative.
The ROOST organization will also have the support of experts in child safety, artificial intelligence, open-source technology, and “countering violent extremism.” It is, in essence, the great minds of tech working together to curb child sexual abuse and the spread of abuse material around the world. And because the tools are open source, the global developer community can continuously update and improve them, so what ROOST builds will keep evolving over time.