Tech Giants Join Forces to Shield Children from AI’s Reach
Artificial intelligence developers including OpenAI, Anthropic, Stability AI, Google, Meta, and Microsoft have pledged to strengthen safeguards that protect children during the development and deployment of AI models. The effort is spearheaded by Thorn and All Tech Is Human, two nonprofits dedicated to child safety.
Under these commitments, companies are obliged to proactively address child-safety risks and to run more rigorous checks on training data to detect child sexual abuse material and other harmful content. Developers have also vowed to harden their models against exploitation.
OpenAI representatives highlighted the company's existing efforts in this area, including limiting model capabilities to prevent the creation of harmful content, enforcing age restrictions on its applications, and collaborating with international organizations to safeguard children. The company says it is now poised to implement further changes.
As a result of the agreement, developers will likely fine-tune their language and other models rather than simply blocking prohibited outputs, a change that could affect the quality of responses to queries of all kinds.
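To illustrate the distinction the article draws, here is a minimal sketch of output-level filtering, the simpler approach the companies are expected to go beyond. It screens a model's reply with OpenAI's Moderation API before returning it; the model name and refusal message are illustrative placeholders, not anything specified in the commitments.

```python
# Sketch: gate a model's output with a post-hoc moderation check,
# as opposed to fine-tuning the model's weights themselves.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def guarded_reply(prompt: str) -> str:
    # Generate a candidate answer with a chat model.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    answer = completion.choices[0].message.content

    # Screen the candidate answer; block it if any category is flagged.
    moderation = client.moderations.create(input=answer)
    if moderation.results[0].flagged:
        return "Sorry, I can't help with that."
    return answer
```

Fine-tuning, by contrast, alters the model's weights so that it declines harmful requests on its own, which is why the article notes it can shift response quality even on unrelated queries.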