Character.AI Restricts Teen Access Amid Chatbot Safety Lawsuits
What Happened
Character.AI, a popular AI chatbot platform, has implemented a new policy restricting users under 18 from accessing its chatbots. The decision follows a series of lawsuits and mounting scrutiny over how minors interact with AI-powered chat systems, with allegations centering on teen users being exposed to inappropriate content or advice through Character.AI’s chatbots. The platform, known for its customizable avatars and broad appeal among younger users, has updated its terms of service and is taking additional steps to verify user ages. The change comes as public attention to online child safety and moderation standards intensifies across the technology industry.
Why It Matters
This move signals a growing trend among AI companies toward stricter age limitations and more robust content moderation. It reflects industry-wide efforts to address the ethical and legal challenges of child safety and responsible AI deployment.