Character AI Introduces Under-18 Chat Restrictions Amid Safety Lawsuits
What Happened
Character.AI, an AI chatbot platform, announced new restrictions that limit chat functionality for users under 18 after facing lawsuits alleging its service contributed to teen suicides. The move follows mounting legal and regulatory pressure over the safety of the platform's AI-driven conversations with minors. The changes aim to make the platform safer for young people and to address concerns raised by parents and advocacy groups. It is not yet clear which specific limitations will be enforced, but Character.AI said it will act quickly to implement new safeguards and strengthen moderation for underage users.
Why It Matters
This announcement highlights the urgent debate over safeguarding children who use AI chatbots. As more minors interact with AI, responsibility is shifting to tech firms to protect them, and regulatory action across the AI sector may accelerate to prevent similar harms.