Character.AI Bans Users Under 18 Amid Growing AI Safety Concerns
What Happened
Character.AI, the popular chatbot platform, announced it will soon ban users under the age of 18. The platform was previously open to teens and younger users, and the decision comes amid heightened concern about the suitability and safety of AI conversations for minors. The California-based company said the change is meant to strengthen protections for children and to align with industry standards on child safety. Character.AI has been widely used for both entertainment and education, but experts and parents have questioned how young people interact with its automated conversations.
Why It Matters
This move by Character.AI signals intensifying regulatory and social pressure on AI companies to verify user age and safeguard young people online. It may prompt similar changes across the AI industry, affecting how minors access conversational technologies.
Read more in our AI News Hub.