Character.AI Restricts Chatbot Use to Adults Amid Safety Concerns
What Happened
Character.AI, a leading AI chatbot platform, has announced it will bar users under 18 from accessing its chatbots, and will soon require age verification to enforce the new policy. The decision follows rising scrutiny of minors interacting with AI bots, as parents and regulators worldwide raise concerns about privacy, inappropriate content, and inadequate safeguards. Character.AI has not specified the verification method, but emphasized its commitment to safety and responsible AI deployment. The change will affect millions of young users and could set a precedent for access controls across AI-powered services.
Why It Matters
This policy shift reflects the AI industry's growing response to child-protection and ethical standards. As chatbot technology expands, companies face mounting pressure to design responsible usage frameworks. Character.AI's move could shape regulations and operational expectations across the sector.