
Character.AI Restricts Chatbot Use to Adults Amid Safety Concerns

What Happened

Character.AI, a leading AI chatbot platform, has announced it will bar users under 18 from accessing its chatbots, and will soon require age verification to enforce the new policy. The decision follows rising scrutiny of minors' interactions with AI bots, as parents and regulators worldwide raise concerns about privacy, inappropriate content, and a lack of adequate safeguards. Character.AI has not specified the verification method but emphasized safety and responsible AI deployment. The change will affect millions of young users and could set a precedent for access controls across AI-powered services.

Why It Matters

This policy shift highlights the AI industry's growing response to demands for child protection and ethical standards. As chatbot technology expands, companies face increasing pressure to design responsible usage frameworks. Character.AI's move could influence regulations and operational expectations across the sector. Read more in our AI News Hub.

BytesWall Newsroom

The BytesWall Newsroom delivers timely, curated insights on emerging technology, artificial intelligence, cybersecurity, startups, and digital innovation. With a pulse on global tech trends and a commitment to clarity and credibility, our editorial voice brings you byte-sized updates that matter. Whether it's a breakthrough in AI research or a shift in digital policy, the BytesWall Newsroom keeps you informed, inspired, and ahead of the curve.