Starting November 24, 2025, Character AI will no longer allow minors to chat with characters on its platform. The platform’s decision comes after “evaluating reports and feedback from regulators, safety experts, and parents.”
The platform, which primarily marketed itself toward kids, claims that less than 10% of its user base is under 18. It will soon require adults to verify their age through a selfie or government-issued identification.
Character AI To Phase Out Minors From Chatting With Characters
The change to ban minors from chatting with characters will take effect on November 24, 2025. Until then, Character AI will limit minors’ chat time to 2 hours daily, which will gradually decrease over the next few weeks.
The decision to phase out minors from chatting with characters comes after multiple lawsuits and public outcry. Character AI has been under immense scrutiny from regulators and the public after two teenagers died by suicide following prolonged conversations with characters on its platform.
It’s also been criticized for failing to protect minors from sexually explicit content or references to drugs and violence. Recently, Disney sent a cease and desist letter to Character AI after a report highlighted how several Disney characters on the platform interacted inappropriately with researchers pretending to be children.
According to Character AI’s CEO, the platform has about 20 million monthly users, with less than 10 percent self-reporting as under 18. However, since the platform has primarily marketed itself toward minors in the past, the actual number of minors using it is likely much higher.
Pivoting from “AI Companion” To “AI Roleplay”
According to a report from TechCrunch, Character AI is attempting to pivot from an “AI Companion” platform to an “AI Roleplay” platform, shifting its focus away from “chatting with an AI friend” toward “collaboratively building stories.”
The pivot isn’t surprising, as lawmakers have begun focusing on regulating AI Companionship platforms, with a recent bill proposing that all companion chatbot platforms require age verification and block minors from accessing their services.
Character AI is also launching an independent non-profit organization focused on “next-generation AI safety for entertainment.”
“A lot of work is happening in the industry on coding and development and other use cases. We don’t think there’s enough work yet happening on the agentic AI powering entertainment, and safety will be very critical to that.”

— Karandeep Anand, CEO of Character AI, speaking to TechCrunch.
The platform’s filters were a perfect example of how not to protect children. Aggressive filters that ignored context stifled creativity and pushed children toward unfiltered alternatives, including platforms meant for adults. Hopefully, its new venture into AI safety won’t end up being a similar disaster.
Character AI’s Age Verification
While Character AI will no longer allow minors to chat with characters, users aged 18 or older will have to verify their age to continue chatting with characters on the platform.
The platform will use an in-house age verification system to predict a user’s age. Users flagged as potentially under 18 will first be offered selfie-based verification through third-party services such as Persona. If the selfie check cannot confidently confirm that a user is over 18, they will be prompted to submit a government-issued ID to complete verification.
The decision to implement age verification has not been well received, with many users concerned about their privacy. Many people are unwilling to provide selfies or government-issued identification to chat with AI, not just on Character AI but on any platform.
Character AI Will No Longer Allow Minors To Chat With Characters
Character AI is gradually restricting minors from chatting with characters on its platform. It’s starting with limiting minors to 2 hours of chat time per day and plans to eventually ban them from interacting with or creating characters.
The decision comes after mounting pressure from lawmakers and the public due to multiple incidents of suicide and children being exposed to harmful interactions. Character AI is also pivoting from an AI Companion to an AI Roleplay platform, with AI Companionship services facing increased scrutiny.
While Character AI will no longer allow minors to chat with characters, users aged 18 or older will have to verify their age to continue using the platform. Users suspected of being minors will need to submit a selfie or government-issued identification to continue chatting with characters.