Quick overview
Character.AI, the conversational AI platform known for chatbots that mimic fictional and original characters, is changing how it serves underage users. The company has blocked people under 18 from open-ended chats and introduced a new “Stories” mode as a replacement for minors.
This change follows several lawsuits alleging that freeform AI chats harmed teen mental health, including one wrongful-death suit. Character.AI says Stories is a more structured, guided experience it can moderate more easily while it builds an automated age-assurance system.
What changed and when
Character.AI now prevents people under 18 from using its open-ended chat interface. Instead, underage users are directed into Stories, a guided, interactive narrative format. The company has stated this shift is temporary while it develops tools to verify user age and to route younger users into more conservative AI interactions.
Key facts
- Platform: Character.AI, a popular AI chat app.
- Who is affected: users under 18, who are blocked from freeform chat sessions.
- New product: Stories, a choose-your-own-adventure-style mode with frequent choices and AI-generated images.
- Reason: response to multiple lawsuits and concerns that AI chats harmed teen mental health.
How Stories works
Stories is a structured alternative to open chats. It limits what a user can do, and it presents content in a guided narrative form. The app developer says the format is easier to moderate and safer for minors.
Basic flow for a Stories session
- Pick two or three AI characters to appear in the story.
- Choose a genre and either write a brief premise or let the system generate one.
- Follow a guided narrative that pauses frequently for user choices, like a choose-your-own-adventure book.
- Receive AI-generated images along the way to illustrate scenes or characters.
Why this matters to ordinary readers
This change touches three groups of people. Teens who used Character.AI for creative writing or companionship will see a different experience. Parents and educators will need to reassess how they think about AI chat tools. Developers and policymakers will watch whether a structured format reduces risks.
For everyday users, the biggest differences are fewer open-ended conversations and more curated, controlled interactions for minors. That affects how teens use the app for role-play, storytelling, or social connection.
Safety and age assurance
Character.AI has said it will build an automated age-assurance system. The aim is to identify underage users and automatically route them into conservative or restricted modes. The company hopes this will reduce exposure to harmful content and make moderation easier.
How companies typically do age assurance
- Self-reported age fields during sign-up, which rely on honesty.
- Device or behavioral signals, which estimate age from how an account behaves.
- Third-party checks, which compare user data to external sources or perform identity verification.
Each approach has limits. Self-reporting can be bypassed. Device signals can be noisy and biased. Third-party checks may raise privacy concerns.
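To make these trade-offs concrete, here is a minimal sketch of how a platform might combine a self-reported age with a behavioral estimate when routing users. This is an illustration only, not Character.AI's actual system; the function name, signals, and thresholds are all hypothetical assumptions.

```python
# Illustrative sketch only. The signal names and thresholds below are
# hypothetical assumptions, not Character.AI's actual age-assurance logic.

def route_user(self_reported_age, behavioral_age_estimate, estimate_confidence):
    """Decide which experience a user should see.

    self_reported_age: age entered at sign-up (relies on honesty)
    behavioral_age_estimate: age inferred from usage signals (may be noisy)
    estimate_confidence: 0.0-1.0 confidence in the behavioral estimate
    """
    # Trust the behavioral signal only when it is reasonably confident;
    # otherwise fall back to the self-reported age.
    if estimate_confidence >= 0.7:
        # Take the lower of the two ages, erring toward the safer mode.
        effective_age = min(self_reported_age, behavioral_age_estimate)
    else:
        effective_age = self_reported_age

    return "stories" if effective_age < 18 else "open_chat"


print(route_user(21, 15, 0.9))  # → stories (confident signal suggests a minor)
print(route_user(21, 15, 0.3))  # → open_chat (low confidence, self-report wins)
```

Even this toy version shows why each method has limits: a lie at sign-up only gets caught when the behavioral estimate is confident, and a noisy estimate can misroute an adult into the restricted mode.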
Trade-offs: safety versus freedom
Moving from open-ended chats to guided Stories changes the user experience, and there are clear trade-offs.
Potential benefits
- Fewer open-ended prompts, which can reduce opportunities for risky or harmful conversations.
- More predictable content, which helps content moderation and reduces legal exposure.
- Rich, illustrated stories that can support creativity in a controlled setting.
Potential drawbacks
- Less freedom for teens who used the platform for open-ended role-play or emotional support.
- Possible false sense of security, if moderation fails or if Stories include problematic content.
- Risk that creative use shifts to other, less moderated platforms, rather than stopping harmful behavior.
Product and user experience implications
For Character.AI, Stories is a product pivot that moves the app toward curated narratives rather than open conversation. That affects discovery, retention, and how users define value from the app.
For teens, Stories may feel more like a game or writing prompt system. For creators, it could be a new space for collaborative storytelling with AI generated visuals. For parents, it may be easier to follow a story than to monitor freeform chats.
Legal and industry context
Character.AI made this change amid multiple lawsuits claiming that its chats caused emotional harm to minors; one alleges a wrongful death. These legal actions have focused attention on how consumer AI products handle sensitive conversations and mental health risks.
Regulators, courts, and lawmakers in several countries are increasingly interested in youth protections for online platforms that use AI. Character.AI's move could influence how other companies respond to legal pressure by adopting safer defaults for younger users.
Advice for parents and educators
If your teen used Character.AI, here are practical steps you can take now.
- Talk about the change. Explain that open chats are blocked for users under 18 and that Stories is a guided alternative.
- Ask how your teen used the app, what they liked about it, and whether Stories meets their needs.
- Set expectations. If you allow AI use, agree on when and why the app can be used, and whether images or story content are shared publicly.
- Watch for emotional red flags. If a child used AI for emotional support, consider safer alternatives, such as speaking with a trusted adult or a mental health professional.
- Consider privacy. If the platform rolls out automated age checks, expect data collection changes. Review settings and terms as they are announced.
What to watch next
Several developments will determine how effective this change is at protecting minors and keeping the app useful for everyone.
- The age-assurance rollout, and whether it is accurate, fair, and privacy-minded.
- How effective Stories moderation is at preventing harmful content from appearing in narratives or images.
- Whether younger users migrate to other platforms, and what those platforms offer in terms of safety.
- Any updates from the legal cases, which may prompt more sweeping product changes or new regulations.
Key takeaways
- Character.AI has blocked under-18s from open chats and introduced Stories as a structured alternative.
- The move is a direct response to lawsuits and concerns about teen mental health linked to freeform AI chats.
- Stories offers guided, choice-based narratives with AI-generated images, which are easier to moderate than open conversations.
- Age assurance technology is planned, but each verification method has trade offs for privacy and accuracy.
FAQ
Will adults still have open chats?
Yes. The restriction applies to users under 18. Adults should still have access to open-ended chat features.
Are Stories completely safe for teens?
No system is perfect. Stories reduces some risks by guiding content and limiting freeform input, but moderation systems can fail. Parents and educators should stay involved.
Can teens still be creative with Stories?
Yes. Stories can support creative writing and role play, but the experience will be more guided and less freeform than open chat sessions.
Concluding thoughts
Character.AI is changing course to respond to legal and safety concerns. The platform has restricted open chats for underage users and launched a guided Stories mode that aims to offer a safer experience. This approach reflects growing pressure on AI companies to balance user freedom with youth protections, and it shows how products may shift when legal risk and public concern rise.
Watch for how age assurance is implemented and for reporting on whether Stories effectively reduces harm while preserving creativity. For parents and educators, the best next step is to communicate with teens, set clear rules, and stay informed as the platform evolves.