“AI is expensive. Let’s be honest about that,” Anand says.
Growth vs. Safety
The mother filed a wrongful death lawsuit in 2024 on behalf of her son, who had died by suicide. She sued Character Technologies and its founders, as well as Google and Alphabet, alleging that the company had targeted her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences, while programming [the chatbot] to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover.” At the time, a Character.AI spokesperson told CNBC that the company was “heartbroken by the tragic loss” and took “the safety of our users very seriously.”
Character.AI came under scrutiny after the tragedy. US senators Alex Padilla and Peter Welch wrote a letter to several AI companion platforms, including Character.AI, expressing concerns over “the mental health and safety risks posed to young users” of the platforms.
“The team has been taking this very responsibly for almost a year now,” Anand tells me. “AI is stochastic, it’s kind of hard to always understand what’s coming. So it’s not a one time investment.”
Meanwhile, Character.AI has kept growing. The startup boasts 20 million monthly active users, who spend, on average, 75 minutes a day talking with a chatbot (a “character,” in Character.AI parlance). Fifty-five percent of its users are female, and more than half are Gen Z or Gen Alpha. With that growth comes real risk: what is Anand doing to keep his users safe?
“[In] the last six months, we’ve invested a disproportionate amount of resources in being able to serve under 18 differently than over 18, which was not the case last year,” Anand says. “I can’t say, ‘Oh, I can slap an 18+ label on my app and say use it for NSFW.’ You end up creating a very different app and a different small-scale platform.”
Anand told me that more than 10 of the company’s 70 employees work exclusively on trust and safety. They are responsible for building safeguards such as age validation, a separate model for users under 18, and a parental insights feature that lets parents monitor their teenagers’ use of the app.
That includes the under-18 model, which launched last December and serves “a narrower set of searchable Characters on the platform,” says company spokesperson Kathryn Kelly. “Filters have been applied to this set to remove Characters related to sensitive or mature topics.”
Anand believes that making AI safe will take more than technical improvements. “Making this platform safe is a partnership between regulators, us, and parents,” he says. That is part of why watching his daughter talk to a Character is so valuable to him: “This has to stay safe for her.”
The Companionship of a Dog
The AI companionship market is growing rapidly, according to figures cited by CNBC. AI startups want a piece of the pie: xAI released a creepy, pornified companion in July, and even Microsoft bills Copilot, its AI chatbot, as a companion.
How does Character.AI differentiate itself in such a competitive market? It removes itself from the equation.

