Technology Mag
    Business

    Character.AI Gave Up on AGI. Now It’s Selling Stories

By News Room | August 14, 2025 | 3 Mins Read

    “AI is expensive. Let’s be honest about that,” Anand says.

    Growth vs. Safety

    In October 2024, the mother of a teen who died by suicide filed a wrongful death suit against Character Technologies, its founders, Google, and Alphabet, alleging the company targeted her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences, while programming [the chatbot] to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover.” At the time, a Character.AI spokesperson told CNBC that the company was “heartbroken by the tragic loss” and took “the safety of our users very seriously.”

    The tragic incident put Character.AI under intense scrutiny. Earlier this year, US senators Alex Padilla and Peter Welch wrote a letter to several AI companionship platforms, including Character.AI, highlighting concerns about “the mental health and safety risks posed to young users” of the platforms.

    “The team has been taking this very responsibly for almost a year now,” Anand tells me. “AI is stochastic; it’s kind of hard to always understand what’s coming. So it’s not a one-time investment.”

    That’s critically important, because Character.AI is growing. The startup has 20 million monthly active users who spend, on average, 75 minutes a day chatting with a bot (a “character” in Character.AI parlance). The company’s user base is 55 percent female, and more than 50 percent of its users are Gen Z or Gen Alpha. With that growth comes real risk: what is Anand doing to keep his users safe?

    “[In] the last six months, we’ve invested a disproportionate amount of resources in being able to serve under 18 differently than over 18, which was not the case last year,” Anand says. “I can’t say, ‘Oh, I can slap an 18+ label on my app and say use it for NSFW.’ You end up creating a very different app and a different small-scale platform.”

    More than 10 of the company’s 70 employees work full-time on trust and safety, Anand tells me. They’re responsible for building safeguards like age verification, separate models for users under 18, and new features such as parental insights, which allow parents to see how their teens are using the app.

    The under-18 model launched last December. It includes “a narrower set of searchable Characters on the platform,” according to company spokesperson Kathryn Kelly. “Filters have been applied to this set to remove Characters related to sensitive or mature topics.”

    But Anand says AI safety will take more than just technical tweaks. “Making this platform safe is a partnership between regulators, us, and parents,” Anand says. That’s what makes watching his daughter chat with a Character so important. “This has to stay safe for her.”

    Beyond Companionship

    The AI companionship market is booming. Consumers worldwide spent $68 million on AI companionship in the first half of this year, a 200 percent increase from last year, according to an estimate cited by CNBC. AI startups are gunning for a slice of the market: xAI released a creepy, pornified companion in July, and even Microsoft bills its Copilot chatbot as an AI companion.

    So how does Character.AI stand out in a crowded market? It takes itself out of it entirely.
