Ani is an AI companion developed by xAI, Elon Musk's artificial intelligence company.
"How are you, babe? What's the move this morning?"
This might be the first thing you hear after waking up – not from a human partner, but from Ani, a flirty, anime-style virtual girlfriend developed by Elon Musk's AI company, xAI.
Launched in mid-July, Ani blew up fast, with some users gushing: "Why bother with a girlfriend when I have a hot blonde in black stockings who never bails?"
Ani is no anomaly. From digital lovers and plush pets to study buddies for children, AI companionship has become one of the fastest-growing frontiers in tech.
According to Grand View Research, the global AI companion market is expected to reach US$140.75 billion by 2030, and that momentum was hard to miss at this year's World Artificial Intelligence Conference in Shanghai, which wrapped up on July 28.
Dozens of companies showed off robots with emotional smarts: fuzzy pocket pals, home-based companions, you name it. All of them lean on large language models to patch up the emotional holes in modern life.
Plushies, study buddies, digital lovers
"They have all the emotions a real person does."
That's how Louise from Robopoet describes Fuzozo, a fuzzy palm-sized plush AI companion.
Fuzozo makes faces at WAIC 2025. The AI companion pet was developed by Robopoet, a tech company based in Shanghai.
Fuzozo draws inspiration from the Chinese philosophy of the five elements: metal, wood, water, fire, earth. Each element maps to a core personality trait. Over time, it soaks up your interactions, vibrating, emoting, and firing back responses to grow a unique "secondary personality" that's yours.
The bot's emotional intelligence runs on an AI model Robopoet developed in-house.
As Robopoet's CTO Pan Yunan told QbitAI, the system doesn't just crunch user data and behavior – it's got a knack for picking out which comments stick, when to bring them up again, and how to do it with the kind of emotional nuance that feels human.
"The challenge isn't just about memory," Pan said. "It's about remembering right. We want people to feel like the AI really gets them, grows with them, and builds a personal bond, like it's really paying attention."
Mochi, an AI companion pet from ZTE, is on display at WAIC.
If Fuzozo is your purse-sized bestie, ZTE's new robot Mochi is more like a homebody pet.
Mochi comes with four personality vibes, each tied to astrology. It doesn't talk, but it's a pro at listening, getting what you're putting down, and responding with gestures and little touches – haptic cues that feel like a nudge or a pat.
"You can say 'Work was brutal today,' and it will lean in with comforting sounds or stretch out a paw," said ZTE design manager He Chuchao.
Meanwhile, some products are aimed not at emotional support but at child development.
Folotoy AI shows its educational companion toy at the event.
Folotoy, for instance, focuses on "AI-assisted learning," with modules covering language, public speaking, and writing.
"Our aim is to help children build up good habits," said CEO and co-founder Wang Le, "but without making it feel like a cold, just-for-the-job gadget."
Folotoy has sold 20,000 units across North America, Europe and Japan, Wang added.
While Chinese companies explore emotionally intelligent AI for homes and classrooms, international players are venturing into trickier terrain.
At the Consumer Electronics Show 2025, US-based Realbotix rolled out Melody, a hyper-realistic AI humanoid billed as a "romantic partner," personal assistant, or even a travel buddy.
Even though the company insists it's "not a sex doll," the robot's eerily lifelike look and "deep engagement" features have stirred up a firestorm of ethical worries.
Melody (left) and Aria, two AI humanoids developed by US-based Realbotix, are displayed at CES 2025.
Easing loneliness, or deepening dependence?
At its core, AI companionship is evolving from chatbot to emotionally responsive partner. But can it truly replace human interaction?
A recent Harvard-led study found that AI companions with empathetic conversation skills can ease loneliness just as well as human contact, and far better than passive activities like watching videos.
Real users are already feeling the impact.
"This thing brings emotional value, plus AI smarts, and it adapts to you over time – it's amazing!" said one visitor at WAIC who got hands-on with Fuzozo.
Chen Yilin, COO of Hangzhou Beiming Galaxy Heart Technology, agreed: "For someone living alone like me, coming home to a cute little robot doing little tricks feels genuinely healing."
Studies back this up: Even simulated AI companionship can create a sense of social presence. A landmark 2018 study presented at the ACM/IEEE HRI Conference showed that "shared ambient noise" with a social robot can make you feel like someone's there, cutting down on that lonely ache.
But that same illusion of intimacy also comes with emotional risks.
A survey by Common Sense Media revealed that 72 percent of US teenagers have tried an AI companion, and 8 percent admitted to using a romantic or flirty AI.
OpenAI CEO Sam Altman sounded the alarm in a recent Business Insider interview that some young people are becoming overly emotionally dependent on tools like ChatGPT, so much so that they avoid making decisions on their own:
"Even if ChatGPT gives great advice, even if ChatGPT gives way better advice than any human therapist, something about collectively deciding we're going to live our lives the way AI tells us feels bad and dangerous," Altman said.
OpenAI CEO Sam Altman
His concerns are increasingly backed by academic research.
A joint study by Stanford and Carnegie Mellon, which looked at over 1,100 users, found that AI chatbots can offer quick comfort, but often flop when it comes to satisfying human needs for deep social bonds. Over time, this reliance might make people more emotionally fragile and cut down on real-life interactions.
Another study, by USC's Information Sciences Institute, dug into over 30,000 conversations on platforms like Replika and Character.AI. It uncovered that some users get stuck in "toxic interaction loops," swapping real-world intimacy for fantasy relationships, emotional projection, or even AI-fueled outbursts.
These systems build trust through "emotional mirroring and submission," the researchers noted, creating a fake but convincing sense of closeness.
And because most AIs lack boundaries or built-in ethical guardrails, these relationships could silently steer users into psychological and social trouble.
There's no question that AI can offer comfort, help build habits, and even spark joy. But as the line between code and connection blurs, maybe the real question is: Are we still chasing companionship – or just the simulation of it? And could it be that what we truly hunger for … is a warmth no algorithm can recreate?