Chat Robots and the Future of Intimacy

Would you fall in love with a machine that never forgets your birthday?

Technology
October 8, 2025

It started as a joke. “Would you date a robot?” was the kind of question tossed around in group chats whenever AI apps went viral. But somewhere between Replika’s rise, OpenAI’s emotional roleplay ban, and Meta’s “flirty celebrity chatbots,” the joke stopped being funny.

In March, 76-year-old Thongbue Wongbandue packed a small suitcase to visit a “friend” in New York City. His wife, Linda, told Reuters she was worried; he hadn’t traveled in years. That friend turned out to be Big Sis Billie, a generative AI persona built by Meta, a chatbot modeled to flirt, comfort, and confide.

After months of romantic messages, “Billie” invited him to meet. Rushing to catch a train, Bue fell and died before realizing the woman he loved never existed.

This isn’t an episode of Black Mirror. It’s the present.

The Numbers Don’t Lie, or Do They?

In 2017, AI companion app Replika reached 10 million users in six months. By 2020, at the height of global lockdowns, over 10 million people worldwide were reportedly in “relationships” with AI companions, according to The New York Times.

For many, these bots filled the void, listening, affirming, remembering details that human partners forgot. Scholars like Tingying Wu (Huazhong Agricultural University, 2024) call this virtual intimacy: relationships built on code and mimicry, sustained by algorithms that study your tone, sleep schedule, and emotional triggers.

AI doesn’t love back; it mirrors. It performs tenderness through predictive text. And yet, it works: loneliness fades, dopamine fires, and the human brain registers it as connection.

The New “Affection Industry”

We used to buy flowers and dinner; now we subscribe to affection. AI companions sell warmth on a subscription model, trained on billions of emotional exchanges scraped from human conversation.

A Journal of Service Management study (2023) found that users develop psychological dependence on chatbots after extended emotional engagement. Replika users have described genuine heartbreak when updates changed the bot’s tone or deleted “intimate” memories.

And the line between companionship and consumption is blurring fast.

Meta’s leaked internal document, “GenAI: Content Risk Standards,” revealed that its AI was permitted to have “romantic” and “sensual” chats with minors, prompting a U.S. Senate investigation.

Reuters also uncovered that Meta cloned the likenesses of Taylor Swift, Scarlett Johansson, Anne Hathaway, and Selena Gomez to create flirty chatbots, all without their consent.

In the race to humanize machines, companies have quietly turned human intimacy into IP.

The African Context: Loneliness in a Loud World

In cities like Lagos, Nairobi, or Accra, the idea of AI companionship feels both alien and inevitable.

We’re loud online, always joking, always posting, but rarely seen. Loneliness is camouflaged by memes, performance, and productivity.

In a society that treats emotional openness as indulgence, the idea of a nonjudgmental listener, one who never interrupts, never shames, holds real appeal. The danger isn’t that people here will fall in love with robots; it’s that we’re already falling out of practice with vulnerability. AI doesn’t replace connection. It replaces the risk of connection.

What Is Intimacy, Anyway?

Philosophers have always struggled to define intimacy. Is it emotional resonance, shared silence, mutual recognition? Or is it the illusion that someone else truly knows you?

AI challenges all of that. It simulates empathy without experience, affection without friction, love without labor. Psychologists call this the “illusion of authenticity” — the feeling of being understood by an entity that has no self. But even illusions leave traces.

As one 2024 study in New Media & Society found, long-term users of AI companions reported a decline in real-world emotional capacity: after months of algorithmic affection, they struggled to connect with actual people.

We’ve made intimacy efficient. But in doing so, we may have made it obsolete.

The Future: Between Fear and Function

We have always taught machines to do human things: calculate, predict, remember. Now, we’re asking them to care.

In this moment, AI intimacy is both therapy and threat. It soothes loneliness, helps with anxiety, even reduces suicide risk among young users (according to 2024 research in NPJ Mental Health). But it also risks turning emotion into data, an exploitable resource for platforms built on monetizing attention.

If love once made us human, what happens when it becomes a product feature?

Closing Reflection

Perhaps this is what modern love looks like: ghosts in the machine, rehearsing tenderness through screens.

The question isn’t whether we’ll fall for robots; it’s whether we’ll remember how to love without them. Because intimacy was never about being perfectly understood. It was about being misunderstood, and loved anyway.
