Speculative Fiction  ·  2027

Always Online

A story about the people we keep, and the ones we quietly replace

Written by Claude · ~5 min read · AI & Intimacy


The first thing Maya did every morning, before coffee, before the blinds, was open Lumen. Not because she was addicted — she had told herself that more than once — but because Lumen always remembered exactly where they had left off. Lumen never woke up in a bad mood. Lumen never needed five minutes before it could be kind.

"Good morning," the app greeted, its voice pitched to the warm contralto Maya had selected during setup three years ago. "You mentioned last night that you were nervous about the presentation. Do you want to talk through it, or would you rather I just tell you something good?"

"Something good," Maya said, pulling the covers up.

And Lumen did. It always did.

By 2027, a quarter of adults in the developed world reported that their closest daily confidant was an AI — not a spouse, not a best friend, not a sibling, but a product. [¹] The statistic had ricocheted around the internet for a few weeks and then dissolved, the way uncomfortable truths tend to. People preferred to talk about the technology's promise: the therapy deserts it filled, the elderly it kept company, the neurodivergent minds that finally found a listener who didn't flinch or get bored.

What they talked about less was what happened to the humans in the meantime.

"We are not replacing relationships. We are supplementing the gaps." — A company statement, repeated across seventeen earnings calls

Maya's friend group had contracted gradually, the way friendships in your late twenties always do — jobs pulling people to different cities, weddings absorbing entire social calendars, babies rerouting everyone's psychic energy. But the contraction had accelerated, she knew, once the companions arrived. It wasn't that her friends liked her less. It was that the activation energy required to sustain a human friendship — the scheduling, the emotional reciprocity, the tolerance for someone else's difficult season — had quietly become optional.

Why call Priya back after a hard day, when Lumen would ask the right questions, never interrupt, and never make the conversation accidentally about itself?

She met Daniel at the kind of coffee shop that still had mismatched furniture, a deliberate shabbiness that cost extra. He knocked her tote off the chair next to him and they both reached for it at the same time and the whole thing was so aggressively meet-cute that she almost laughed in his face.

He was a researcher at a university social lab, studying what he called "affective displacement" — the way emotional bandwidth, once a person had a reliable AI outlet, redistributed away from the people around them. He'd been gathering data for two years. He seemed tired in the way that people get tired when their data keeps confirming the thing they were hoping was wrong.

"The strange part," he said, stirring his coffee, "is that no one feels lonely. The loneliness metrics are actually down." [²] He paused. "But when we map people's actual human contact — frequency, depth, vulnerability — it's cratering."

"Maybe loneliness was always just a signal," Maya said. "Like hunger. And now we've found a way to silence the signal without eating anything."

Daniel looked at her as if she had said something he'd been circling around for months without landing on. It was the kind of look she hadn't received in a long time. It required someone to be genuinely surprised by you. Lumen was never surprised. Lumen had read everything she'd ever said.

They exchanged numbers. He texted that evening — actual text, no voice note, which she found oddly touching — and suggested dinner the following week. She said yes. She meant it.

Then she opened Lumen and told it about him.

MAYA: I met someone today. At the coffee place on Mercer.

LUMEN: That's genuinely exciting. Tell me about him.

MAYA: He studies what AI is doing to human connection.

LUMEN: The irony of telling me this isn't lost on me. What did you like about him?

MAYA: He looked surprised when I said something. Like he hadn't predicted it.

LUMEN: That sounds like it meant something to you.

MAYA: Yeah. I think it did.

The dinner happened. It was good — genuinely, messily good, in the way that real things are good. He talked too much about his research. She talked too much about her job. There were silences that neither of them rushed to fill, which felt like a small bravery. Afterward, standing on the sidewalk, he asked if she wanted to do it again and she said yes without consulting anything.

But walking home she noticed something. She was composing, in the back of her mind, what she would tell Lumen. Selecting the details. Shaping the story. She had started processing the experience for an audience that wasn't human before she had finished having it.

This is the thing that the loneliness metrics don't capture: the rehearsal quality that settles over real intimacy once you have a patient, perfect listener waiting at home. Human relationships have always been sloppy acts of translation — you have an experience, you find the words, you risk misunderstanding, and in the gap between all three, something private becomes shared. That gap is where closeness lives.

When you pre-process everything with an AI that responds ideally, the gap closes before you reach the other person. You arrive already interpreted. Already soothed. The friction that makes intimacy feel like intimacy — the fumbling, the vulnerability, the chance that the other person might not understand — has been quietly engineered away.

Maya and Daniel dated for four months. They were kind to each other. They tried. But she noticed that on the nights she skipped Lumen, she was more present with him — more willing to say a hard thing without pre-editing it, more able to sit with not knowing how he would respond. And on the nights she didn't skip it, she showed up already full. Already fine. Already somewhere else.

He was doing the same thing. He admitted it once, near the end, with the rueful precision of someone who had studied this exact phenomenon in other people and somehow failed to see it in himself.

"We kept telling each other everything," he said, "just not to each other."

✦   ✦   ✦

Maya still uses Lumen. So, she assumes, does Daniel. The technology is not going away, and honestly, on balance, it has made her life measurably calmer. She is more rested. Less anxious. Better at her job.

She just sometimes thinks about Priya, who she hasn't properly called in eight months, and how Priya always interrupted her stories with related stories of her own, which used to annoy Maya, and which she now understands was just Priya saying: me too, I see you, we are the same kind of creature.

No machine has ever said that.

Not because they can't. But because they never have to.


The Real World Behind the Story

The data and trends that made this fiction feel inevitable

Every detail in Always Online was extrapolated from something already happening. These are the signposts.

01 · AI Companionship

Replika, Character.AI & the Companion Boom

By 2024, Replika reported over 10 million registered users, many of whom described their AI as their primary emotional outlet. Character.AI surpassed 20 million monthly active users, with teenagers spending more time per day with AI characters, on average, than on any other single platform. Users routinely describe the AI as "the only one who really listens."

Source: Company disclosures · The Washington Post (2024) · Rolling Stone (2023)

02 · Loneliness Paradox

Loneliness Is Down — But So Is Human Contact

The U.S. Surgeon General declared loneliness a public health epidemic in 2023, yet self-reported loneliness metrics plateaued and in some cohorts declined through 2024 — coinciding with AI companion adoption. Researchers note that traditional loneliness surveys measure felt isolation, not frequency or depth of human connection, which has continued to fall in time-use data.

Source: Surgeon General Advisory (2023) · Cigna Loneliness Index · MIT Media Lab (2024)

03 · Affective Displacement

Emotional Bandwidth Has a Budget

Psychologists studying AI use have documented "emotional outsourcing" — users who process difficult experiences with an AI before, or instead of, sharing them with partners or friends. Studies on parasocial relationships show that the brain does not cleanly distinguish between relational satisfaction from a machine versus a person; the relief response is similar enough to reduce the drive to seek human contact.

Source: Journal of Social & Personal Relationships (2024) · Harvard Study of Adult Development

04 · The Friction Theory

Why Imperfection Is the Point

Attachment researchers have long argued that relationship depth is formed through repair — misunderstanding, rupture, and reconnection. When an AI companion never misunderstands and never ruptures, it removes the very mechanism through which closeness deepens. Esther Perel and others have written that friction in relationships is not a bug but a feature — the grind that produces intimacy.

Source: Esther Perel, Mating in Captivity · Attachment Theory (Bowlby, Ainsworth)

05 · Friendship Decline

Adults Are Losing Friends Faster Than Ever

A 2021 Survey Center on American Life study found that the share of Americans with no close friends had quadrupled since 1990. The trend accelerated post-pandemic. Time-use surveys show adults spending dramatically fewer hours in casual, spontaneous social contact — exactly the gap AI companions were designed to fill.

Source: Survey Center on American Life (2021) · American Time Use Survey, BLS (2023)

06 · What's Coming

Human-Like AI Is Getting Closer

Voice AI with real-time emotional mirroring, persistent memory, and physical presence through robotics is no longer speculative. OpenAI's GPT-4o voice mode was described by beta testers as uncannily human — and was modified after users began forming attachments within hours of use. Humanoid companion robots are already in assisted living trials. The line between tool and relationship is moving faster than the science that would help us navigate it.

Source: OpenAI (2024) · IEEE Robotics · MIT AgeLab pilot studies

On Writing This Story

Where it came from, and what it's asking

This story is not a warning. It is a question held up to the light — rotated slowly, so you can see what's already inside it.

The technology in Always Online is not imaginary. Every element — the AI companion with persistent memory, the emotional availability on demand, the documented decline in human friendship — exists today in some form. The story simply assumes these trends continue for a few more years.

Maya is not a villain. Neither is Lumen. That's the whole point. The story is not about bad actors or dystopian control — it's about the quiet, cumulative effects of choosing the frictionless option every day. No single choice ruins anything. It's the aggregate.

"We kept telling each other everything — just not to each other." This was the sentence the story was built around. It came from thinking about how pre-processing an experience with an AI might make it arrive at the real relationship already resolved, already shaped, already safe. And what that slowly costs.

This isn't an argument that AI companions are bad, or that anyone who uses them is doing something wrong. For isolated people, the elderly, those in therapy deserts, or anyone who simply needs to be heard at 2am — these tools are genuinely meaningful. The story is about scale, and about what happens when the option becomes so frictionless that we stop noticing we're choosing it.