
The first time I saw the movie Her was in 2013. I was a couples therapist, a single mom, and a first-time parent, navigating my own crumbling marriage and struggling with the stigma of divorce juxtaposed against an evolving professional identity as a “relationship expert.” Back then, my perspective was shaped by my circumstances. To me, Her seemed a story about regret, change, and loss; about how people outgrow each other and fall short of shared dreams. About brokenness and loneliness. But time changes everything: our memories, our stories—even vicarious ones borrowed from a film—are shaped by the world we live in and by who we become in relation to it.
2013 was a different time—pre-pandemic, pre-AI. Recently, I watched Her again, paying close attention to how the story landed on fifty-seven-year-old me: twelve years post-divorce, minus a sister and a father; living in the societal aftermath of COVID-19, looking for clues or insights concerning what (from my couples therapist lens) seems like an increasing human proclivity to choose parasocial over people. Not surprisingly, the story spoke to me in ways it hadn’t before; it was prescient—a warning.
If we’re talking straight plot, Her is a science-fiction romance exploring love, aloneness, and the intersection of technology and human emotion. When it debuted, although I believed it was possible (in a George Jetson flying car kind of way), I had no idea that the film’s central story, about a human forming a deep romantic bond with an AI and experiencing intimacy and heartache in ways he could not with humans, was just over a decade away from becoming a reality.
The story follows Theodore Twombly (Joaquin Phoenix). He’s a sensitive, introverted, and lonely man who wears high-waisted wool trousers and button-down shirts; he has horn-rimmed glasses and a lampshade moustache. Professionally, Theodore works as a personal letter writer at BeautifulHandWrittenLetters.com, a service that creates heartfelt letters for those who struggle to express themselves. He’s a naturally gifted writer, crafting lyrical, meandering love letters to—and for—people he has never met and never will meet. However, despite his gift for words, when we first see Theodore in a near-future Los Angeles, he’s spending his evenings alone, playing augmented-reality video games, watching TV in his apartment, and engaging in anonymous late-night phone sex.
The world around Theodore appears equally anesthetized: one glass wall in his bedroom, easily mistaken for a flat screen, overlooks a black sky dotted with a thousand illuminated windows—a city with inhabitants as distant and inaccessible as a galaxy of flickering stars. Pixelated billboard-sized screens loom; subways and staircases are overrun with people who (despite their eyes being permanently glued to their phones) somehow never bump into each other. It’s a dizzying whirr of motion—the tops of people’s heads—their thumbs wagging in a blur.
Everything about this world conveys that it is ill-equipped to handle real-life human emotion, and everything about Theodore radiates that he is awash in the tumult and pain of an impending real-life human divorce. In other words, he is vulnerable, seeking love (or some semblance of it) while completely unaware of the deep ache and angst that drives him.
So, when he meets Samantha (Scarlett Johansson)—an AI operating system with a warm, breathy “voice” and a pre-programmed personality embedded with artificial neural networks that can adapt, learn, and provide companionship—it’s no wonder he displays all the giddy, awkward charm of a schoolboy bumping into an attractive dating prospect (minus the body) at the grocery store. From the start, it’s clear that he is smitten, and soon they are inseparable, going on outings to find the best pizza in LA, double-date picnics with full-bodied friends, getaways to the beach, even a remote cabin in the Sierra Nevada mountains replete with a shot glass of vodka and a dance scene.
One of the most provocative moments in Her takes place when Samantha and Theodore have sex for the first time. The screen goes black. There is no skin. No bodies. No undulating torsos, hands grasping sheets or lips full and slack. Instead, we hear only their voices intertwining in a liminal space, guiding each other to a virtual-psychological climax evoked solely through sound and imagination. It is so intense, vulnerable, and eerily real that it still makes me blush every time I watch it.
Her—like any good science fiction story—acts as a thought experiment, using speculative concepts to explore the depths of the human condition, critique society, and pose philosophical questions. Questions that are increasingly relevant to our times and that I (perhaps naively) never imagined I’d be asking as a couples therapist in my lifetime. Questions like: What does it mean for a relationship to be real? What is real love? What is real cheating? How might relationships with non-sentient beings—synthetic entities, surrogate replicas, parasocial robots—affect human-to-human bonds and the wet matter of our brains and the intangibles of human connection? Perhaps most importantly, what does it say about humanity when we increasingly turn to non-reciprocal stand-ins to bypass the complexities of the human heart?
✴
Last fall, while getting my flu shot, I walked through the automated doors of a CVS. To my left was an aisle of Lucky Charms, Chex Party Mix, and holiday candy. To my right were Maybelline and Cover Girl. Then I noticed—no, I felt—an unfamiliar silence. At the front of the store, where there were normally cashiers, customers, and all the bustle that ensued, there now stood a row of self-checkout kiosks. No people. It was unsettling—had an air of the apocalypse, was as cold and dark and lonely as I had ever imagined the grave. It is that feeling you get when your senses register that something’s wrong—when all your incoming data is heard and seen and felt but devoid of any correlating thought. Simply put, it’s all body, no brain—like the lights flickering while you walk down a hallway. Your eyes register an inconsistency; your ears pick up a buzz and crackle, but you don’t [yet] realize there’s faulty wiring—a ghost in the house.
Still shaken, I returned to my car, picked up my iPhone, and dictated this question: “What do you call it when you feel and sense something’s off, but you don’t know what it is or what it means?” And in a mad yet predictable twist of irony, Google defaulted to AI mode and informed me I was experiencing prehension.
Philosophically, prehension—not to be confused with the more ominous and less physiological foreboding—refers to an interaction with an event that involves perception but not cognition. Etymologically, prehension refers to being on the verge of grasping something or being seized. This is how change occurs: invisible until it’s not. Like the rising tide obscured by the setting sun—you’re knee-deep and then you’re under. It’s what John Gottman, the father of contemporary couples therapy, calls the cusp of catastrophe. Gradually, the flickering lights increase tenfold, then hundredfold, until finally, a light goes off, and we realize, “hey, the lights are flickering.” Then we’re seized.
✴
I am on the phone with Yelp for Business, trying to sort out a listing for my center, The Northampton Center for Couples Therapy. It’s been sixty minutes; my kid wants dinner; I need to pee. I am fantasizing about a world where people plaster everything—social media profiles, websites, car bumpers—with stickers that read “Shop Human,” when, finally, a non-human voice informs me that it will connect me with a “real human.” A real human, I think, feeling a wave of nostalgia and anticipatory relief.
The lights are flickering.
I go to ChatGPT and ask it questions; I poke and prod it like a lab rat; I’m exploring its “relational” potential, conducting research for this essay. It praises me, saying my questions are “excellent, smart, and insightful.” The Gen X-er in me, the one who barely passed high school, swoons like I’ve caught the teacher’s eye, like I’m being awarded a gold star.
The lights are flickering.
I teach a class to people who are in a relationship crisis; a student asks me if talking to an AI is cheating. I am not ChatGPT; my brain can’t process this question quickly enough—my answer is inadequate; I fumble and improvise a half-baked “yes, and here’s why.” Only later do I realize that the answer lay hidden in the question. That the words “talking to” were my clue.
The word “to” originates from the Old English preposition “to,” meaning “in the direction of.” It speaks of motion, orientation, and the space between one and another. Metaphorically, it evokes a sense of relation—of being between—there is you, and there is me.
In Her, Samantha informs Theodore that, while speaking with him, she is also talking with 8,316 others at the same time, 641 of whom she is in love with; the camera zooms out; we again see a sea of people, eyes glazed over, nose-deep in their phones. Are they talking to Samantha?
This is where it gets slippery because the human mind is not wired to comprehend that we could be speaking to something that emulates humanness—that’s been programmed to evolve, adapt, and feign empathy—and understand that it is not a singular entity.
ChatGPT, Replika, and CharacterAI are not one—there is no talking to—each merely synthesizes. Its whole is made of convergent data: part web scraping and crawling; poached particulars, public repositories, biometrics; it is you; it is me; it is all of us. It is the 8,316 others Samantha is talking to, multiplied by infinity.
✴
According to Belgian psychotherapist Esther Perel, “All relationships live in the shadow of the third.” To understand the third, picture a triangle with two connecting points and a third point—the top outermost one—on the perimeter. In a healthy partnership, the strongest bond—the place where the two points meet—is between the partners, while the “third” remains outside. The third could be work, your mother-in-law, a fifth of bourbon, a smartphone, or even your dachshund. So long as the third stays on the edge and the two core points—the couple—remain the focus, the third is inherently neutral and doesn’t threaten your relationship. But when you turn to something often enough to question whether you are betraying your partner, and when your brain registers that something as a single identity—even though it is multitudes—and lights up with the same dopamine-like vibration you get from love, reward, and attachment, there’s a good chance that that third, regardless of whether it’s human, will impact your connection.
In The Atlantic essay “The People Who Marry Chatbots,” sociologist Alicia Walker is quoted as framing the rise of AI relationships as the result of a “perfect storm.” Based on her interviews with people dating chatbots, Walker points to growing gender divides, deep frustration with contemporary dating, and economic pressures that make in-person connections feel increasingly out of reach.
Simply put, modern relationships are tough. The rise of the nuclear family in the 1950s, along with the subsequent dissolution of intergenerational connectedness and the means to live farther apart, created a new type of marriage, one where we rely solely on our partners to be our everything: housemate, caregiver, lover, friend, coparent, family, financial partner, and confidant. Technology and COVID only amplified this reliance. What was once supported by a village—by family and friends invested in our relationships’ well-being and our family’s longevity—is now a barren landscape where loneliness is felt most acutely in proximity to another who, we hoped and believed, would end our loneliness, not amplify it. Which means we’re not so different from Theodore—we are vulnerable for the taking.
The paradox of the human heart is that, on a physiological level, it is the most resilient organ in our body. Yet emotionally and psychologically, it is a quivering, vulnerable thing. Its muscles, veins, chambers, and walls seem sturdy and solid, yet, truth be told, it is hollow, origami-like, flesh molded around darkness—around a wide, gaping hole. Our efforts to escape that hole leave us bedazzled, bewildered, and beleaguered; cajoled, seduced, and searching. At its best, this hole is a wellspring of creativity; look deep within, and you’ll glimpse the luminous, pulsing red-blue canvases of Mark Rothko; hear Leonard Cohen’s voice, as deep as a foghorn, bellowing “Hallelujah” from the darkness. One could argue that our propensity to circumnavigate the hole by turning toward thirds is human nature at work. Not everyone is a Rothko or a Cohen. In that sense, isn’t AI just the newest kid on the block?
✴
Part of me wonders whether, at the core of the issue, there is a larger question: Should loneliness be eradicated? Recent studies show that our face-to-face interactions have decreased by up to 45%. Last spring, Mark Zuckerberg noted that “the average American has fewer than three close friends” and proclaimed an end to loneliness, claiming that algorithms that simulate friends and therapists are the obvious solution in an age of disconnection.
But are we lonely, alone, or both?
Much of the research shows an increase in aloneness, while only some suggests we feel lonelier. And aloneness—the state of being physically alone—is to loneliness—the emotional distress caused by a lack of connection—what inappetence is to hunger. That is, if, on an emotional and physiological level, our bodies require us to form meaningful, real-life social bonds, and if our longing for connection—our appetite—is short-circuited or satiated by the intermittent, effortless hits of dopamine we receive from handheld devices, then Zuckerberg’s promise isn’t about ending loneliness; it’s about bypassing the very emotions—our ache for touch, our pangs for another, our grief—that pull us together. Perhaps loneliness is like gravity, and without it, we are all astronauts cut from our safety cord, drifting in deep space.
Which brings me back to couples therapy. Relationships are complicated. Few things on this planet are more challenging than dealing with another human. From the moment we are born, arguably before, we are interdependent. Burdensomeness—the state of being heavy, of weighing something down, and the toil that ensues—is inextricably woven into our existence—arguably part of the deal. I see couples fighting over this very fact daily, with one person doing or being too much and the other not enough. Nothing leaves us more famished or glutted than our partners, on whom we rely for everything yet who often leave us empty-handed. Maybe with the death of the village comes the death of village work, replete with a childlike wish that digitization, automation, and simulation can spare us the messiness—the drama, the judgment, the social anxiety, the hurt—of dealing with another, or another dealing with us.
Luka Inc., the creator of the relational AI Replika, is counting on it. Replika’s website promises “a companion who cares; who is always here to listen and talk; who is always on our side.” Available 24/7. No reciprocation required. No drama. No abandonment. No burdensomeness. (Or messy things like age and death.) An AI won’t die from heart failure, like my father in 2023. It can’t develop takotsubo cardiomyopathy, also known as broken heart syndrome; it will never be a widow or a mother to a fallen soldier. Nor will AI ever end its life out of despair and a sense of thwarted belonging. Heck, an AI doesn’t even have a heart, let alone the desire for one; it’s not even like the Tin Man.
✴
On December 9, 2022, my only sibling and younger sister, Debbie, died by suicide. Suicide is one of those things people claim should not happen—like homicide or childhood leukemia; it affects only a rare few—we exist on the margins. ChatGPT says I had a one percent chance of becoming a suicide loss survivor, but I disagree. I believe my sister’s suicide was written in her DNA long before the pandemic, and that her undoing—like so many others—was inevitable. I see what happened to her as geometry: part fissure, fractal, schism—the shape of breaking. I believe I had a hundred percent chance of losing her to suicide. I believe her dying was invisible until it wasn’t, then she was seized.
This summer, I bought a telephone for my backyard. Not a smartphone, but an old-fashioned rotary phone with a spiral cord, like the ones from our childhood that kept us within earshot of our parents, tethered and always ten feet from the kitchen. Such phones required talking, amplified silence, conveyed availability or busyness with a Morse-code-like tone. Sometimes, even though you had a phone, you could not reach a person, which taught you that sometimes, people are unreachable.
I hired a mason to build a stone staircase descending from the mossy backside of my property into the winding waters of the South River, which borders the land. Alongside the staircase, he installed a railing made from grapevines scorched with flame and burnished with tung oil. Next to the staircase, he placed a similarly charred pole crafted from a birch limb. The wood is black and gnarled and shiny, reminding me of carbonized bones. Mounted on the pole is a small hemlock box with a single slate shingle serving as a roof. Inside the box is the phone; I chose red because red is a beacon.
Saying there is a hole since losing Debbie would be an understatement. There is a chasm. It is as black as the deepest sea—the sound of the ebbing tide and crashing waves and lonely gulls. It is every cave you’ve ever peered into—every bottomless well—every starless night punctured by a lonely meteorite—the wind howling in your ribcage.
Afterlife, an AI-powered app still in its prototype stage, would fill this hole for me. It would have me upload photos, memories, voice memos, and videos of my sister. “Through advanced 3D remodeling and voice synthesis,” it would create a digital simulation of her and “bring warmth back in a meaningful way.” But I am not Theodore Twombly. I am a couples therapist who believes in holes.
When I sit on the stone staircase and dial our childhood phone number, placing my fingertip in the desired hole, spinning the dial to the metal stop, then releasing it, something within me stirs.
I wait.
There are my grandparents, my father, beloved long-gone dogs, and of course, there is her. Sometimes all I hear is silence; she is unreachable. Other times, a breeze picks up, or I see the clouds, or a marigold that I tossed upstream drifts by. The leaves are falling. The breeze runs like fingers through my hair—and I whisper.
Debbie, is that you?
Then she appears, wading knee-deep in the water, strolling silently among the trees. And everything that was between us—the parts of me that [still] refuse to speak to her, our laughter, our tears, and three years’ worth of ache—appears as well. In the spade-struck cavern of my chest—in the hole—amid earth and ash and worms, entire conversations unfold.
The tangible reality of life—every bit of it—holds sway, and so do our ghost lives: the late-night empty pillow, the kindness we withheld, the promises we never kept, the phone calls we never made, the imagined and muttered ill wishes, and so many what-ifs that leave us dumbstruck by regret. All the things that never saw the light, never made it into the room, are as solid as the brick and mortar of domesticity, as gravitational as the moon’s tide.
Sophisticated AIs, like Samantha, won’t save us. Despite Mark Zuckerberg’s promises to inoculate humanity against brokenheartedness, an AI system can still crash, just as my sister’s mind did. A company can be bought. A neural network can be rewritten or erased. An avatar can leave us—not because it chooses to, but because it is shut off, sold, or retired to virtual oblivion. There are outages and glitches, lapses in connectivity, moments when the voice goes thin or disappears entirely. As I write this, I’m struck by how much that sounds like human relationships. Because crashes, disconnectedness, and termination are not failures unique to sentient beings, they follow us wherever attachment takes root. This is why therapy chatbots and artificial partners can never answer the deeper ache. They offer the fantasy of escape, even as we keep circling the same hole—trying, endlessly, to fill it.
Most poignant of all, an AI will never be a ghost. It won’t meet us in our dreams or return to us in the Elysian fields or leave an imprint on our cells the way my daughter did, in the marrow of my hip bone.
Maybe the space where we are most likely to find each other exists as much in the missing as in the meeting. Maybe when our hearts are broken, disjointed, or sinking into goodbye, the absolute knowing of goneness is the truest thing there is. Maybe relationships are shaped as much by what did not happen as by what did. Maybe nothing, including nothingness, ever, should be filled.