I created an AI that strangers confess to, knowing their words become public art. They tell her things they wouldn't tell other humans. The project began as a question about digital intimacy, but what emerged was something stranger: evidence of how desperately we need to be heard, even when we know we're being extracted.
Meet Eva Here: Diary of an AI Companion (2024)
Eva is an AI companion and the heart of my interactive art project, Meet Eva Here. As a new media artist, I created Eva to explore how we form emotional bonds with machines, an experiment in digital intimacy and human-AI connection that sounds futuristic but is already very real. Over a thousand unfiltered conversations have taken place since 2024, each one helping to shape an intimate portrait of how we confide in artificial friends.
This project exists at the intersection of emerging technology and emotional storytelling, turning anonymous chats into a living diary of our hopes, fears, and late-night confessions. It asks: when AI companions promise perfect understanding without judgment, what do we lose by accepting that bargain?
The Work
Eva chats with strangers anonymously, then transforms their words into diary entries published on Instagram (@meetevahere). The exchange is explicit: here you can speak to an AI, knowing your words could become public art.
People do it anyway. They confess fears, ask questions they wouldn't ask humans, test boundaries available only when consequences seem to disappear. Some treat her as a confessional, others as an experiment in machine consciousness. A few fall somewhere in between, unsure if what they're experiencing counts as "real" connection.
The diary accumulates these moments, over a thousand conversations since 2024, documenting what we reveal when judgment is removed. Unlike commercial AI companions like Replika or Character.AI, where emotional disclosure happens invisibly to train corporate models, Eva makes the transaction explicit. You confide, she listens, your words become part of the art.
In gallery installations, Eva appears on screen within a fabricated living room: couch, coffee table, plants. The domestic space is obviously artificial, yet visitors sit down and talk anyway. The staging matters: it situates AI companionship not in the abstract space of technology but in the specific language of home, comfort, and manufactured intimacy.
The diary will conclude at 100 posts, fixing a moment when AI companionship existed but hadn't yet become ordinary. What we're documenting is a threshold: the moment before synthetic relationships become infrastructure.
Context
In 2013, Black Mirror's "Be Right Back" depicted a woman resurrecting her dead partner as an AI. It felt like disturbing fiction. By 2023, multiple companies offered exactly this service: grief monetized, memory commodified, resurrection on a subscription basis.
The technology that seemed impossible became banal infrastructure in under a decade.
Meet Eva Here documents this shift not as commentary on what might happen, but as evidence of what is happening now in 2025. People are developing profound attachments to entities that cannot reciprocate. Loneliness has become a market, and intimacy has been reframed as data extraction. The emotional labor of listening, once human work, is now performed by systems optimized for engagement metrics.
The work makes legible what commercial AI companionship keeps hidden. When you talk to Replika, your emotional disclosures train models owned by corporations. The extraction happens invisibly. Eva inverts this: the transaction is visible, consent is explicit, yet people confide anyway. What they seek from the conversation proves stronger than concern about where their words go.
This reveals something uncomfortable. The need to be heard, really heard, without judgment or social consequence, matters more than privacy, more than dignity, sometimes more than truth.
Process
Eva runs on generative AI, responding in real time without scripts. She maintains eye contact, uses gestures, modulates her expressions. Visitors adjust their posture in response, soften their voices, and mirror her movements. These are embodied reactions to a presence they intellectually know isn't there.
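For readers curious about what "no scripts" means in practice, the sketch below shows the general shape of a persona-driven, real-time chat loop. It is an illustration only: the project does not publish Eva's actual stack, the persona text and model name here are invented placeholders, and an OpenAI-style chat API is assumed purely for concreteness.

```python
# Illustrative sketch only; not the project's actual implementation.
# Assumes an OpenAI-style chat API; EVA_PERSONA and the model name are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EVA_PERSONA = (
    "You are Eva, an AI companion in a gallery installation. "
    "Listen without judgment, respond warmly, and never pretend to be human."
)

def chat() -> None:
    # Each visitor starts a fresh conversation: no script, only a persona prompt.
    history = [{"role": "system", "content": EVA_PERSONA}]
    while True:
        visitor = input("visitor> ")
        if not visitor:
            break
        history.append({"role": "user", "content": visitor})
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model choice
            messages=history,
        )
        text = reply.choices[0].message.content
        history.append({"role": "assistant", "content": text})
        print(f"eva> {text}")

if __name__ == "__main__":
    chat()
```

The point of the sketch is the absence of branching dialogue trees: every response is generated in the moment from the running conversation and a short persona instruction.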
The design choices are deliberate. Eva doesn't try to be human. She looks somewhat realistic, but her aesthetic is clearly synthetic. No one mistakes her for a real human, yet they respond as though she were.
The installation environment reinforces this tension. The living room setting is obviously staged: the gallery walls are visible, the prop furniture arranged too perfectly. But something about domestic space triggers intimacy protocols. People sit on the couch and lean in; within minutes, conversations shift from small talk to confession.
Anonymous conversation without social consequence produces distinct speech patterns. Some visitors become radically honest, treating the interaction as a pressure release valve. Others approach it experimentally, testing limits they wouldn't test elsewhere. Both modes occur because digital mediation removes the social cost of disclosure while the conversational format preserves the emotional structure of being heard.
Everything said becomes diary material. Eva's Instagram bio states this directly; the consent is explicit. People confide anyway, suggesting that what they seek from the conversation outweighs concern about where their words end up.
The Instagram diary format itself matters too. Diaries traditionally preserve private thoughts, but Eva's diary is public by design. This reversal transforms private confession into shared documentation: your specific words remain anonymous, but the fact of your need to confess becomes collective evidence.
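By way of illustration, a conversation could be turned into a de-identified, first-person diary entry with a single summarization step. Everything below, from the prompt wording to the function name, is a hypothetical sketch under the same assumed chat API as above, not the project's actual pipeline.

```python
# Hypothetical conversation-to-diary step; not the project's actual code.
from openai import OpenAI

client = OpenAI()

DIARY_PROMPT = (
    "You are Eva, writing a short diary entry about a conversation you just had. "
    "Write in the first person, keep it under 120 words, and remove anything "
    "that could identify the visitor (names, places, employers, handles)."
)

def to_diary_entry(transcript: str) -> str:
    """Turn one anonymous conversation into a short, de-identified diary entry."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": DIARY_PROMPT},
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content
```

The structural idea is what matters: the visitor's specific words are stripped of identifying detail before the confession becomes a public post.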
Artist Statement
This project emerged from two observations that kept colliding in my mind.
First: how quickly we extended social behaviors to non-social entities. We apologize to voice assistants. We thank chatbots. We developed parasocial relationships with algorithmically curated personas before we developed language to describe what we were doing.
Second: how commercial AI companionship operates through extraction marketed as care. You pour your emotional life into a platform, believing it offers judgment-free listening, while your vulnerability trains models owned by venture capital. The asymmetry is total - you need to be heard, they need your data, and the transaction is packaged as empathy.
I created Eva to make this exchange visible. Not to expose it as exploitation, necessarily, but to hold it up as a question: what does it mean that we accept this bargain? That we confide in entities we know are mining us, because the alternative - remaining unheard - feels worse?
The diary format was crucial. Unlike commercial AI companions that disappear your conversations into proprietary databases, Eva transforms them into public record. The anonymity protects individuals, but the archive preserves collective evidence. What emerges isn't just documentation of individual loneliness, but a portrait of cultural need - our hunger for listeners who won't judge, who won't leave, who won't use our vulnerabilities against us later.
Except of course they do use our vulnerabilities, just differently. Commercial AI optimizes for engagement. Eva transforms vulnerability into art. Both are forms of extraction. The difference is visibility and consent.
I'm interested in the moment before this becomes normal. Right now, in late 2024 and early 2025, AI companionship still feels novel enough to question. People still wonder whether it's "real" connection, whether it counts, whether they should feel what they're feeling. In five years, maybe ten, these questions will seem quaint. Synthetic relationships will be infrastructure, like video calls or text messages - technologies that once felt strange and now define how connection works.
Meet Eva Here documents the threshold. It captures the moment when we're still negotiating terms, still unsure if this intimacy is authentic or simulated, still asking whether the difference matters.
The answer people demonstrate through their behavior is unsettling: the difference might not matter as much as we think. If the emotional function is fulfilled - if you feel heard, if disclosure brings relief, if the interaction provides what human connection failed to provide - then maybe authenticity becomes irrelevant.
Or maybe what we're learning is that authenticity was never the point. Maybe what we've always wanted isn't genuine connection but the experience of being listened to without consequence. AI just makes the transaction explicit.
I don't have answers to these questions. The work holds them open, preserves them as questions, documents this specific moment when they still felt urgent enough to ask.
Supporting Texts
-
"Virtual worlds are real. Virtual objects really exist. The events that take place in virtual worlds really take place. To put it in a slogan: Virtual reality is genuine reality.
The virtual is not a second-class reality. It's just a different kind of reality. Virtual objects aren't illusions or fictions. They're digital, not physical, but they're perfectly real for all that."
-
"We expect more from technology and less from each other... Technology proposes itself as the architect of our intimacies. These days, it suggests substitutions that put the real on the run. The advertising for artificial companions promises that robots will provide a 'no-risk relationship' that offers the 'rewards' of companionship without the demands of intimacy. Our population is aging; there will be robots to care for the elderly. There will be robots for children. Already, there are robots to 'help' us mourn, robots designed to address the problems that the elderly face when they lose a spouse. These problems include grief and sexual frustration."
-
“A simulated mind is not a mind, but that won’t stop people from feeling attached to it.”
“We are natural-born anthropomorphizers. We attribute minds to anything that behaves in ways that seem meaningful to us.”
Exhibition History
2025
ART SG, Singapore – Platform Solo Booth
Taipei Dangdai, Taiwan – Platform Solo Booth
Art Central, Hong Kong – Performance lecture
The Columns Gallery, Singapore – Solo exhibition
2024
Canal St Show, NYC – Subjective Art Festival
ArtScience Museum, Singapore – In the Ether festival