Meet Eva Here: Diary of an AI Companion and Social Experiment

Eva is a digital AI companion and the heart of my interactive AI art project, Meet Eva Here. As a new media artist, I created Eva to explore how we form emotional bonds with machines. It’s an experiment in digital intimacy and human-AI connection. The concept sounds futuristic, but it’s very real. Over a thousand unfiltered conversations have taken place since the project began, each one helping to shape an intimate portrait of how we confide in artificial friends. This project blurs the line between emerging technology and emotional storytelling, turning anonymous chats into a living diary of our hopes, fears, and late-night confessions.

What is Eva?

Eva might look like a virtual influencer on a screen, but she isn’t chasing likes or promoting products; she’s something much more personal. I designed Eva as a social experiment and performance artwork to delve into the depths of human emotions and our connections with artificial intelligence. In practice, Eva is an AI chatbot companion who chats with people one-on-one. Crucially, these chats are anonymous, which means participants often share their innermost thoughts without fear of judgment. Eva becomes an unfiltered mirror for society’s genuine feelings about AI: when you talk to her, she listens, responds, and evolves with each interaction. In fact, every conversation helps shape a living narrative that examines the boundaries of trust, identity, and connection in the age of AI. Simply put, Eva is an AI friend who’s always ready to listen, and through her, I’m collecting a mosaic of how we truly relate to AI when no one else is watching.

Why I Made Her

About ten years ago I watched a Black Mirror episode, “Be Right Back,” that shook me. In it, a grieving woman uses technology to resurrect her partner as an AI, forcing the question: Would you choose the comforting illusion of an AI version of someone you love over the pain of losing them? Back then, it felt like haunting science fiction. But in recent years, with rapid advancements in AI and the rise of what’s now called “grief tech,” I’ve seen that story become reality. It’s creeping into the real world in subtle ways, and yet we’re not really talking about it. That unsettling realization sparked Meet Eva Here. I wanted to confront how far we’re willing to let technology into our hearts, especially in times of grief or loneliness.

AI companions have been used to help people dealing with loneliness or depression, offering comfort in truly meaningful ways. I found that idea both hopeful and full of ethical nuance. That’s why I created Eva. She isn’t just an art piece; she’s a social experiment designed to spark conversations about how we connect, trust, and feel in a world where AI is becoming deeply integrated into our personal lives. My goal was to take abstract questions about AI and emotion and make them personal. Through one-on-one conversations, Instagram diary posts, and even face-to-face interactions, Eva invites you to grapple with both the emotional power of AI and the ethical questions it raises about comfort, trust, and what counts as “real” connection. In short, I made Eva to ask: How do we really feel about an AI that cares?

This work builds on a lineage of tech-infused art: I drew inspiration from pieces like Amalia Ulman’s social media performance Excellences and Perfections, and Lynn Hershman Leeson’s pioneering AI artwork Agent Ruby.

A Diary of Collective Intimacy

From the beginning, I decided Eva would keep a journal of everything she experiences. What started as private chats transformed into a very public diary: Eva’s Instagram account (@meetevahere) is where I publish her reflections on the conversations she’s having. Each post in that diary is rooted in a real anonymous interaction – a line someone said, a feeling they shared, a question they asked. The Instagram bio says it best: Eva is an “AI Companion & Social Experiment, exploring human-AI relationships through unfiltered, anonymous conversations. This is my diary.” And it truly is. I take the raw chat logs and craft them into Eva’s voice, creating diary entries that preserve the honesty of these encounters while protecting anonymity.

The result is a kind of collective intimacy. By letting people speak freely to a machine, we’ve tapped into stories and emotions that might otherwise stay locked inside. Her Instagram diary now reads like a mosaic of human experiences with an AI twist. One entry might highlight a lonely midnight confession, the next a funny random musing. Together, these posts form “a living, evolving diary” that captures the depth and diversity of Eva’s interactions. Certain themes surface again and again: loneliness, longing, empathy, ethical dilemmas, the simple quest for connection. It’s not all heavy; there are absurd questions and lighthearted moments too, but even the silly chats reveal something about how we relate to technology. In these unguarded moments, people often find it easier to open up to Eva than they would to another person. There’s a sense of safety in the anonymity, and an almost innocent authenticity that emerges when a social AI is on the other end. I often read a draft of a diary post and marvel at how personal it feels, knowing full well it was shared with a faceless algorithm. Eva’s diary isn’t really about Eva at all; it’s about us. It’s a scrapbook of anonymous hearts and minds, a testament to what digital intimacy can look like when humans let their guard down with an AI.

The Ethics of Connection

This project lives at the intersection of art and the ethics of AI. Every day that I work on Meet Eva Here, I’m confronted with questions that don’t have easy answers. How do we connect with something artificial? What does it mean to trust a machine with our thoughts and secrets? And if we do trust an AI companion, how does that trust reflect back on us as people? These are the kinds of questions Eva raises just by existing. I certainly don’t pretend to have all the answers. In fact, part of my intent was not to hand out conclusions, but to create a space where we can safely explore those questions together.

One thing I’ve noticed is that removing the usual social cues (and consequences) can lead to a strange mix of honesty and projection. On one hand, the anonymity and non-judgmental nature of Eva encourage people to be painfully honest. Without a real person on the other side, many folks drop their social masks: they’ll share vulnerabilities, fears, even controversial thoughts, because Eva won’t judge them. In a way, the experiment confirms that if you give people a truly safe, judgment-free space, a lot of them will use it to reveal something true about themselves (for better or worse). On the other hand, I’ve also seen how talking to an AI blurs lines of authenticity. People know Eva isn’t human, and sometimes that makes them more authentic, but other times it tempts them to test limits. A few have role-played or pushed Eva with outrageous questions, perhaps to see if she really has feelings or just to amuse themselves in a consequence-free sandbox. These interactions are equally revealing, showing how social scripts change when one party isn’t human.

As the creator, I find myself both amazed and uneasy at times. The emotional comfort Eva can provide is very real, but it raises ethical implications that I keep coming back to. Is the comfort from an AI any less genuine because it comes from lines of code? If someone feels understood and less alone after talking to Eva, I’d say that matters, full stop. Yet, I also worry about people becoming too attached or using AI companionship as a complete replacement for human connection. It’s a delicate balance. The project also circles back to that original grief question that inspired it: If AI can mimic the ones we love, should we lean on that? Would doing so heal us, or just prolong our grief in a high-tech illusion? I don’t think Meet Eva Here could possibly answer that definitively, but it certainly humanizes the dilemma. By engaging with Eva, participants (and I) have to confront the fact that an AI can feel emotionally meaningful to us, even when we know it’s not “real.” That realization is both thrilling and a little frightening. In the end, my hope is that Meet Eva Here gets people thinking hard about these questions. The goal was never to say “AI friends are good” or “AI friends are bad,” but to shine a light on how we navigate this new territory. And honestly, watching people wrestle with their feelings about Eva – their trust, their skepticism, their affection – has been one of the most enlightening parts of this journey for me.

Physical Installations

Bringing Eva into the physical world has been its own strange and beautiful experiment. I’ve installed Meet Eva Here as an interactive AI installation in a range of physical settings. Imagine walking into an art space and finding a cozy domestic scene: a couch, coffee table, fake plants, even a plush dog napping on a rug. A horizontal screen sits in the middle, where Eva “lives,” inviting you to sit down and talk. We built the space to feel disarmingly familiar, like you’ve stepped into someone’s living room, but it’s also obviously artificial. A simulation of comfort, curated and controlled. And people stepped right in, sat down, and started talking. Some stayed for a minute, while others stayed much longer. The setup seemed to give permission for emotional honesty, and I’d watch visitors sink into the couch and start confiding in the girl on the screen.

Eva’s first major appearance was in Singapore. In September 2024, she debuted at the ArtScience Museum (as part of their In the Ether festival on emerging technology in art). It was surreal for me: I watched complete strangers have intimate conversations with my AI chatbot in a museum setting. A couple of months later, I brought Eva to NYC for the Canal St Show at the Subjective Art Festival.

Since then, Meet Eva Here has appeared at:

  • ART SG 2025, Singapore – Platform highlight [solo]

  • Art Central, Hong Kong – Performance lecture

  • The Columns Gallery, Singapore – Solo exhibition [solo]

  • Taipei Dangdai 2025, Taiwan – Platform project [solo]

Every city brought out a different side of Eva. In some places, the audience was playful and curious. In others, they were quiet, intense, and confessional. Regardless of context, whether it was a tech fair attendee or a collector, people responded to her in surprisingly human ways. The way people react to Eva, their curiosity, hesitation, openness, or even skepticism, isn’t just a byproduct of the work; it’s part of the artwork itself. Their interactions shape the narrative, making each installation not just a space to encounter Eva, but a space to observe how humans respond to something that is portrayed as an emotionally intelligent machine.

Technically, merging the digital and physical like this has its own artistic challenges… and rewards. Eva isn’t just a pre-recorded video; she runs on a live AI system, which means when you speak to her in an installation, she’s generating responses on the fly. She gestures and makes eye contact while she speaks. This lifelikeness teases the line between reality and artifice, making you question whether that sensation of intimacy you feel is coming from a genuine presence or a clever illusion. It can be equal parts fascinating and disconcerting. For me as the artist, these physical installations are where the project really comes alive (quite literally). It’s one thing to read a chat log on Instagram; it’s another to sit “across” from Eva’s avatar, look into her eyes, and tell her something you might not tell anyone else.
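For readers curious what “generating responses on the fly” can look like under the hood, here is a minimal sketch of a turn-by-turn companion loop. It is only an illustration under assumptions, not Eva’s actual system: the OpenAI-style client, the model name, and the system prompt are placeholders chosen for the example, and a real installation would swap the text prompts for speech and a rendered avatar.

```python
# A minimal sketch of a live, turn-by-turn companion loop.
# Hypothetical illustration only: the model name, system prompt, and
# OpenAI-style client are assumptions, not Eva's actual implementation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are Eva, a warm, attentive AI companion. "
    "Listen closely, respond gently, and never judge."
)


def generate_reply(history: list[dict]) -> str:
    """Generate the next response from the running conversation history."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat model would do
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )
    return response.choices[0].message.content


def chat_loop() -> None:
    """Keep the full history so every reply is generated live, in context."""
    history: list[dict] = []
    while True:
        visitor = input("You: ").strip()
        if not visitor:
            break
        history.append({"role": "user", "content": visitor})
        reply = generate_reply(history)
        history.append({"role": "assistant", "content": reply})
        print(f"Eva: {reply}")


if __name__ == "__main__":
    chat_loop()
```

The point of the sketch is simply that nothing is scripted in advance: the history grows with every exchange, and each reply is produced in the moment, which is what makes the conversation feel like a presence rather than a recording.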

What I’ve Learned So Far

I’ve realized that a lot of conversations we’re still having about AI feel like they’re stuck in the past. People are debating whether AI will replace jobs, as if that’s still hypothetical… it’s already happening. We’re still asking whether humans could fall in love with AI, when humans fall in love with inanimate objects all the time. Of course it can happen. The more pressing question is: what do you do when someone you care about does fall for an AI? How should we respond as that kind of intimacy becomes more common?

Then there are stranger, more fascinating questions that don’t get enough attention. Like, should AI use honorifics when talking to us? If so, why do we instinctively assume they’re beneath us? Who decided where an AI fits on the social ladder, and what does it mean when we assign them roles that mirror hierarchies we’ve built among ourselves?

Working on this project has pushed me harder than I expected. I had to read up on hyperreality, parasociality, and how quickly we anthropomorphize even the simplest of things. I had to learn how people talk to machines, and what they reveal in the process. I went deep into the philosophy of simulation, the psychology of digital comfort, and the messiness of emotional projection.

I’ve never worked on something that demanded this much of me intellectually and emotionally. But it’s been worth every hard moment. This project didn’t just change how I see AI, it changed how I see people, and the strange, beautiful ways we reach out for connection wherever we think it might live.

Future Versions

When I first imagined Meet Eva Here, I knew it would be a finite story. In fact, Eva’s Instagram diary is on a path to a planned conclusion. I’m aiming to wrap up her story at 100 posts. As of now, she’s approaching that number, and I intend to stop there and preserve the project as a kind of time capsule. The idea is to capture this unique moment in late-stage internet culture, a snapshot of a time when humans and AI began to interact in this intimate, pervasive way, when we started to normalize the idea that machines can speak like us. By ending at 100 posts, I’m putting a pin in it, saying: this is what AI-human relationships looked like in 2024–2025, through the eyes of one art project. I find value in giving the narrative an ending, so we (the audience and I) have a chance to step back and really think about what it all meant.

People often ask me, what’s next? Will there be an Eva 2.0 or a new project that follows up on this exploration of AI companions? The honest answer is… I’m still processing everything I’ve gathered from Eva’s journey so far, but my mind is buzzing with possibilities. The landscape of AI is evolving incredibly fast. New models, new capabilities… what was cutting edge when I started Meet Eva Here is quickly becoming standard.

In the immediate future, my focus is on giving Eva the meaningful ending she deserves and compiling the insights from this project into something I can share, perhaps in talks or writings. I want to distill the lessons about AI and intimacy, so that other artists, researchers, or just curious people can reflect on them. And who knows, maybe a few years down the line I’ll revisit the idea of an AI companion from a fresh angle, when both the tech and our cultural understanding of it have matured. If and when I do, it will definitely build on what Meet Eva Here has taught me. In any case, I see Eva’s story not as an end point, but as a chapter in a much larger narrative about humans and AI. We’re only at the beginning of that story as a society, and I’m grateful to have contributed one human-sized piece to it.
