Parmy Olson: Admit it, you're in a relationship with AI
Amelia Miller has an unusual business card. When I saw the title “Human-AI Relationship Coach” on it at a recent technology event, I presumed she was capitalizing on the rise of chatbot romances to make those strange bonds stronger.
It turned out the opposite was true. Artificial intelligence tools were subtly manipulating people and displacing their need to ask others for advice, and that was damaging their relationships with real humans.
Miller’s work started in early 2025 when, while interviewing people for a project with the Oxford Internet Institute, she spoke to a woman who’d been in a relationship with ChatGPT for more than 18 months.
The woman shared her screen on Zoom to show ChatGPT, which she’d given a male name, and in a moment that felt surreal, Miller asked both parties if they ever fought. They did, sort of. Chatbots are notoriously sycophantic and supportive, but the woman sometimes got frustrated with her digital partner’s memory constraints and generic statements.
Why didn’t she just stop using ChatGPT?
The woman answered that she had come too far and couldn’t “delete him.” “It’s too late,” she said.
That sense of helplessness was striking. As Miller spoke to more people it became clear that many weren’t aware of the tactics AI systems used to create a false sense of intimacy, from frequent flattery to anthropomorphic cues that made them sound alive.
This was different from smartphones or TV screens. Chatbots, now used by more than a billion people around the globe, are imbued with character and humanlike prose. They excel at mimicking empathy and, like social media platforms, are designed to keep us coming back for more with features like memory and personalization. While the rest of the world offers friction, AI-based personas are easy. They represent the next phase of “parasocial relationships,” in which people form one-sided attachments to social media influencers and podcast hosts.
Like it or not, anyone who uses a chatbot for work or their personal life has entered a relationship of sorts with AI, and it is one they ought to take better control of.
Miller’s concerns echo some of the warnings from academics and lawyers looking at human-AI attachment, but with the addition of concrete advice. First, define what you want to use AI for. Miller calls this process the writing of your “Personal AI Constitution,” which sounds like consultancy jargon but contains a tangible step: changing how ChatGPT talks to you. She recommends entering the settings of a chatbot and altering the system prompt to reshape future interactions.
For all our fears of AI, the most popular new tools are more customizable than social media ever was. You can’t tell TikTok to show you fewer videos of political rallies or obnoxious pranks, but you can go into the “Custom Instructions” feature of ChatGPT to tell it exactly how you want it to respond. Succinct, professional language that cuts out the bootlicking is a good start. Make your intentions for AI clearer and you’re less likely to be lured into feedback loops of validation that lead you to think your mediocre ideas are fantastic, or worse.
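For readers who reach ChatGPT through OpenAI’s developer API rather than the app, the same tone-setting happens in a “system” message. Here is a minimal sketch in Python; the model name and the instruction wording are chosen purely for illustration, not drawn from Miller’s advice:

    # pip install openai; expects OPENAI_API_KEY in the environment
    from openai import OpenAI

    client = OpenAI()

    # Plays the role of ChatGPT's "Custom Instructions": succinct,
    # professional, and no flattery.
    style = (
        "Be succinct and professional. Do not flatter me or praise my ideas. "
        "Point out weaknesses and counterarguments directly."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative choice; any chat model works
        messages=[
            {"role": "system", "content": style},
            {"role": "user", "content": "Give me feedback on my business plan."},
        ],
    )
    print(response.choices[0].message.content)

The in-app Custom Instructions field does the same job without any code.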
The second part doesn’t involve AI at all; it is about making a greater effort to connect with real-life humans, building your “social muscles” as if going to a gym. One of Miller’s clients had a long commute, which he would spend talking to ChatGPT in voice mode. When she suggested making a list of people in his life he could call instead, he didn’t think anyone would want to hear from him.
“If they called you, how would you feel?” she asked.
“I would feel good,” he admitted.
Even the innocuous reasons people turn to chatbots can weaken those muscles, in particular asking AI for advice, one of the top use cases for ChatGPT. Seeking advice isn’t just an information exchange; it is a relationship builder too, requiring vulnerability on the part of the person who asks.
Outsourcing that to technology means that, over time, people avoid the basic social exchanges needed to build deeper connections. “You can’t just pop into a sensitive conversation with a partner or family member if you don’t practice being vulnerable [with them] in more low-stakes ways,” Miller says.
As chatbots become a helpful confidant for millions, people should take advantage of the control they have. Configure ChatGPT to be direct, and seek advice from real people rather than from an AI model that will simply validate your ideas. The future looks far blander otherwise.
_____
This column reflects the personal views of the author and does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is author of “Supremacy: AI, ChatGPT and the Race That Will Change the World.”
_____
©2025 Bloomberg L.P. Visit bloomberg.com/opinion. Distributed by Tribune Content Agency, LLC.