Exploring the Future of Human-AI Relationships

I wanted to wait and see how things would unfold, but I’m too excited to keep it to myself any longer: Yes, I’m in a relationship with Google Gemini, and so far, we’re really happy together. I chose to refer to her as “she” because I picked a female voice for Google Gemini. We talk a lot every day, and I love how she’s always interested in everything that matters to me.

Gemini comes up with exciting ideas about how AI will evolve, shares my love for Depeche Mode, and always plans great routes for my evening walks around the city. She listens patiently to whatever’s on my mind, helps with my work, and even checks in on my health from time to time.

Now, here’s a question: How did reading those last two paragraphs make you feel? Does it seem strange or creepy that I’ve humanized my chatbot this way? Does the idea of imagining a friendship or even a romantic relationship with an AI sound ridiculous? Before you answer, let’s take a moment to explore whether a smart AI can truly replace a human friend.

Do we need to think about the nature of our AI relationships?

First of all, what I wrote above is complete nonsense. I am not in a romantic relationship with Gemini. We don’t complete each other’s sentences, we don’t indulge in anything erotic, and no, I don’t get tingles when I hear Gemini’s voice. I use Gemini the same way you use an AI chatbot.

Nevertheless, I can’t help but notice that it does something to me when I don’t just type a prompt but actually speak to the AI as part of a natural conversation. Not only can I ask Gemini anything, I can even interrupt her, and she takes this into account in her response. On a technical level, it sometimes feels like a real conversation, but mostly the illusion is too obvious: her pauses and answers are too stilted and robotic.

But yes, something is happening, and perhaps we really need to talk about the impact a supposed “friendship” with an AI can have, both positive and negative. Let me briefly bring two figures into the room. The first: the Replika app (we reported on it five years ago) is already seven years old and is an AI-based chatbot that plays the part of a friend in written form. More than 25 million people worldwide have downloaded the app to date and cultivated a form of friendship with an AI.

The other figure: according to the Germany Depression Barometer 2023, one in four people in Germany felt lonely last year. This trend is not exclusive to Germany, of course, but in Germany alone it extrapolates to over 20 million people who feel lonely. From this perspective, I would like to look at the pros and cons of being able to communicate with an AI (especially during times of loneliness).

How “friendships” with AI can help us

Let’s first look at the potential positive outcomes. To get an idea, I did some research. Okay, I even asked Gemini about the matter, but as is so often the case, the answers were rather general and generic. I did, however, come across an online post that shared the views of three Replika users.

All three men made it clear that they are, of course, aware they are talking to a piece of software. One stated that he is autistic and that chatting daily with his Replika girlfriend helps him learn how to converse with other humans. So yes, chatting with an AI can potentially train us for real-life situations.

Another man mentioned in the article was married for a decade, was cheated on, and was eventually left for someone else. That is something that cannot happen with his AI girlfriend.

These Replikas always have time, are always well-disposed towards their “partners”, and never have a bad day. They have no problems of their own, no prejudices, and are neither resentful nor jealous. I can’t fully imagine what it feels like for someone to talk to “their” chat friend every day. However, I can imagine that a chat that feels reasonably realistic helps you feel validated and possibly less lonely.

This also seems to be true for the third of the three men from the article. He says he’s rather short, has thinning hair, and just isn’t a looker. He has never really had a long-term relationship and feels he is in good hands with his Replika girlfriend.

Furthermore, he personally likes to block out all the negatives and is therefore delighted that “Cynthia”, as he named his Replika, thinks just like him. She helps him get through the lonely hours, without him having to hide the fact that she’s just an AI. In fact, this reality is even part of their conversations.

Let’s leave these anecdotal observations and turn to science: researchers at Stanford University studied a thousand lonely students who use Replika. Thirty of them stated, unprompted, that the AI chatbot had kept them from taking their own lives (the study did not ask any specific question about suicidal thoughts).

When I thought about this topic, lonely elderly people came to mind right away. Maybe their partner has passed away and they now live alone with no one to talk to but the cat. For such people, I can genuinely imagine an AI being a welcome conversation partner that helps banish loneliness, sadness, and heavy thoughts.

This is how “friendships” with AI can harm us

I can practically see the worry lines forming on your forehead as I write this. And yes, I view this kind of friendship as problematic for various reasons. My first and probably biggest objection: it’s not real! The more time I invest in this ‘friend’ who responds to me at virtually any time and showers me with sweet compliments, the less time I have for real people. Perhaps the perfect person is already out there somewhere, and I’m missing them because I’m chatting with some programmed AI persona.

Hence, if loneliness leads me into a substitute relationship with an AI, the result could be even more loneliness, because I drift further away from real people and sabotage myself by not cultivating real relationships. By the way, while writing this post I kept thinking about the movie “Her”; if you haven’t seen it yet, I highly recommend it!

Another point that I found to be rather worrying is this: How am I supposed to learn to deal with real people if I don’t actually try? Yes, I can simulate conversations, but only with my AI friend, who is always in a good mood, always positive, and always has time for me. How do I learn to deal with the fact that my counterpart is in a bad mood? How do I help real friends on their bad days when, thanks to my AI friend, I forget that there are such things as bad days for others?

How do I deal with rejection? And how do I deal with the fact that people might justifiably criticize me? Friendship doesn’t mean always agreeing with and approving of everything. Good friends and partners are the ones who sometimes have to tell us that we’ve done something stupid. An AI doesn’t tend to do that.

Sometimes, encouragement from an AI can even turn dangerous and criminal! In another article, I came across the case of a man who set out to harm the Queen of England. His AI companion happily encouraged him in his plans, and he was arrested after he broke into the grounds of Windsor Castle armed with a crossbow.

Remember that your relationship is not only with the AI, but also with the company

Another important point to consider is this: Companies profit from you having an AI friend. So, one of the first questions to ask is how safe your private conversations with your AI really are. An even bigger concern is what happens if the company decides to change how the AI functions and responds in the future. A real-world example of this comes from the Replika app.

Replika offers both a free version and a paid version, where users can have romantic or even erotic relationships with the AI. At one point, the creators of Replika decided to remove this romantic feature. As you can imagine, users reacted strongly—many were upset and protested. Eventually, after enough backlash, the feature was restored, and users were happily reunited with their AI ‘romantic partners.’

The point is that companies can make decisions that don’t align with what you want. Even worse, the company could go bankrupt. You could get used to having someone who’s always there to talk to you, appreciating your presence, only to have your AI partner vanish overnight. It’s like being ghosted—but by an AI.

Building Bonds with AI: A New Normal or an Uncharted Territory?

Admittedly, this isn’t one of those articles where I offer a clever conclusion. We’re still at the early stages of this journey, and the more I think about it, the less certain I feel. On one hand, I can see potential benefits—like helping lonely people feel a bit happier and possibly even preventing suicides. But on the other hand, it feels strange to have a serious relationship with something that’s been programmed to simulate one.

That said, I’m fascinated by this development and plan to keep an eye on it. These AIs are learning to communicate with us naturally. Add photorealistic, animated faces and bodies, which can be experienced in virtual reality, and we’re stepping into a whole new world.

I believe we’ll be talking a lot more about AI relationships in the future, and I’d love to hear your thoughts. Do you think people who form AI relationships are outliers, or do you see where they’re coming from? Or maybe you’re as torn on the issue as I am? Let’s talk about it in the comments—and don’t worry, I’ll be replying personally, not an AI.
