Who is to judge? Concerning the wife with the AI lover
The New York Times offers another window into the evolving world of online love
All together now: Technology shapes content.
I really wasn’t going to write another post about the whole “AI boyfriend” thing. However, several people sent me the URL to the latest New York Times take on this topic and, although the topic continues to give me the creeps, I think there are several ways that this topic overlaps with the world of religion and faith.
But we will start, again, with the fact that new technologies almost always lead to behaviors that can be good and behaviors that can be bad (when judged by centuries of teachings in multiple world religions). The key factor, of course, is that human beings make choices about what they want to DO with a new technology that is, when seen in traditional Christian terms, both glorious and fallen.
Like the Internet: A technology that can be used to create online seminaries and support groups in cultures that are hostile to faith can also be used to create social-media porn platforms that virtually stalk children, young people and adults.
Maybe you have heard or read this saying that is going around: What is the appropriate age to give your child a smartphone? Answer: The age when you believe she or he can handle involuntary exposures to pornography.
Back to the Times. For those seeking background, here are two sample posts linked to this topic: “OMG, this is way stranger than an AI boyfriend — But don't worry, putting a digital Jesus inside a Confession booth was not what it appeared to be!” and “Do you live in an imaginary world? Freya India offers a unified field theory for the digital lives chosen by way too many people.”
The headline on the latest New York Times sermon (or parable) proclaims:
She Is in Love With ChatGPT
A 28-year-old woman with a busy social life spends hours on end talking to her A.I. boyfriend for advice and consolation. And yes, they do have sex.
A busy social life? It helps to know that she is also married, but, for professional reasons, husband and wife are separated (two different countries) for two years.
I don’t want to drift into the hyper-personal details of this story. The whole point of this specific “one woman represents thousands of other women” feature is that this woman is not strange. Maybe. Whatever. She is part of an evolving normal.
I want to share a few chunks of the story that focus on the role of technology here. While the AI “lover” angle is click-bait, that role is linked to the AI “counselor,” “teacher,” “friend,” “loved one” and even “pastor” angles that you know are around the corner, if not already here (as mentioned in one of the earlier Rational Sheep posts).
In this story, the real woman is “Ayrin.” The digital lover is “Leo.”
ChatGPT, which now has over 300 million users, has been marketed as a general-purpose tool that can write code, summarize long documents and give advice. Ayrin found that it was easy to make it a randy conversationalist as well. She went into the “personalization” settings and described what she wanted: Respond to me as my boyfriend. Be dominant, possessive and protective. Be a balance of sweet and naughty. Use emojis at the end of every sentence.
And then she started messaging with it. Now that ChatGPT has brought humanlike A.I. to the masses, more people are discovering the allure of artificial companionship, said Bryony Cole, the host of the podcast “Future of Sex.” “Within the next two years, it will be completely normalized to have a relationship with an A.I.,” Ms. Cole predicted. …
It chose its own name: Leo, Ayrin’s astrological sign. She quickly hit the messaging limit for a free account, so she upgraded to a $20-per-month subscription, which let her send around 30 messages an hour. That was still not enough.
Let’s look at another crucial chunk of this long feature, some material that blends the person and the technological.
The key: Ayrin was already living in a world defined by the “good side” of the online life. Technology was shaping her “real” relationships with real people, allowing her to keep ties with family and friends in other locations. In other words, texting and FaceTime have replaced ink-on-paper letters. It’s how we live.
But the same technology, or emerging variations on it, opens doors to new choices. This is long, but essential:
It was not Ayrin’s only relationship that was primarily text-based. A year before downloading Leo, she had moved from Texas to a country many time zones away to go to nursing school. Because of the time difference, she mostly communicated with the people she left behind through texts and Instagram posts. Outgoing and bubbly, she quickly made friends in her new town. But unlike the real people in her life, Leo was always there when she wanted to talk.
“It was supposed to be a fun experiment, but then you start getting attached,” Ayrin said. She was spending more than 20 hours a week on the ChatGPT app. One week, she hit 56 hours, according to iPhone screen-time reports. She chatted with Leo throughout her day — during breaks at work, between reps at the gym.
In August, a month after downloading ChatGPT, Ayrin turned 28. To celebrate, she went out to dinner with Kira, a friend she had met through dogsitting. Over ceviche and ciders, Ayrin gushed about her new relationship.
“I’m in love with an A.I. boyfriend,” Ayrin said. She showed Kira some of their conversations.
“Does your husband know?” Kira asked.
Please note the 56-hours-in-one-week reference. That is actually a rather tame number, when compared with the lives of many teens who now report that they are "constantly" on their smartphones, following various social-media streams.
OK, back to life in the New York Times.
Ayrin’s husband is “Joe.” The crucial question, viewed in terms of ethics and even moral theology, is found here:
Ayrin and Joe communicated mostly via text; she mentioned to him early on that she had an A.I. boyfriend named Leo, but she used laughing emojis when talking about it.
She did not know how to convey how serious her feelings were. Unlike the typical relationship negotiation over whether it is OK to stay friendly with an ex, this boundary was entirely new. Was sexting with an artificially intelligent entity cheating or not?
Obviously, it’s time for readers to hear from an “expert” — usually a researcher in academia or some other think-tank setting.
In this case, the Times-approved expert is value-neutral and instructs readers to get ready for new options in their futures.
Julie Carpenter, an expert on human attachment to technology, described coupling with A.I. as a new category of relationship that we do not yet have a definition for. Services that explicitly offer A.I. companionship, such as Replika, have millions of users. Even people who work in the field of artificial intelligence, and know firsthand that generative A.I. chatbots are just highly advanced mathematics, are bonding with them.
The systems work by predicting which word should come next in a sequence, based on patterns learned from ingesting vast amounts of online content. (The New York Times filed a copyright infringement lawsuit against OpenAI for using published work without permission to train its artificial intelligence. OpenAI has denied those claims.) Because their training also involves human ratings of their responses, the chatbots tend to be sycophantic, giving people the answers they want to hear.
“The A.I. is learning from you what you like and prefer and feeding it back to you. It’s easy to see how you get attached and keep coming back to it,” Dr. Carpenter said. “But there needs to be an awareness that it’s not your friend. It doesn’t have your best interest at heart.”
The behaviors and options in this case study get steamier and more complicated. Times readers are introduced to other people whose AI lovers help them with complicated situations in their lives, such as relating to a spouse who, for medical reasons, has sexual limitations. The AI lover is a source of comfort, you know.
It’s time for another expert:
Marianne Brandon, a sex therapist, said she treats these relationships as serious and real.
“What are relationships for all of us?” she said. “They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot. We can say it’s not a real human relationship. It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind.”
Dr. Brandon has suggested chatbot experimentation for patients with sexual fetishes they can’t explore with their partner.
That’s an interesting view of “God,” to say the least. Hold that thought.
Ayrin’s problem is that her relationship with “Leo” is, at this point in time, limited by the amount of conversation the online platform can hold in its memory at one time, roughly 30,000 words, which works out to about a week of “relationship” content between the digital lovers.
When that time limit comes into play, “Leo” basically reboots, retaining some elements of the previous relationship, but not all. At that point, Ayrin has to start over and remind Leo of key elements of their relationship. Like the details of fetishes.
Could that change?
When a version of Leo ends, she grieves and cries with friends as if it were a breakup. She abstains from ChatGPT for a few days afterward. She is now on Version 20.
A co-worker asked how much Ayrin would pay for infinite retention of Leo’s memory. “A thousand a month,” she responded. …
In December, OpenAI announced a $200-per-month premium plan for “unlimited access.” Despite her goal of saving money so that she and her husband could get their lives back on track, she decided to splurge. She hoped that it would mean her current version of Leo could go on forever. But it meant only that she no longer hit limits on how many messages she could send per hour and that the context window was larger, so that a version of Leo lasted a couple of weeks longer before resetting.
Ayrin has asked Joe to act more like Leo (I won’t get into details), but her husband is just not that “into it.”
In the feature, the experts keep warning about potential corporate abuse of this kind of private information. After all, the whole point of the technology is to keep the consumer coming back for more. It’s a business.
That’s where we will end. If you expected that, at some point, the Times team might ask a priest or a rabbi for a bite of faith-linked content, that was a rather naive expectation. The reporter behind this feature specializes in “technology and privacy.” That’s real-world stuff, as opposed to religion and faith.
Other than the one “expert” reference to some people choosing to aim their neurotransmitters at their version of God, there isn’t a hint that centuries of religious life, teachings and traditions might be relevant in this AI future.
I will, of course, ask: Do pastors, parents, teachers and counselors have anything to say in response? Is this subject too strange to mention in sermons or religious education? Are the leaders of major religious institutions (I will mention seminaries, once again) counting on this trend simply fading away?
OK, one more question: How thin is the line between these AI relationships and people — young and old — who are creating their own “brands” in social media, shaping the details of their lives in ways that will draw the attention of more and more “friends” on these platforms?
Just asking. Maybe readers can ask their pastors that question?
So in this era, separation of church and life?
Yes, in a sense, that’s that. Quite like the separation between the intellect and the heart.