
Girlfriend Uses ChatGPT For Couple Fights – Who’s TA?


“Chat GPT says you don’t have the emotional bandwidth to understand what I’m saying”

Photo by Anna Good

Anna Good

Posted on October 21, 2024 at 11:30 AM CDT

Navigating disagreements in a relationship can be fraught with challenges, from misunderstandings to an unwillingness to see the other person’s point of view. That’s hard enough, but Redditor u/drawss4scoress’s girlfriend has decided to add ChatGPT, a large language model (LLM), to the mix as a mediator, rather than asking a friend or therapist for advice.


“My girlfriend uses Chat GPT every time we have a disagreement. AITAH for saying it has to stop?” the OP wrote in the title of their post on r/AITAH.

“Me (25) and my girlfriend (28) have been dating for the past 8 months. We’ve had a couple of big arguments and some small disagreements lately. Every time we argue, my girlfriend will walk away and discuss the argument with chat gpt, even sometimes doing so in the same room,” they explained.

“Whenever she does this, she comes back with a well-constructed argument that breaks down everything I said or did during our discussion. I’ve explained to her that I don’t like her doing that, as it can feel like I’m being ambushed with the thoughts and opinions of a robot. It’s almost impossible for a human to remember every little detail and break it down bit by bit, but AI has no problem doing it.”

“Whenever I’ve expressed my discomfort, I’ve been told ‘chat gpt says you’re insecure’ or ‘chat gpt says you don’t have the emotional bandwidth to understand what I’m saying,'” added u/drawss4scoress, sharing how uncomfortable it makes them to have a robot regurgitate their girlfriend’s point of view. “My big problem is that she’s the one writing the prompts, so if she explains that I’m in the wrong, it will agree with her without me having a chance to explain things.”

“Am I the *sshole for asking her to stop using Chat GPT in this context?” they asked.

People on Reddit had mixed feelings about using an AI like ChatGPT to bolster one side of a couple’s argument.

“Reply with ChatGPT until you get the point across,” u/Tangencial-Thoughts recommended.

u/annebonnell and u/SnooMacarons4844 disagreed and thought the OP should just break up with his girlfriend, sharing the sentiment of “Who wants a robot for a girlfriend?”

[Screenshot of a Reddit comment by u/Anangrywookiee]

Other Redditors pointed out that the girlfriend had fallen into a common trap with LLMs, which are inherently biased toward the end user’s input.

[Screenshot of a Reddit comment by u/Almighty]

“Show her how biased it is toward user input; it’s literally programmed to tell you exactly what you want to hear. Talk about her actions with ChatGPT from your perspective and it will do exactly the same. Show her how it is biased and only serves as an artificial form of self-validation,” said u/Professional-Ear5923.

[Screenshot of a Reddit comment by u/Mister2112]

u/Kopitar4president shared their own experience testing ChatGPT, saying: “I realized very quickly that it was programmed to reinforce your position. It’s machine learning to a degree, but it asks people to rate the answers. She thinks it’s a robot, but it’s a robot programmed to tell people what they want to hear.”

[Screenshot of a Reddit comment by u/celticmusebooks]

[Screenshot of a Reddit comment by u/kazwebno]

[Screenshot of a Reddit comment by u/migrainosaurus]