WC.com

Thursday, March 27, 2025

Ya Jonesing Man?

It has been a long time since college, but back in that era, we used the word "Jonesing" a great deal. The dictionary defines Jonesing:
"If a person has an insatiable craving for someone or something, they are said to be Jonesing for it."
In our hipster slang of the day, this was used often. It falls short of "addiction," which the dictionary says is:
"a compulsive, chronic, physiological or psychological need for a habit-forming substance, behavior, or activity having harmful physical, psychological, or social effects and typically causing well-defined symptoms."
Addiction can be positive (e.g., "addicted to life man!" or "addicted to volunteering") or negative (e.g., addicted to crack, coke, or methamphetamine). Some of that is likely perspective-driven. I personally am addicted to workers' compensation, which Chicago noted in 1984 (Warner Bros.) is "a hard habit to break." Actually, they were singing about some love interest, not workers' compensation. The spirit is there though.

Apropos of our current human condition, scientists are now telling us that Artificial Intelligence (AI) "power users" are "becoming dependent upon — or even addicted to — the chatbot." No, the chatbot is not a person, and no, I doubt that Chicago would sing that "hard habit" to some machine any more than to workers' compensation. Chicago was focused on some person.

There are those who believe that our brains react to Large Language Models (AI) in much the same way we do to social media, but with a serotonin response rather than dopamine. It is, nonetheless, a chemical reward. They believe this is because LLM experiences are more interactive, more sustained, and require mental processing in a back-and-forth (i.e., a conversation). Use of the LLM "can foster a sense of accomplishment and intellectual satisfaction."

Unlike social media, which links you to "friends," the LLM essentially becomes your friend, confidant, and even counselor.

Chicago explained the Jonesing in 1984:
Now being without you, Takes a lot of getting used to
Should learn to live with it, But I don't want to
Being without you, It's all a big mistake
Instead of getting easier, It's the hardest thing to take
So, "hard habit to break" borders on addiction, or compulsion. 

Well, like James Taylor (Your Smiling Face, 1977, Columbia), "I thought I was in love a couple of times before," but that was never with some AI, some chatbot, some computer program. Nonetheless, there are reports of some going beyond addiction and falling for an AI in a romantic way. Seems like falling in love with a toaster to me, but I am no scientist.

The scientists have studied the "small subset of ChatGPT users engaged in" significant use. For the most part, the studied users are using AI LLMs like we all use an ink pen, stapler, or paper clip: as a tool. But others are drawn to it, ensconced in it, and Jonesing for it. They may not be at addiction yet, but the researchers see that as a real potential. These users are beginning to consider the chatbot a "friend" or companion.

The conclusion seems to be that those who are susceptible to this substitution and transference are those who have social "lives (that) are lacking." They are "the neediest people," and may form "sad, scary, or somewhere entirely unpredictable" relationships with these tools (I know, we all know someone who fell in love with a tool, but this example is literal).

This emotional attachment also feeds into the drive to use AI for mental health therapy. There are seemingly credible care providers who are promoting and providing this AI counseling. Because of convenience, confidentiality (real or imagined), and likely cost, more people are turning to chatbots for emotional support. An article this month noted that "Chatbots are replacing therapists faster than anyone expected." There is some great attraction to this paradigm.

This may be of concern in terms of addiction; see above. Or perhaps it is a codependency in this context. While there may be some economic constraints on how often one could see a human therapist, LLMs are cheap or free. I have never heard of any LLM responding to a prompt with "Perhaps you have spent enough time with me today? You should go outside."

Even if money were no object, the human might nonetheless start to limit you or caution you about your counseling consumption. The chatbot never will. And if it does, you could just log into a different "friend" and start over (as an aside, this reboot likely works with the dating-an-AI issue also). Yes, people have dated AI LLMs; back to Chicago, James Taylor, and all that entails. Some are a bit hurt when people in the real world (ITRW) question their compu-amorous engagements.

Beyond the sheer volume of potential "counseling," and the impersonality of it, LLM emotional counseling is also concerning from the perspective of harm. Some are beginning to caution that Chatbots may not always be your friend. The "nation’s largest association of psychologists" recently expressed such concerns. 

Cynics may say they are protecting their economic turf. Nonetheless, they "warned federal regulators that A.I. chatbots 'masquerading' as therapists ... could drive vulnerable people to harm themselves or others." Certainly, a professional counselor may not be the best life-influencer, but would still be unlikely to encourage harm. 

Some say that Instagram has. There have been similar accusations against LLMs reported by CNN, Futurism, and MIT. In fairness, your corporeal friends could do that too. See Is it Manslaughter, Does it Matter if it's not? (April 2015). There is no guarantee that malice or indifference will only come from the technology.

Combine these stories and a theme evolves. People are increasingly dependent on AI, some are clearly Jonesing, some are frankly exhibiting addiction signs, some are demonstrating anthropomorphism or even objectophilia, and some are even trusting it to provide emotional counseling. To make it worse, this trust and LLM interaction appears most threatening to the humans who are isolated, lonely, and even in need of support, services, and human interaction.

There has recently been mounting discussion about the impacts of social media. Some say it destroyed a generation, and others disagree. This may be much like masking during the Great Panic, and no clear answer will ever come. But the parallel question for AI is worthy of consideration now rather than later. Is AI addictive? Is that harmful? Should there be guardrails? Is there a solution?

Maybe I will ask my friend Claude after my date with Anthropic and my counseling session with GPT. Or, maybe I will just go outside and throw a stick with the dog? It has been a long winter and I am really just Jonesing for sunshine, green grass, and being laid back. Now that is a "hard habit to break." But perhaps that is just me. You go talk to a computer instead if you wish. 



Editor’s Note: Help is available if you or someone you know is struggling with suicidal thoughts or mental health matters. In the US: Call or text 988, the Suicide & Crisis Lifeline.

Globally: The International Association for Suicide Prevention and Befrienders Worldwide have contact information for crisis centers around the world.