Science Fiction, Dystopia, and Psychotherapy in New York: Interview with Paste Magazine
- Lexington Park Psychotherapy
- Jul 17
- 3 min read
Updated: Aug 8

At this point, it may be just as easy to overstate the problems of artificial intelligence as it is to understate them. Most people have a vague sense of optimism, likely (if paradoxically) informed by the dystopian movies in which AI features. The world in the movie Her, for instance, is beautifully designed and contains technology that is seamlessly integrated into people's lives. The fact that the film's protagonist, Theodore, doesn't have to charge the earpiece he uses to speak to his AI girlfriend, that it seems always connected to his phone or computer when he wants it to be, and that it reliably picks up his voice amid the ambient noise around him, is more futuristic than his falling in love with an AI chatbot. Or consider Blade Runner: as bleak as it is, it has flying cars, interactive holograms, and other futuristic conveniences we associate with a technological future.
And technology has produced many great advances. Modern technology has given us the smartphones and social media that have contributed so viciously to the anxiety and depression of so many people, as well as life-saving organ-transplant technologies. So there is good reason to be cautiously optimistic about the technology du jour: artificial intelligence. In an interview with Paste Magazine, Jordan Conrad, the founder and clinical director of Lexington Park Psychotherapy, discusses the upsides as well as the downsides of AI psychotherapy. In "Rise of the Machines: Can Artificial Intelligence Replace Your Therapist?", Jordan explains that “AI has a lot of promise: to reach people in geographically isolated places or those where there just aren’t many therapists.”
That is unequivocally good. As he writes in an article for the Journal of Contemporary Psychotherapy, "roughly 163 million Americans—nearly half—live in federally designated mental health professional shortage areas". Artificial intelligence thus provides a very real and very powerful benefit to millions of people who would get treatment for postpartum depression, couples therapy, trauma therapy, or whatever else, but can't simply because nobody is nearby.
As with so many emerging technologies, however, AI psychotherapists appear to be tested first on the most vulnerable populations. When asked about the safety of AI therapy, Jordan explained that “most mental health apps have shoddy safety information or, in some cases, none at all.”
Paste takes a pretty optimistic view of this. “We’re talking about billions of dollars and some very smart people—won’t they figure out the regulatory oversight stuff at some point?” Paste asked Jordan. “Of course,” he responds. “The practical issues will get ironed out. The problems with using AI for psychotherapy are much deeper.” Jordan wants to know how viable AI therapists really are for people who know with certainty that their therapist does not actually think or feel anything at all.
But there is another problem. Right now most psychotherapy apps use some form of cognitive behavioral therapy (CBT). Jordan says that, while there is no “inherent problem with CBT [...] all outcome studies show some sizable [sic] proportion of people who aren’t helped by the treatment. This is true for CBT just the same as it is for any medication.” Jordan, who is a certified cognitive behavioral therapist in NYC as well as a psychodynamic psychotherapist, finds the idea that the majority of apps will offer only a small number of psychotherapeutic modalities worrisome:
If the advantage of AI psychotherapists is that they can reach treatment deserts, then the fact that they predominantly use CBT means that those conditions CBT cannot treat will disproportionately occur in those areas. That means that wealthy people with access to other treatments won’t have certain conditions that reliably show up in economically disadvantaged places—if anything could widen the already too-large wealth inequality gap, this could.