What steps would you take to access therapy? Admitting that there’s a need for it, seeking out a therapist, and scheduling an appointment can all seem overwhelming to someone struggling with a mental disorder. Now imagine that a new technology allows people to access therapy conveniently anytime and anywhere they have Internet access. With artificially intelligent therapists, this new access to mental healthcare could become a reality. Born of sites like WebMD, therapy forums, and the self-help content already available online, the AI therapist is designed to make mental healthcare more widely accessible.
The AI therapist is accessed over the Internet from any phone, tablet, computer, or other web-enabled device. It provides talk therapy by listening to the user’s voice and responding intelligently in a human-sounding voice, drawing on pre-programmed psychological knowledge. The software that determines the AI therapist’s responses encodes treatments for different disorders that were agreed upon by a conference of psychologists.
The AI therapist was designed by researchers and developers at the University of Maryland, who intended to make it available for free on the Internet in order to make mental healthcare more accessible. However, similar privatized AI therapy services are being introduced that may threaten the easy, no-cost availability of web therapy.
The development of the AI therapist presents several problems. One is an inherent design flaw: although the therapist’s voice may sound convincingly human, anyone interacting with it will be aware that they are not actually speaking with another person. As we discussed in class, patients might feel uncomfortable talking with the AI if they feel that it cannot truly empathize with them because it lacks human experiences. Other problems have to do with labor. For one, the AI could replace many therapists and force them out of their jobs. In-person sessions with human therapists might then become a high-priced service reserved for the wealthiest patients, leaving AI therapy to the majority. A final issue relates to ethics and AI: do we consider the AI therapist a person? Is it self-aware, and should it be compensated for its labor? For all the good it could do, the AI therapist raises questions of both effectiveness and ethics.
The AI therapist both challenges and reinforces social divisions. On one hand, it challenges the unequal access to therapy and mental healthcare between social classes. AI therapy could make mental healthcare accessible to a greater number of people, expanding its reach beyond the current privileged few. On the other hand, we cannot assume that making the AI therapist available free of charge over the Internet will make it available to everyone. Patients need some wealth in order to access AI therapy: at minimum, a web-enabled device and Internet access in their homes. The common practice of going to a library, coffee shop, or restaurant with Wi-Fi would most likely not work for AI therapy, because people would probably not be willing to share their thoughts and feelings openly with their therapist in a public space. For this reason, AI therapy will probably be limited to people with Internet access at home. And even Internet access at home does not guarantee that users will feel free to share their thoughts and feelings with the AI therapist; people will probably also require privacy within their homes to make use of the technology. For these reasons, AI therapy could reinforce the existing wealth- and class-based inequalities in access to therapy.
Another way that AI therapy might, unfortunately, reinforce social divisions is by fitting into the narrative that mental healthcare is something to be ashamed of and kept private. While it is a good thing if the AI therapist brings therapy to more people because they feel more comfortable sharing with a computer, the nature of the technology also allows people to get mental healthcare more covertly. In so doing, it circumvents the need to fight against the systems of power that tell us to hide our use of mental healthcare. Still, even if the AI therapist tends to reinforce the stigma surrounding therapy, the spread of therapy to a wider public that could result from its introduction might also help fight that stigma.
The AI therapist has the potential to change how society views mental healthcare. Instead of an exclusive service reserved for a small group of people, therapy could open up to a much greater number of people. However, as with any technology that relies on the Internet, it’s important to remember that although the service doesn’t carry a price tag, it’s not necessarily free. AI therapy could reinforce the limits that wealth and class place on access to therapy, even as it tries to expand the reach of mental healthcare.