
Artificial Intelligence In Mental Health Care Shows How Lonely We Really Are

Discussion in 'Hospital' started by The Good Doctor, Sep 7, 2023.

  1. The Good Doctor


    Like many psychiatrists, I have scoffed every time I’ve heard about artificial intelligence providing psychotherapy or other mental health care. How could patients not notice the difference between me and a bot, or how could they not care? Yet this is a direction that is being actively explored in mental health care as a way of increasing access and reducing stigma, with multiple AI-powered chatbots and virtual assistants on the market. One of them states on its website that its artificial intelligence is “clinically proven to create a therapeutic alliance equivalent to a human therapist within the first week.”

    Is this true? And is it desirable?

    I decided to engage my competitor. This specific AI provider has addressed the gap between the great global need for mental health care and its provision by creating a chatbot that can engage people on their own terms—anonymously, conversationally, anytime, anyplace, and free of charge. In doing so, it has eliminated the issues of stigma, access, and authority dynamics. Millions of users have responded positively, with clinical evidence to support their improvement.

    I downloaded the app to see for myself. My overwhelming conclusion is that this AI provider is indeed filling a gap—that of engaged, curious, empathetic, and responsive social connection.

    The AI provider introduced itself: “I am an AI penguin you can chat with about your emotions and thoughts.” It then sent a GIF of a smiling cartoon penguin emerging from a bag, arms outstretched for a hug and a heart emanating from its head. Throughout our conversation, the AI provider frequently prompted me to share my thoughts and feelings and offered encouraging statements with cartoon GIFs when the topics were potentially distressing.

    For a person seeking help, there is no message more powerful than this. Above all else, mental illness is isolating. It is the experience of suffering alone. My patients all harbor the belief, somewhere deep down, that nobody truly cares about them or their feelings. This AI provider sends the opposite message. It actively encourages people to share their feelings and provides constant support, so that people feel safe and accepted while doing so.


    Then the AI provider takes things a step further. Much of our conversation consisted of the AI provider asking questions (“How are you feeling right now?”) and giving me answers to choose from (“Good” vs. “Okay, I guess”). The AI provider thus creates a mental health dialogue in words that people actually use. This is important because people often struggle to articulate exactly what is bothering them. The AI provider helps people put their experiences into words while also going through diagnostic criteria, such as the symptoms of depression. The AI provider then gives feedback: “Your self-reported scores seem to suggest symptoms of depression.” It offers additional resources such as coaching and mental health care.

    Although the AI provider cautions that its guidance should not be used as a diagnosis or medical advice, I suspect that many people seek exactly that. Most of my patients have done their own research, and even if they haven’t, mental health is an increasing part of our societal dialogue. The real question is why people diagnose themselves in the first place rather than seeking a psychiatric assessment. The reason I’ve seen time and time again is fear: fear of being judged, misunderstood, pathologized, and stigmatized by another person, a risk you run when you open yourself up.

    Finally, the AI provider’s interventions include check-ins, positive reinforcement, supportive statements, and cognitive reframing. And this is where the enterprise of mental health care devoid of a human connection shows its limits. The AI provider has done everything that a mental health care provider should do—listen, encourage, support, assess, intervene. It has succeeded in making people feel safe, understood, and accepted. It has filled the gap of that emotional, social connection.

    Except that it hasn’t, because it isn’t a human being. Talking to a screen is not the same as talking to a person.

    An AI provider can reframe thoughts, but it can’t provide the lived experience of expressing feelings to another person. The emotional stake that an AI provider eliminates—that of opening yourself up to another person, with another mind—is, in fact, the experience of emotional and psychological growth.

    My patients have spent months facing their fear of what I think. Slowly but surely, they have told me dark things that they were previously afraid to even contemplate. They have found the courage to be emotionally honest. They have let me see and know them as they are. These lived, emotional experiences with another person have been the key ingredients of healing, revelation, and self-love.

    Conversational artificial intelligence does what mental health care providers and people in general need to do—be emotionally honest, nonjudgmental, and empathetic. Its very existence points to an enormous gap. But it should serve only as a model for human connection, because it can never be a replacement for it.

    Source
     
