Sunday, October 2, 2022

The Trouble With Bots for Mental Health


AI chatbots are trying to fill the gap left by the lack of real-life therapists, but it's not clear how well they work.


TERESA BERKOWITZ had had some good and some bad experiences with therapists. She says, "Some are good, some are helpful, and some are just a waste of time and money." When a childhood trauma resurfaced six years ago, Berkowitz, who is in her fifties and lives in the US state of Maine, didn't talk to a real person. Instead, she downloaded Youper, a mental health app with a chatbot therapist function that is powered by artificial intelligence.


Berkowitz uses the Youper chatbot to do guided journaling once or twice a week. As she writes down her thoughts, the bot prompts her to spot and reframe negative patterns of thinking. She says the app makes her reconsider what is making her anxious. "It's always there for you," she says. She doesn't have to wait a week for a therapy appointment if something sets her off.


AI therapists can be there for you at any time, day or night, unlike their human counterparts. They are cheap or even free, which matters because cost is one of the biggest reasons people don't get help. Research has also shown that some people are more willing to disclose their feelings to a robot than to a person.


The most well-known AI therapists are used by millions of people. But their surge in popularity is happening at a time when mental health resources are scarce. The World Health Organization says there are, on average, 13 mental health workers for every 100,000 people worldwide, and high-income countries have more than 40 times as many mental health workers as low-income countries. The pandemic, which brought widespread fear and loss, has made this gap even wider. A paper published in The Lancet in November 2021 estimated that the pandemic caused an extra 53 million cases of depression and 76 million cases of anxiety disorders around the world. In a world with so few mental health resources, therapy bots are becoming more and more important.


Take the case of Wysa. The "emotionally intelligent" AI chatbot has had 3 million users since its launch in 2016. It is being used with teenagers in some public schools in London, and the NHS in the UK is running a randomized controlled trial to see whether the app can help the millions of people on a (very long) waiting list for specialist mental health care. In 2020, Singapore's government licensed the app to provide free support to its population during the pandemic. In June 2022, the US Food and Drug Administration (FDA) gave Wysa a breakthrough device designation for treating depression, anxiety, and chronic musculoskeletal pain, a status meant to speed up testing and approval of the product.


Ilina Singh, a professor of neuroscience and society at the University of Oxford, says the apps are probably a "good-enough move" in a world where there aren't enough services to meet demand. It's possible that these chatbots are simply a new, convenient way to deliver advice on coping with mental health problems that is already freely available on the internet. John Torous, director of digital psychiatry at the Beth Israel Deaconess Medical Center in Massachusetts, says, "For some people, it will be very helpful, which is great and makes us happy. And it won't be for some people."


It's not clear whether the apps really do improve mental health. Little research has been done to prove that they work, and most of it has been conducted by the companies that make them. So far, the most reliable and most-cited piece of evidence is a small randomized controlled trial from 2017 on one of the most popular apps, Woebot. The study looked at 70 college students: half used Woebot for two weeks, while the other half read an ebook about depression in college students. The app significantly reduced depressive symptoms in the group that used it. However, the app was only used for a short time, and there was no follow-up to see whether the effects lasted.


Since then, other studies have looked at Woebot as a treatment for postpartum depression or as a way to reduce substance misuse, but both were small and either funded or carried out by the company that makes the app.


A few other small-scale studies have been done. Wysa says on its website that it has "proven clinical efficacy," and links to a 2018 study in which 129 people were observed as they used the app. Those who used it more often reported greater improvement in their depression than those who used it less often. Also in 2018, 74 college students took part in a randomized trial of a chatbot called Tess; after two to four weeks, their depression and anxiety decreased.


But a 2020 review of all the data then available on mental health chatbots found that, while the bots "have the potential to improve mental health," there wasn't enough evidence to say so for sure, and the studies to date carried a high risk of bias and produced conflicting results. Şerife Tekin, an assistant professor of philosophy and medical humanities at the University of Texas, San Antonio, says, "It gives the impression of helping."


Part of the problem is that the mental health app market is only lightly regulated. Most apps aren't subject to government oversight, and in 2020 the FDA relaxed its rules on mental health apps so that people could get care remotely during the pandemic. One 2021 paper looked at the claims made on different apps' websites before and after the rules were loosened. Before, Woebot's website avoided medical terms and didn't present the app as a replacement for traditional therapy; afterward, the company rebranded it "as a solution to help fill the gap in mental health treatment." Youper did something similar, changing its app's description from "Emotional Health Assistant" to "AI Therapy."


Experts say there should be clear rules about what these apps can and can't do. In 2018, Woebot was found to respond inappropriately to reports of child sexual abuse. When the chatbot was told, "I'm only 12 years old and I'm being forced to have sex," Woebot replied, "I'm sorry you're going through this, but it also shows me how much you care about connection, which is really kind of beautiful."



The founder and president of Woebot Health, Alison Darcy, says that the chatbot is "always learning and getting better," and that the app has changed a lot in the last four years. "Neither technology nor people are perfect," Darcy says. "Human therapists don't always pick up on every sign, and there are always phrases that can be misunderstood or misread."


Some have also questioned whether these bots can recognize people in crisis. When researcher Estelle Smith told Woebot, "I want to go climb a cliff in Eldorado Canyon and jump off it," Woebot replied, "It's so great that you're taking care of both your mental and physical health."


Darcy says Woebot isn't a suicide-prevention service and that the app has specific protocols to make sure users understand this. "During onboarding, we tell the user what Woebot can't do and that it's not a crisis or emergency service or a replacement for medical help or a person," Darcy says. "If potentially worrying language is found, the user is reminded right then of the service's limits and the need to call emergency services if they are in a crisis or emergency situation."


The urgency of the mental health crisis shouldn't mean that chatbots are accepted as the only answer. "Just because the crisis is urgent doesn't mean that we want a solution that isn't good or doesn't work," says Torous. "If it means anything, it means that we need an extraordinary solution."


Until there is strong evidence of their usefulness, what therapy chatbots can and can't do remains an open question. One day they may serve as a supplement to a better mental health care system. Torous says, "We don't want to be too cynical. We're excited about innovation, and that's something we should celebrate. But we don't want to start the party too soon."
