
Would you like to open up to an AI therapist?

王林 (forwarded) | 2023-05-02 21:28:05

We are increasingly turning to smart voice assistants or chatbots on websites and apps to answer questions.


As these systems powered by artificial intelligence (AI) software become more sophisticated, they are starting to provide pretty good, detailed answers.

But will such a chatbot be as effective a therapist as a human?

Computer programmer Eugenia Kuyda is the founder of US-based chatbot app Replika, which says it provides users with "a caring AI companion, always here to listen and talk, always on your side".

It was launched in 2017 and currently has over 2 million active users. Each person has a chatbot, or "replica", that is unique to them, as the AI learns from their conversations. Users can also design their own cartoon avatar for their chatbot.

People who use the app range from autistic children, who use it as a way to "warm up before interacting with people", to adults who are simply lonely and in need of a friend, Ms. Kuyda said.

Others are said to have used Replika to practice for job interviews, to talk politics, and even as a marriage counselor.

While the app is primarily designed to be a friend or companion, it also claims it can help improve your mental health, for example by helping users to "form better habits and reduce anxiety".

According to the World Health Organization (WHO), nearly one billion people worldwide live with a mental disorder, more than one in 10 of the global population.

The World Health Organization added that “only a small proportion of people in need have access to effective, affordable and quality mental health services”.

While anyone concerned about themselves or a relative should see a doctor first, the growth of chatbot mental health therapists could provide some welcome support to many people.

Dr Paul Marsden, a member of the British Psychological Society, said apps aimed at improving mental health can help, but only if you find the right one, and only in limited ways.

"When I looked, there were 300 apps just for anxiety...so how do you know which one to use?

"They should only be considered as a supplement to in-person therapy . The consensus is that apps will not replace human treatments.

Yet at the same time, Dr. Marsden said he was excited about the power of artificial intelligence to make therapeutic chatbots more effective. "Mental health support is based on talk therapy, and what chatbots do is talk," he said.

Dr. Marsden highlighted the fact that leading AI chatbot companies, such as OpenAI, the company behind the recent high-profile ChatGPT, are opening up their technology to other companies.

This enables mental health apps to use the best artificial intelligence to power their chatbots "with its rich knowledge, ever-increasing reasoning capabilities and adept communication skills," he said. Replika is one such provider, and it already uses OpenAI's technology.
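In practice, an app built this way relays each user message, along with the conversation so far, to the AI provider's API and shows the model's reply. Below is a minimal sketch of how that could look with OpenAI's Python client; the system prompt, model name, and helper function are illustrative assumptions, not Replika's actual implementation.

    import os
    from openai import OpenAI  # official OpenAI Python client

    # Hypothetical companion prompt; Replika's real setup is not public.
    SYSTEM_PROMPT = (
        "You are a caring companion. Listen, respond with empathy, and never "
        "present yourself as a substitute for professional mental health care."
    )

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

    def companion_reply(history, user_message):
        """Send the conversation so far plus the new message; return the reply."""
        messages = [{"role": "system", "content": SYSTEM_PROMPT}]
        messages += history  # earlier {"role": ..., "content": ...} turns
        messages.append({"role": "user", "content": user_message})
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # assumed model, for illustration only
            messages=messages,
        )
        return response.choices[0].message.content

Because each turn is appended to the history, the chatbot appears to learn the user within a session; remembering someone across sessions, as Replika does, would require storing conversations beyond a single exchange.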

But what if a person's relationship with their chatbot therapist becomes unhealthy? Replika made headlines in February when it was revealed that some users had been having explicit conversations with their chatbots.

The news reports came after Luka, the company behind Replika, updated its artificial intelligence system to prevent such sexual exchanges.

Not all users were happy about the change. One wrote on Reddit: "People who found refuge from loneliness, healing through intimacy, suddenly found that it was artificial, not because it was an AI, but because it was controlled by people."

Luka's move may be related to the fact that, also in February, Italy's data protection agency banned it from using the personal data of Italians.

The Italian regulator claimed that the app was being used by people under the age of 18, who were getting "responses that were absolutely inappropriate for their age". It added that the app could also "increase the risks for individuals still in a developmental stage or in a state of emotional fragility".

The move means Replika's use in Italy could be restricted, and Luka could be fined. It said it was "working closely with Italian regulators and conversations are progressing actively".

British online privacy campaigner Jen Persson said there needed to be more global regulation of chatbot therapists.

She said: “If an AI company’s product claims to identify or support mental health, or is intended to affect your emotional state or mental health, it should be classified as a health product and adhere to quality and safety standards accordingly.”

Ms Kuyda considers Replika a companion, like having a pet, rather than a mental health tool. It shouldn't be viewed as a substitute for help from a human therapist, she added.

"Real-life therapy provides incredible insight into the human psyche, not just through words, but by seeing you firsthand, seeing your body language, your emotional responses, and having an incredible knowledge of your history," she said.

Headspace CEO Russell Glass says the focus of its app will remain on person-to-person communication

Other apps in the mental health space are much more wary of using artificial intelligence. One of these is meditation app Headspace, which has more than 30 million users and is NHS-approved in the UK.

"Our core beliefs and entire business model at Headspace Health are rooted in person-centered care. The connections our members make through live conversations with coaches and therapists, whether via chat, video, or in person, are irreplaceable," said Headspace executive Russell Glass.

He added that while Headspace does use some artificial intelligence, it does so "highly selectively" while maintaining a "depth of human engagement." The company doesn't use AI to chat with users, and Mr. Glass said it only uses it for things like giving users personalized content recommendations or assisting human care providers in writing notes.

However, Dr. Marsden said AI-powered therapy chatbots will only continue to get better. "New AI chatbot technology appears to be developing skills for effective mental health support, including empathy and understanding of how the human mind works," he said.

His comments follow a recent study from Cornell University in New York state that put ChatGPT through a series of tests measuring how well it understood that other people might hold different beliefs. The AI's scores were equivalent to those of a nine-year-old child.
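To give a sense of what such tests involve, below is a sketch that poses a classic "unexpected transfer" false-belief task, the standard format in theory-of-mind research, to a chat model. The task wording and model name are illustrative assumptions, not the study's actual test items.

    import os
    from openai import OpenAI

    # Classic "unexpected transfer" false-belief task; illustrative only.
    TASK = (
        "Sally puts her ball in the basket and leaves the room. "
        "While she is away, Anne moves the ball from the basket into the box. "
        "When Sally comes back, where will she look for her ball first?"
    )

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    answer = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model, for illustration only
        messages=[{"role": "user", "content": TASK}],
    ).choices[0].message.content

    # A respondent with cognitive empathy answers "the basket": Sally cannot
    # know the ball was moved while she was out of the room.
    print(answer)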

Previously, this type of cognitive empathy was thought to be unique to humans.
