
How Chatbots Are Being Used to Train Crisis Counselors


“I think about suicide constantly these days,” Drew said.

The counselor reassured Drew – thanking him for reaching out to talk, telling him he was not alone – and asked for details about how Drew planned to kill himself.

“Are you going to commit suicide today?” asked the counselor.

It’s a difficult conversation to read, even knowing that Drew is not a real person but an artificially intelligent chatbot created by The Trevor Project, a crisis intervention and suicide prevention organization for LGBTQ youth.

While chatbots are often seen as a necessary (and sometimes obnoxious) feature of online customer service, Drew’s purpose is far removed from helping customers do things like return a pair of pants or get an insurance quote. Drew simulates conversations for volunteer crisis counselors in training, who will go on to staff The Trevor Project’s always-available text- and chat-based helplines (the group also has a 24/7 phone line staffed by counselors). LGBTQ youth have a higher risk of depression and suicide than other young people, and research suggests this may have worsened during the pandemic due to factors such as isolation caused by school closures.

The overall training for new counselors who will respond to texts and chats takes months, and role-play is a big part of it. The hope is that, with the help of capable chatbots like Drew, the nonprofit can train more counselors more quickly than it could if every role-play session had to be led by a person.

“You can watch a lot of training videos and you can read all the handbooks, and you have a sense of how this is going to go. But actually doing it, and feeling what it feels like to be in one of these conversations, even a simulated one, is different,” said Dan Fichter, head of AI and engineering for The Trevor Project.

A chatbot named Drew is helping The Trevor Project train the volunteer crisis counselors who staff its text and chat helplines.

Drew and Riley

Drew is the second chatbot the team has launched this year – part of what The Trevor Project calls its “Crisis Contact Simulator” – and it tackles a more complex situation than its predecessor. The first chatbot, named Riley, represented a depressed North Carolina teenager dealing with issues related to coming out as genderqueer; Riley was created with help, and $2.7 million in funding, from Google’s charitable arm, Google.org. (Drew was developed internally by The Trevor Project.)


The Trevor Project says it began using Drew alongside Riley over the past few months and has trained more than 1,000 digital volunteer counselors with the chatbots to date. It has 1,205 digital counselors in total.

In November, The Trevor Project gave CNN Business a look at how the training works via a demo video of a conversation between a trained counselor and the Drew chatbot. The conversation unfolds slowly, with the counselor gradually asking more personal questions about Drew’s age, location, and so on, in hopes of building trust with Drew, assessing his risk of suicide over time, and finding ways to help him. At one point, the counselor empathized with how difficult it must be to be harassed at work, and asked Drew what his relationship with his boss was like.

“She told me to just ignore it and be an adult but she didn’t understand how scary that was to me,” Drew replied.

Drew’s frequent pauses before responding, which seem to vary in length, add to the tension of the conversation. Kendra Gaunt, The Trevor Project’s data and AI product manager and the trained counselor who recorded the demo, says that after Riley launched, varied pauses between responses were added to better simulate how a person contacting The Trevor Project might be switching between devices or tasks.
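The Trevor Project hasn’t described how these delays are implemented, but the basic idea can be sketched in a few lines of Python; the helper name and the delay range below are illustrative assumptions, not the group’s actual values:

```python
import random
import time

def send_simulated_reply(reply: str) -> None:
    # Wait a random interval before replying, mimicking a texter who may
    # be switching between devices or tasks mid-conversation.
    # The 2-25 second range is an assumption for illustration.
    time.sleep(random.uniform(2.0, 25.0))
    print(reply)

send_simulated_reply("sorry idk :/")
```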

At the end of the conversation, an instructor at The Trevor Project reviews the transcript and gives the trainee feedback. Trainees also take part in a number of role-play sessions led by The Trevor Project’s instructors.

“While this may not be a real conversation with a living young person, these do reflect why people seek Trevor’s support in the first place,” Gaunt said.

“sorry idk :/”

While AI chatbots have advanced significantly in recent years, they still have many limitations. Chatbots like Drew and Riley are built using large language models – AI systems that can generate text that is often indistinguishable from what a human would write. That means that while they can respond realistically to human queries, they can also reflect the biases of the internet, since that’s what those models are trained on. And they can’t always answer a question, or answer it well. For example, at one point in the conversation, the counselor asked Drew how it felt to talk to his boss about problems he was having with a coworker.

“sorry idk :/” Drew typed in reply.

The Trevor Project is trying to use this weakness to its advantage: a response like that is, in a sense, a good thing for a counselor to train against, because it forces them to think of another way to phrase their question and draw out a better answer.
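The Trevor Project hasn’t published its models or prompts, but the mechanism behind replies like this can be illustrated with a rough Python sketch using the open-source Hugging Face transformers library; the GPT-2 model and the persona prompt here are stand-in assumptions, not the nonprofit’s actual setup:

```python
# Illustrative sketch only: model choice and prompt format are assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "A chat between Drew, a stressed young worker, and a crisis counselor.\n"
    "Counselor: How did it feel to talk to your boss about it?\n"
    "Drew:"
)

out = generator(prompt, max_new_tokens=30, do_sample=True, temperature=0.9)
reply = out[0]["generated_text"][len(prompt):].strip()
print(reply)  # may be fluent, evasive, or a non-answer like "idk :/"
```

Because the model samples its output, the same question can draw a thoughtful answer one run and a shrug the next – the same variability a trainee has to learn to work around.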


Additionally, Fichter says, “Part of the experience of helping Drew involves a new counselor learning to sit with the discomfort of not being able to work through all of a person’s problems in one conversation.”

Fichter points out that trainees will also only learn about Drew’s thoughts of suicide if they probe for them, which is meant to help them get used to asking tough questions directly.

“For most trainees, Riley and Drew are the first time they ever type the words, ‘Are you thinking about suicide?’” Fichter said.

“Scarce Resources”

In addition to the general language training behind The Trevor Project’s Crisis Contact Simulator, Drew’s and Riley’s personas were built using data from transcripts of text-based role-plays previously used to train crisis counselors – not details from conversations between the people who contact The Trevor Project and its counselors.
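While the nonprofit hasn’t published its training pipeline, the general recipe it describes – adapting a pretrained language model with transcript data – is well established. A minimal sketch using the Hugging Face transformers and datasets libraries follows; the file name, base model, and hyperparameters are all assumptions for illustration:

```python
# Illustrative sketch: fine-tuning a small pretrained language model on a
# hypothetical file of role-play transcripts (one exchange per line).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

data = load_dataset("text", data_files={"train": "roleplay_transcripts.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_set = data["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="persona-model", num_train_epochs=1),
    train_dataset=train_set,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```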

Sometimes the chatbot doesn’t have an answer to a question, which can prompt the counselor in training to ask the question in a different way.

Maggi Price, an assistant professor at Boston University who studies how to improve healthcare for transgender youth, said she worries about how well a chatbot can stand in for a real person, since it is trained on simulated interactions with counselors rather than actual conversations. Still, she sees potential in using this kind of chatbot to train counselors, who are in short supply – especially counselors with the expertise to work with transgender clients.

“There is such a mental health crisis right now, and there is such a huge lack of resources in gender-affirming care, especially care for LGBTQ people,” she said. “I think overall, it looks really promising.”

Joel Lam, who works in finance for The Trevor Project and completed counselor training earlier this year with the Riley chatbot, says communicating with the automated tool felt surprisingly natural. He also said he felt a little less stress doing the role-play knowing there wasn’t actually another person on the other end of the conversation.

After several months of taking shifts on the crisis line, he said, he can confirm that the chatbot behaves like a human, in part simply because of the way it pauses before responding to a counselor’s question.

During training, he said, “I was thinking, ‘Maybe there’s a real person behind that.'”

Editor’s Note: If you or a loved one is contemplating suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 or text TALK to 741741. The International Association for Suicide Prevention and Befrienders Worldwide also provide contact information for crisis centers around the world.
