“I think about suicide constantly these days,” Drew said.
The counselor reassured Drew – thanking him for reaching out to talk, telling him he was not alone – and asked for details about how Drew planned to kill himself.
“Are you going to commit suicide today?” the counselor asked.
It’s a difficult conversation to read, even knowing that Drew is not a real person but an artificially intelligent chatbot created by The Trevor Project, a suicide prevention and crisis intervention group for LGBTQ youth.
The full training for new counselors who will respond to calls and chats takes months, and role-play is a big part of it. The nonprofit hopes that, with the help of capable chatbots like Drew, it can train more counselors more quickly through immersive role-play sessions.
“You can watch a lot of training videos and you can read all the handbooks, and you have a sense of how it’s going to go. But actually doing it, and feeling what it feels like to be in one of these conversations, is different, even if it’s simulated,” said Dan Fichter, head of AI and engineering for The Trevor Project.
Drew and Riley
Drew is the second chatbot the group has launched this year – part of what The Trevor Project calls its “Crisis Contact Simulator” – and it tackles more complex problems than its predecessor. The first chatbot, named Riley, portrayed a depressed North Carolina teenager dealing with issues related to coming out as genderqueer; Riley was created with help and $2.7 million in funding from Google’s charitable arm, Google.org. (Drew was developed internally by The Trevor Project.)
The Trevor Project says it began using Drew alongside Riley over the past few months and has trained more than 1,000 digital volunteer counselors with the chatbots to date. It has 1,205 digital counselors in all.
In November, The Trevor Project gave CNN Business a look at how the training works via a demo video of a conversation between a trained counselor and the Drew chatbot. The conversation proceeds slowly, with the counselor gradually asking more personal questions about Drew’s age, location, and so on, in hopes of building trust with Drew, assessing his suicide risk over time, and finding ways to help him. At one point, the counselor empathized with how difficult it must be to be harassed at work and asked Drew about his relationship with his boss.
“She told me to just ignore it and be an adult but she didn’t understand how scary that was to me,” Drew replied.
Drew’s frequent pauses, which seem to vary in length, add to the tension of the conversation. Kendra Gaunt, The Trevor Project’s data and AI product manager and the trained counselor who recorded the demo, says that after Riley launched, varied pauses between responses were added to better simulate how a person contacting The Trevor Project might be switching between devices or tasks.
At the end of the conversation, a Trevor Project instructor reviews the transcript and gives the trainee feedback. Trainees also participate in a number of role-play sessions led by Trevor Project instructors.
“While this isn’t a real conversation with a living young person, these do reflect why people sought out Trevor’s support in the first place,” Gaunt said.
“sorry idk :/”
“sorry, idk :/” Drew typed in reply.
The Trevor Project is trying to use this weakness to its advantage: a response like this is, in a sense, a good thing for trainees to practice against, because it pushes them to find another way to phrase their question and elicit a better response.
Additionally, Fichter says, “Part of the experience of helping Drew involves a new counselor learning to sit with the discomfort of not being able to work through everyone’s problems in a conversation.”
Fichter points out that trainees will only learn about Drew’s thoughts of suicide if they probe, which is meant to help them get used to asking tough questions directly.
“For most trainees, Riley and Drew are the first time they ever type the words, ‘Are you thinking about suicide?’” Fichter said.
Beyond the general language training behind The Trevor Project’s Crisis Contact Simulator, Drew’s and Riley’s personas were built using data from transcripts of role-play conversations previously used to train crisis counselors, not details from conversations between people contacting The Trevor Project and its counselors.
Maggi Price, an assistant professor at Boston College who studies how to improve healthcare services for transgender youth, said she worries about how well the chatbot can represent a real person, since it was trained on simulated interactions with counselors rather than actual conversations. Still, she sees potential in using this kind of chatbot to train counselors, who are in short supply, especially those with the expertise to work with transgender clients.
“There is such a mental health crisis right now, and there is a huge lack of resources in gender-affirming care, especially care for LGBTQ youth,” she said. “I think overall, it looks really promising.”
Joel Lam, who works in finance at The Trevor Project and completed counselor training earlier this year with the Riley chatbot, said it felt surprisingly natural to communicate with an automated tool. He also said he felt a little less stressed doing the role-play knowing there wasn’t actually another person on the other end of the conversation.
After several monthly shifts answering the crisis line since then, he said he can confirm that the chatbot behaves like a human, in part simply because of the way it paused before responding to the counselor.
During training, he said, “I was thinking, ‘Maybe there’s a real person behind that.'”