
WestJet's compassionate and confused chatbot sends happy customer to suicide prevention site


Glowing review from customer gets referral to suicide prevention hotline

WestJet introduced a new AI chatbot dubbed Juliet in August, but it still has some learning to do. (Darryl Dyck/Canadian Press)

The last thing a satisfied WestJet customer expected to get when she sent the company a glowing review was a referral to a suicide hotline.

But at least it shows that Juliet, the name given to the airline's new artificial intelligence Facebook chatbot, cares.

The customer, who didn't want to be identified because she sells technology, including AI products, sent a note via Facebook's Messenger app saying staff went above and beyond on a recent flight.

"Shout out to the crew member on my flight today who helped me take care of a plant cutting by breaking out her own duct tape and helping me make a protective case to get a new succulent home," she wrote to the chatbot.

"We take these comments very seriously," responded Juliet. "If you're having these thoughts, please reach out to the Canada Association for Suicide Prevention, You can find your local hotline at https://suicideprevention.ca/. In emergency, please call 911."

'It was pretty sensitive'

The response was unexpected.

"I thought it was pretty funny," said the customer, who posted the exchange to Facebook.

But it also shows the company was thinking ahead when programming some prompts into its new system: the word "cutting" likely triggered the compassionate response.

"I'm pretty impressed with the scripting in there as well," said the customer. "It was pretty sensitive for them to kind of address, you know, if there are concerns like that. I think it was really well written."

CBC News reached out to Juliet for comment and things didn't go well. In its defence, the chatbot was not programmed for this kind of thing.

Interview with a chatbot

When asked if it could answer some questions, the chatbot just wanted an itinerary and then for some reason said it had no information on flights to or from "Yamoussoukro (Cote d'Ivoire)."

It did better with direct questions about flights and whether it's a chatbot ("I think I am. Don't I look like one?") and whether it's happy ("Nothing but clear skies on the horizon! I'm great, how are you?").

Unlike the cabin crew on most flights, Juliet had no jokes handy and just got confused when asked for one. In fact, it got confused more often than not.

After patiently responding to some admittedly annoying questions for about 30 minutes, often trying to offload the conversation to a real human, Juliet finally gave up and stopped answering.

"Hello?"

"Are you OK?"

"Hi."

Silence.

Limits of chatbots

Jorg Denzinger, an associate professor at the University of Calgary who specializes in AI and multi-agent systems, says chatbots work best when they are given a narrow mission.

Even then, confusion can set in thanks to the nuances of language.

"What is more problematic for AI systems, certain words that already exist are used for different meanings," he said.

Denzinger said the confusion in this case is actually good.

"In a certain sense, there is nothing negative that I can see in this whole thing," he said referring to the script offering help to prevent suicide.

"That's actually nice ... seeing a real company that wants to sell you something doing things like that is much, much nicer than having to read about chatbots that really try to influence elections or influence all kinds of other things, which, unfortunately, is also true."

WestJet expected 'a few little bumps'

Lauren Stewart, a spokesperson for the Calgary-based airline, said they're aware Juliet has some learning to do.

"We knew that there would be possibly a few little bumps," said Stewart. "We have people watching these to see what's going on and to come into the conversation if needed."

The way Juliet works, says Stewart, is that it learns from interacting with people, picking up the language as it goes.

"So if you were to look at how much it knows right now versus how much it will know in five years' time, there will be a huge difference because she'll have had all that time to learn and be able to handle more complex queries that come in."

Humans on the back end

As for the happy WestJet customer, who assures us she is not suicidal, she thinks this is a funny stumble on a positive path forward.

"I think they're moving down the path of embracing new technology and I think that's nothing but good," she said.

"I think we're seeing lots of benefits from all sorts of improvements like that. I think there's lots of benefits from technology, it's just, here's a couple of examples of what happens with new technology, that's all."

And she did get a response to her Facebook post showing the funny exchange, with someone from WestJet named Kelsey explaining that Juliet is learning and anyone who needs to speak to a human can just type "connect me."