A Canadian tribunal has ruled that Air Canada must pay damages to one of its passengers over misleading advice from its customer service chatbot, advice that resulted in the passenger paying nearly double for their plane tickets.

The case centered on the experience of Jake Moffatt, who flew round-trip from Vancouver to Toronto after his grandmother died in 2022. At the time, Moffatt visited Air Canada’s website to book a flight using the company’s bereavement rates. According to tribunal documents, Moffatt specifically asked Air Canada’s support chatbot about bereavement rates and received the following reply:

“Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family,” the chatbot stated, including an underlined hyperlink to the airline’s policy. “If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”

Moffatt didn’t visit the link the chatbot provided. The linked policy page stated that, contrary to the chatbot’s advice, customers couldn’t apply for bereavement rates after they had completed their travel.

The same day he spoke to the chatbot, Moffatt called Air Canada to get more information on the possible amount of the flight discount. He claims a human customer service representative told him he would receive a discount of about 440 Canadian dollars ($326 U.S. dollars) per flight, but the representative didn’t mention that the discount couldn’t be applied retroactively. Based on the information from the chatbot and the human customer service representative, Moffatt booked his flights.

A few days later, Moffatt submitted his application for a partial refund of the 1,630 Canadian dollars ($1,210 U.S. dollars) he had paid for his flights. After going back and forth with the airline for weeks, Moffatt sent Air Canada a screenshot of the chatbot’s response in February 2023. In response, an Air Canada customer service representative told him the chatbot’s advice had been “misleading” and said they would take note of the issue so Air Canada could update the chatbot.

Moffatt’s back-and-forth with Air Canada continued and eventually ended up in the Civil Resolution Tribunal, also known as the CRT, a quasi-judicial tribunal in the British Columbia public justice system that deals with civil law disputes like small claims. Moffatt represented himself in the case, while Air Canada was represented by an employee.

In its defense, Air Canada denied all of Moffatt’s claims and said it couldn’t be held liable for information provided by its servants, agents, representatives, or chatbots—an argument that baffled Tribunal member Christopher C. Rivers. In a decision published this week, Rivers said that Air Canada’s suggestion that its chatbot was a “separate legal entity responsible for its own actions” didn’t make any sense.

“While a chatbot has an interactive component, it is still just a part of Air Canada’s website,” Rivers wrote. “It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

Rivers added that Air Canada didn’t take reasonable care to ensure its chatbot was accurate. The airline also didn’t explain why customers should have to double-check information found in one part of its website, the chatbot, against another part of its website. In the end, Rivers ordered Air Canada to pay Moffatt the refund he had spent nearly a year and a half fighting for.

All in all, the tale goes to show big companies that excuses like “my chatbot did it, not me” won’t fly in court.