Airline’s AI Goes Awry

Air Canada Ordered to Pay Customer $812.02 for Bereavement Refund Policy Made Up by Its Own Website’s Chatbot

In an age when companies are pushing automated customer service options, a recent decision from a Canadian tribunal may serve as a cautionary tale about the legal ramifications of artificial intelligence.

On November 11, 2022, Air Canada customer Jake Moffatt’s grandmother passed away in Ontario, Canada. That same day, he visited Air Canada’s website to find and book a flight from Vancouver to Toronto using Air Canada’s bereavement rates. Looking for insight into the airline’s bereavement policy, he turned to the website’s chatbot. According to the customer, he asked the chatbot about the bereavement rates offered by the airline, and it responded:

If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.

The only problem? This is not actually the airline’s bereavement policy. In reality, Air Canada’s policy explicitly states that bereavement rates cannot be requested retroactively.

Relying on the information he obtained from the chat, and unaware that it was wrong, the customer proceeded to book his flights. According to the filings, he also called Air Canada to determine what the refund rate would be, though the record is silent on whether that call addressed whether he could recover a bereavement rate under his circumstances.

When he tried to recover the bereavement rate the chatbot had described, his refund request was rejected. An Air Canada representative admitted the chatbot had provided “misleading words,” but pointed to a hyperlink in the chat that led to the airline’s actual bereavement policy. As a consolation, Air Canada said it would update the chatbot to reflect the correct policy and offered him a $200 coupon.

The customer refused the $200 coupon and instead filed a claim against the airline with British Columbia’s Civil Resolution Tribunal (CRT), which has jurisdiction over small claims under section 118 of the Civil Resolution Tribunal Act (CRTA). CRTA section 2 states that the CRT’s mandate is to provide dispute resolution services accessibly, quickly, economically, informally, and flexibly. In resolving disputes, the CRT must apply principles of law and fairness.

The main issue the Tribunal sought to answer was “Did Air Canada negligently misrepresent the procedure for claiming bereavement fares, and if so, what is the remedy?”

Air Canada argued that it cannot be held liable for information provided by one of its agents, servants, or representatives, including a chatbot. According to the Tribunal’s formal written decision, however, the airline did not explain why it believed it could not be held liable for the information provided by its chatbot. Air Canada also argued that the correct bereavement policy could be found elsewhere on its website.

The Tribunal reasoned that while a chatbot has an interactive component, it is still just one part of the overall website. It should therefore be “obvious” to Air Canada that it is responsible for all the information on its website, regardless of whether that information comes from a static page or a chatbot.

In all, the Tribunal found Air Canada did not take reasonable care to ensure its chatbot’s information was accurate. As for the argument that the correct information was available elsewhere on the website, Air Canada could not explain why the webpage with the correct bereavement information was inherently more trustworthy than its chatbot. The Tribunal also reasoned that customers should not have to double-check information found in one part of a website against another part of the same website.

Based on the evidence presented by the parties, the Tribunal ordered Air Canada to pay the customer $812.02, reflecting the retroactive bereavement discount plus interest and tribunal fees.

The key takeaway? Companies can be held liable for what their artificial intelligence says and does. The case is already being called a landmark decision, one that may set the tone for customer recourse as airlines (and most other industries) increasingly use artificial intelligence and chatbots to answer customer service questions. The question of how to ensure an AI’s information is accurate remains open.

The Order can be found online at the Canadian Legal Information Institute (CanLII), indexed as Moffatt v. Air Canada, 2024 BCCRT 149.

This is not the first time artificial intelligence has caused issues for professional industries. Stay tuned for more on when the use of artificial intelligence goes awry in the legal profession. 

Xoxo, Tessquire