
Air Canada has been ordered to compensate a passenger who was misinformed by the customer assistance chatbot on their website, after initially denying responsibility.
A grieving Air Canada customer was advised by a chatbot that he could buy a full-price ticket and submit a request within 90 days for a partial refund under the airline's bereavement policy. However, the airline claimed the chatbot on its own website was mistaken, and that it had no obligation or responsibility to honour the chatbot's claims.
However, Air Canada failed to persuade the Civil Resolution Tribunal and has been ordered to reimburse the customer.
That customer was Jake Moffatt, whose grandmother passed away on Remembrance Day (November 11) 2022. That same day Moffatt visited Air Canada's website to purchase return airfare tickets to Toronto to support his grieving family. Like nearly any website today, he was greeted by a support chatbot. Asking about bereavement rates, the chatbot explicitly stated, "If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form."
With this assurance, Moffatt purchased a full-fare ticket for $1,630.36. However, on his return, when he attempted to claim a partial refund based on the bereavement rate, Air Canada refused. The airline stated it did not, in fact, offer any such partial refund on travel that had already been completed, and pointed Moffatt to a policy stating this elsewhere on its website.
Moffatt referred Air Canada to the chatbot transcript. A representative from the airline conceded the chatbot had uttered "misleading words" but still denied any obligation to refund Moffatt, on the basis it had a stated policy in a different section of its website. The representative did, however, say they would use Moffatt's feedback to better train the chatbot.
Moffatt was unimpressed and launched legal action against Air Canada in the Canadian Civil Resolution Tribunal (CRT). Air Canada argued in its defence that it cannot be held liable for the information provided by its agents, servants, or representatives, which includes chatbots as well as human operators. Further, Air Canada argued the chatbot was a separate legal entity, which is responsible for its own actions.
If that argument makes you do a double-take, you're not alone. CRT member Christopher Rivers stated, "this is a remarkable submission."
Air Canada additionally argued the correct policy details were available on its website, albeit in a different section to the one the chatbot referenced.
The CRT released its decision this month, and ordered Air Canada to refund Moffatt $812, covering the difference between the full fare he paid and the reduced bereavement rate, along with interest and CRT fees. The Tribunal was unimpressed by Air Canada's case, and found the airline provided no credible support for its assertion that it cannot be liable for the information provided in its name.
"While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot," Rivers said.
And with respect to the policy document located elsewhere, Rivers stated, "there is no reason why Mr Moffatt should know that one section of Air Canada's webpage is accurate, and another is not."
It's a sombre lesson for organisations of all sizes around the world; with artificial intelligence the hot topic of 2024, it's imperative to embrace responsible AI. Otherwise, if your chatbot is prone to hallucinations, you might find yourself in the same plight as Air Canada.
"I find Air Canada did not take reasonable care to ensure its chatbot was accurate," Rivers said.
The matter falls under "negligent misrepresentation", Rivers said, which he clarified by explaining, "Negligent misrepresentation can arise when a seller does not exercise reasonable care to ensure its representations are accurate and not misleading."
"I find Mr. Moffatt has made out their claim of negligent misrepresentation and is entitled to damages," Rivers said in his decision.