FANews

Company is liable for chatbot miscommunications, holds Canadian Tribunal

28 February 2024 Tristan Marot, Norton Rose Fulbright

The Canadian case of Moffatt v. Air Canada concerns a dispute over a refund for a bereavement fare. Jake Moffatt, the applicant, booked flights with Air Canada following the death of his grandmother. He was informed by an AI-powered chatbot on Air Canada’s website that he could apply for bereavement fares retroactively. However, when he attempted to do so, a human employee of Air Canada informed him that retroactive applications for bereavement fares were not permitted. Moffatt sought a partial refund of $880, representing the difference between the regular fare and the alleged bereavement fare.

Air Canada contended that Moffatt did not follow the correct procedure for requesting bereavement fares and therefore could not claim them retroactively, relying on certain terms of its Domestic Tariff to support its case for dismissing Moffatt’s claim. Air Canada further contended that it could not be held liable for the information provided by the chatbot, implying that the chatbot, as an automated system, operated independently of Air Canada’s direct control.

On Air Canada’s argument that the chatbot operated independently of its direct control, the Tribunal found the submission remarkable and rejected the notion that a chatbot could be considered a separate legal entity. The Tribunal’s decision highlighted that a chatbot is merely a part of Air Canada’s website, designed to interact with users based on programmed responses. It emphasised that Air Canada is responsible for all information on its website, regardless of whether it is delivered through a static webpage or an interactive chatbot, and underscored that it is incumbent upon Air Canada to ensure the accuracy and reliability of all representations made on its website, including those made by automated systems such as chatbots.

The Tribunal largely allowed Moffatt’s claim, finding that Air Canada, through its chatbot, negligently misrepresented the procedure for claiming bereavement fares. It determined that Air Canada owed Moffatt a duty of care to ensure the accuracy of its representations, that Moffatt reasonably relied on the chatbot’s advice, and that this reliance resulted in damages.

The Tribunal ordered Air Canada to pay Moffatt $650.88 in damages for the difference between the bereavement fares he was led to believe he would pay and the actual cost of the flights. Additionally, Air Canada was ordered to pay $36.14 in pre-judgment interest and reimburse $125 in CRT fees, totalling $812.02 to be paid to Moffatt within 14 days. Notably, the chatbot no longer appears on Air Canada’s website.

Tribunal decisions do not set precedent, although they may be persuasive in future decisions. To find out more about the Civil Resolution Tribunal, consider this article.

While a similar issue has yet to be judicially considered in South Africa, a South African court would likely come to a similar determination.

First published by: Financial Institutions Legal Snapshot
