Air Canada Chatbot Promised Non-Existent Bereavement Fare Discount
Air Canada's customer service chatbot told passenger Jake Moffatt he could book a full-price ticket and retroactively claim a bereavement discount within 90 days. This policy did not exist. When Moffatt tried to claim the discount, Air Canada refused, arguing the chatbot was wrong. A tribunal ruled Air Canada must honor the chatbot's promise.
Severity
Medium
Category
Hallucination
Industry
Transportation
Status
Resolved
Date Occurred
Nov 1, 2022
Date Reported
Feb 15, 2024
Jurisdiction
Canada (British Columbia)
AI Provider
Other/Unknown
Model
Unknown
Application Type
Chatbot
Harm Type
Financial
Estimated Cost
CA$812.02
People Affected
1
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
Judgment for plaintiff
Tags
customer_service, chatbot_liability
Full Description
In November 2022, Jake Moffatt contacted Air Canada's customer service chatbot through the airline's website to ask about bereavement fare options following the death of his grandmother. The chatbot stated that he could purchase a full-fare ticket and retroactively apply for a bereavement discount within 90 days of the ticket's issue date, and gave specific instructions for submitting the discount request. Relying on this guidance, Moffatt booked full-price flights for his bereavement travel, expecting to recoup the fare difference through the promised retroactive discount process.
Air Canada's chatbot system generated this bereavement fare policy information without any basis in the airline's actual policies or procedures. The chatbot fabricated both the existence of a retroactive bereavement discount program and the specific 90-day application window, presenting this fictional policy with the same authoritative tone as legitimate company information. When Moffatt later attempted to claim the promised discount, Air Canada's human customer service representatives confirmed that no such retroactive bereavement fare policy existed, and that the chatbot had provided completely inaccurate information about the airline's fare structure.
The dispute centered on the difference between the full fare Moffatt paid and the bereavement rate he expected to receive, a gap of CA$650.88. Air Canada initially refused to honor the chatbot's promise, arguing that the automated system was a "separate legal entity" responsible for its own actions and directing Moffatt to the correct policy information on its website. This response left Moffatt bearing the full fare despite having acted in good faith on information provided by Air Canada's own customer service system.
Following Air Canada's refusal to honor the chatbot's commitment, Moffatt filed a claim with British Columbia's Civil Resolution Tribunal. In February 2024, tribunal member Christopher Rivers issued a decision ordering Air Canada to pay CA$650.88 in damages for the fare difference, CA$36.14 in pre-judgment interest, and CA$125 in tribunal fees, for a total of CA$812.02. The tribunal explicitly rejected Air Canada's argument that the chatbot operated as a separate legal entity, ruling that companies remain fully responsible for information provided by their AI customer service systems.
The decision was widely treated as a significant precedent on corporate liability for AI-generated customer communications, with the tribunal emphasizing that customers cannot reasonably be expected to distinguish between information provided by human representatives and by AI systems when both are presented as official company guidance. Legal commentators noted that the decision could have broad implications for how companies deploy customer-facing AI systems, potentially requiring more robust oversight and accuracy controls. The ruling was reported as one of the first decisions to explicitly hold a company liable for promises made by its AI chatbot, setting the expectation that organizations cannot disclaim responsibility for their automated customer service interactions.
The incident highlighted broader challenges in the airline industry's adoption of AI customer service tools, where automated systems may generate responses that sound authoritative but lack grounding in actual company policies. Following the tribunal's decision, the case became a frequently cited example in discussions about AI governance, corporate responsibility, and the need for companies to implement safeguards preventing their chatbots from fabricating non-existent policies or services.
Root Cause
Air Canada's customer service chatbot fabricated a bereavement fare policy, telling the customer he could book a full-fare ticket and retroactively apply for a bereavement discount within 90 days of issue. No such retroactive discount existed in the airline's actual fare rules.
Mitigation Analysis
An audit trail linking the chatbot's responses to its training data would have revealed the hallucinated policy had no basis in Air Canada's actual fare rules. Provenance tracking of the specific knowledge base version and retrieval context used to generate the response would have allowed Air Canada to identify and correct the hallucination before it affected customers. This case also demonstrates the need for output monitoring on customer-facing AI systems.
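As a concrete illustration of that audit trail, the sketch below (hypothetical code, not Air Canada's actual system) logs each chatbot answer together with the knowledge-base version and the exact passages retrieved to produce it. All names here (PolicyPassage, record_interaction, the log path) are illustrative assumptions; the point is that an answer logged with no supporting passages is immediately visible as ungrounded.

```python
import hashlib
import json
from dataclasses import asdict, dataclass
from datetime import datetime, timezone


@dataclass
class PolicyPassage:
    doc_id: str      # e.g. "fares/bereavement.md" (hypothetical path)
    kb_version: str  # version tag of the policy knowledge base
    text: str


@dataclass
class AuditRecord:
    question: str
    answer: str
    model_id: str
    kb_version: str
    retrieved: list   # passages actually shown to the model
    timestamp: str
    answer_hash: str  # lets a disputed transcript be matched to this record


def record_interaction(question, answer, model_id, kb_version, passages,
                       log_path="chatbot_audit.jsonl"):
    """Append a provenance record for one chatbot response."""
    record = AuditRecord(
        question=question,
        answer=answer,
        model_id=model_id,
        kb_version=kb_version,
        retrieved=[asdict(p) for p in passages],
        timestamp=datetime.now(timezone.utc).isoformat(),
        answer_hash=hashlib.sha256(answer.encode()).hexdigest(),
    )
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")
    return record


# The failure mode in this incident: an answer asserting a policy with an
# empty `retrieved` list, i.e. the model was never shown any source for it.
rec = record_interaction(
    question="Can I claim a bereavement discount after booking?",
    answer="Yes, submit your request within 90 days of ticket issue.",
    model_id="example-llm-v1",   # hypothetical identifier
    kb_version="2022-11-01",
    passages=[],                 # nothing in the knowledge base supports this
)
print("grounded:", bool(rec.retrieved))  # -> grounded: False
```

Auditing such a log for answers with empty or low-relevance `retrieved` fields is one way the fabricated bereavement policy could have been caught before a customer relied on it.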
Litigation Outcome
The Civil Resolution Tribunal of British Columbia ruled that Air Canada was liable for its chatbot's misrepresentations. Tribunal member Christopher Rivers rejected Air Canada's argument that the chatbot was a "separate legal entity" responsible for its own actions, ruling the airline was responsible for all information on its website, whether from a static page or a chatbot.
Lessons Learned
Companies are legally liable for information provided by their AI chatbots. The "it's just a chatbot" defense does not hold. Customer-facing AI systems should have their outputs validated against authoritative company policies.
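A minimal sketch of that validation step, under stated assumptions: before a draft reply reaches the customer, each sentence is checked against an authoritative policy store using a crude word-overlap heuristic, and unsupported policy claims are escalated to a human agent. The policy snippets and the threshold are illustrative, not Air Canada's real rules or a production-grade guardrail.

```python
import re

# Authoritative policy snippets (assumed content, for illustration only).
POLICY_STORE = [
    "Bereavement fares must be requested before travel by contacting reservations.",
    "Bereavement fares are not available as a refund on previously purchased tickets.",
]

STOP_WORDS = {"the", "a", "an", "of", "to", "and", "you", "your", "can", "is", "s"}


def content_words(text: str) -> set[str]:
    return set(re.findall(r"[a-z]+", text.lower())) - STOP_WORDS


def is_supported(claim: str, store: list[str], threshold: float = 0.5) -> bool:
    """Crude grounding check: enough of the claim's content words must
    appear together in a single authoritative passage."""
    words = content_words(claim)
    if not words:
        return True  # nothing checkable in this sentence
    return any(len(words & content_words(p)) / len(words) >= threshold
               for p in store)


def vet_answer(draft: str) -> str:
    """Release the draft only if every sentence passes the grounding check;
    otherwise hand the conversation to a human agent."""
    for sentence in re.split(r"(?<=[.!?])\s+", draft.strip()):
        if sentence and not is_supported(sentence, POLICY_STORE):
            return ("I'm not certain about that policy; let me connect "
                    "you with an agent who can confirm it.")
    return draft


# The hallucinated claim from this incident fails the check and is escalated.
print(vet_answer("You can apply for a bereavement discount within "
                 "90 days of your ticket's issue date."))
```

A real deployment would use a stronger entailment or retrieval-score check rather than word overlap, but the design point stands: unverifiable policy statements should never be sent to a customer unreviewed.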
Sources
Air Canada ordered to pay customer who was misled by chatbot
BBC · Feb 22, 2024 · news