Air Canada Must Honor Refund Policy Invented by Airline's Chatbot

by
hubie
from SoylentNews on (#6JQ72)

DannyB writes:

Air Canada must honor refund policy invented by airline's chatbot

Air Canada appears to have quietly killed its costly chatbot support.

After months of resisting, Air Canada was forced to give a partial refund to a grieving passenger who was misled by an airline chatbot inaccurately explaining the airline's bereavement travel policy.

[....] Air Canada argued that because the chatbot response elsewhere linked to a page with the actual bereavement travel policy, Moffatt should have known bereavement rates could not be requested retroactively. Instead of a refund, the best Air Canada would do was to promise to update the chatbot and offer Moffatt a $200 coupon to use on a future flight.

[....] According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot's misleading information because Air Canada essentially argued that "the chatbot is a separate legal entity that is responsible for its own actions," a court order said.

Experts told the Vancouver Sun that Moffatt's case appeared to be the first time a Canadian company tried to argue that it wasn't liable for information provided by its chatbot.

From the linked court order:

Negligent Misrepresentation

24. While Mr. Moffatt does not use the words specifically, by saying they relied on Air Canada's chatbot, I find they are alleging negligent misrepresentation. Negligent misrepresentation can arise when a seller does not exercise reasonable care to ensure its representations are accurate and not misleading.

25. To prove the tort of negligent misrepresentation, Mr. Moffatt must show that Air Canada owed them a duty of care, its representation was untrue, inaccurate, or misleading, Air Canada made the representation negligently, Mr. Moffatt reasonably relied on it, and Mr. Moffatt's reliance resulted in damages.

Should a company be held to the hallucinations of its AI chatbot? This could slow the adoption of AI chatbots that are not ready for actual public interaction. It could deny executives the ability to save money this quarter.

Original Submission

Read more of this story at SoylentNews.
