
Many of us now routinely communicate with chatbots on websites. Numerous companies have incorporated them into their websites to give consumers 24/7 access to the company and to provide information as and when it is requested.

This has given rise to a recent case in Canada in which Air Canada was sued over a misleading response provided by its website chatbot.

On the day Jake Moffatt’s grandmother died, Mr Moffatt visited Air Canada’s website to book a flight from Vancouver to Toronto. Unsure of how Air Canada’s bereavement rates worked, he asked Air Canada’s chatbot to explain. The chatbot encouraged him to book a flight immediately and then request a refund within 90 days. In reality, Air Canada’s policy explicitly stated that the airline would not provide refunds for bereavement travel after the flight had been booked. Mr Moffatt attempted to follow the chatbot’s advice and requested a refund, and was shocked when the request was rejected.

The chatbot had stated: “If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our ticket refund application form”. Mr Moffatt tried for months to convince Air Canada that he was owed the refund. Air Canada argued that because the chatbot had, elsewhere in the conversation, linked to a page containing the actual bereavement policy, Mr Moffatt should have known that bereavement rates could not be requested retroactively. Instead of a refund, the best Air Canada was willing to offer was a promise to update the chatbot and a CAD$200.00 coupon for Mr Moffatt to use on a future flight.

Mr Moffatt did not consider this an appropriate response or acceptable terms of settlement, and commenced proceedings in Canada’s Civil Resolution Tribunal. According to Air Canada, Mr Moffatt should never have trusted its chatbot, and the airline should not be liable for the chatbot’s misleading information because, get this, “the chatbot is a separate legal entity that is responsible for its own actions”. This was the first time a Canadian company had tried to argue that it was not liable for information provided by its chatbot. The Tribunal member who heard the case, Christopher Rivers, decided in Mr Moffatt’s favour and called Air Canada’s defence ‘remarkable’. Mr Rivers wrote in his decision: “Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives including a chatbot. It does not explain why it believes that is the case or why the web page titled ‘bereavement travel’ was inherently more trustworthy than its chatbot.”

Mr Rivers further found that Mr Moffatt had no reason to believe that one part of Air Canada’s website would be accurate and another would not. He further stated in his reasons: “Air Canada does not explain why customers should have to double-check information found in one part of its website on another part of its website.” In the end, Mr Rivers ruled that Mr Moffatt was entitled to a partial refund of CAD$650.88 of the original fare, which was what the chatbot had advised, as well as additional damages to cover the interest on the airfare and Mr Moffatt’s tribunal fees.

As Canadian law is in many circumstances comparable to Australian law, the same risks apply to Australian companies that use chatbots on their websites. While artificial intelligence tools such as chatbots can be useful in assisting customers, they are effectively part of the company, and the company can be held liable for the information, representations or disclosures the chatbot makes.

In a society where the 24/7 mentality is growing and becoming an expectation for the provision of customer service, companies expose themselves to problems if they rely upon artificial intelligence that is either (a) not kept up to date or (b) not sophisticated and technically proficient enough to ensure it correctly answers every question posed to it.

As a further development, it is noted that, following this decision, Air Canada disabled the chatbot on its website. Air Canada’s chief information officer, Mel Crocker, advised that Air Canada had launched the chatbot as an AI experiment. It was initially used to lighten the load on Air Canada’s call centre when flights experienced unexpected delays or cancellations, and its use was expanded over time as it was seen to assist customers and decrease the need for staff. This is a salient warning to businesses seeking to use AI to limit labour costs: there are inherent risks in doing so.

As consumers, we must be wary when interfacing with AI, despite our ever-growing expectation that companies provide customer service on a 24/7 basis. Technology of this nature can make serious mistakes in its delivery of information to customers, and clearly has now done so on many occasions.

If you feel that you have been misled by a company, whether through its artificial intelligence or through information provided by its representatives or employees, Lynn and Brown Lawyers are well positioned to assist you in recovering appropriate compensation.

About the Author: This article has been authored by Steven Brown. Steven is a Perth lawyer and director with over 20 years’ experience in legal practice, practising in commercial law, dispute resolution and estate planning.
