Odai Khasawneh, PhD’s Post

Associate Teaching Professor at The Sheldon B. Lubar College of Business at the University of Wisconsin-Milwaukee

When Technology Makes a Mistake

Nowadays, it is nearly impossible to find anyone who has not heard of AI: a technology so advanced that it can sift through millions of data records in seconds and answer questions that were once humanly impossible to answer. Many companies are riding the wave, implementing this new technology to boost their productivity and efficiency, catch up with their competitors, and gain that elusive market advantage.

With all the advantages this technology brings, it also comes with shortcomings that companies and individuals should consider. One of these is the lack of clear policies on liability and accountability when the technology produces what is known as an AI hallucination (an incorrect prediction, false positive, or false negative).

An example of an AI hallucination is what happened to Jake Moffatt in 2022. Jake booked a flight from Vancouver to Toronto to attend his grandmother's funeral. Before booking, he checked Air Canada's policy on bereavement fares (some airlines offer a discount to people who need to travel because of the death of a family member). The chatbot on Air Canada's website suggested that Jake could apply for the discount retroactively. When he applied, however, Air Canada rejected his claim because the chatbot's suggestion was wrong. Before the Civil Resolution Tribunal, Air Canada argued that it "cannot be held liable for information provided by one of its agents, servants, or representatives - including a chatbot," suggesting that the chatbot, not the airline, should be held responsible. Thankfully, the Civil Resolution Tribunal rejected Air Canada's argument and ordered the airline to pay Jake $812.02 ($650.88 in damages and $161.14 in interest and fees).

There is no denying the value that new technology brings. But new technologies also come with many unknowns.
A Pew Research Center study shows that the share of Americans more excited than concerned about AI fell from 18% in 2021 to 10% in 2023, while the share more concerned than excited rose from 37% to 52% over the same period. Unfortunately, examples of these errors happen every day. In Jake's case, the issue was fairly clear-cut and the monetary cost was just over $800. But what if the 'hallucination' had been a bad investment suggestion for your life savings, or a health recommendation? Who will be held responsible when AI gets things wrong? The cost and impact of some of these errors can be far higher.

Roy Casagranda, PhD

Author of The Blood Throne of Caria

3mo

To me the sad part in your story is that Canada has such an amazing process for resolving transactional injustice. I can't imagine living in a place with law and accountability. What must that be like?

Hamza Benamar

Chief Financial Officer @ Private Equity SaaS

3mo

At times Air Canada itself feels like a hallucination.

Al Bellamy

Professor at Eastern Michigan University

3mo

Very informative

Gary Engen

Services Manager | Area Manager | Multi-Unit Turnaround | High Performance Team Builder and Manager

3mo

Insightful!
