this post was submitted on 15 Feb 2024
268 points (98.6% liked)

Jake Moffatt was booking a flight to Toronto and asked the bot about the airline's bereavement rates – reduced fares provided in the event someone needs to travel due to the death of an immediate family member.

Moffatt said he was told that these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and submitted a screenshot of his conversation with the bot as evidence supporting this claim.

The airline refused the refund because it said its policy was that bereavement fares could not, in fact, be claimed retroactively.

Air Canada argued that it could not be held liable for information provided by the bot.

top 43 comments
[–] some_guy@lemmy.sdf.org 112 points 9 months ago (2 children)

It’s your fucking system. You’re liable for what it says.

I got a bereavement fare when my father died the night before my flight to see him. The phone agent did this without my asking for it. Humans good, chatbots bad.

[–] technohacker@programming.dev 24 points 9 months ago

Oh man, moments like this when my faith in humanity is restored. I am sorry for your loss

[–] Nouveau_Burnswick@lemmy.world 5 points 9 months ago

I got offered a bereavement fare that was higher than the fare posted on Google Flights, and limited to Mon-Thurs flights.

[–] Godort@lemm.ee 77 points 9 months ago (3 children)

Air Canada probably spent more trying to fight this claim than it would have cost to just issue the refund when the chatbot logs were sent in.

[–] Evkob@lemmy.ca 64 points 9 months ago (4 children)

I wonder how anyone in their right mind would propose the defense "we can't be held liable for what the chatbot we purposefully put on our website said". Did Air Canada's lawyers truly think this would fly?

If you don't want to be held to AI hallucinations, don't put an AI chatbot on your website, seems easy enough.

[–] Monument@lemmy.sdf.org 18 points 9 months ago

My organization won’t even allow auto-translation widgets on our site. Instead, we refer people to web translation services they can use on their own, with clear language that says we’re not liable for third-party mistranslations. (That notice is in multiple languages, translated by a company that has signed an indemnity agreement with us in case their translation becomes an issue.)

It’s a bit heavy-handed, but the lawyers hold more sway than the communications folks, and I don’t disagree with the approach – you don’t want users misunderstanding what your site says, and being able to blame you for it.

[–] Drusas@kbin.social 16 points 9 months ago

Probably not, but they're paid to try their best.

[–] Quexotic@infosec.pub 8 points 9 months ago

Lol... "Think this would fly" I see what you did there.

[–] Truck_kun@beehaw.org 6 points 9 months ago (1 children)

This article doesn't actually say the chatbot was AI. Chatbots have been around for many years, so it's possible this was just an ordinary non-'AI' chatbot that someone had programmed with that information (potentially old information that had long since changed but was never updated).

Either way, they are liable for what it tells customers. If it is AI, well... no company should be using AI to make legally binding statements or advertisements to customers without human review.

At the moment, companies deploying AI should be doing so with AI as the product, not integrated into selling non-AI products or services.

[–] Evkob@lemmy.ca 6 points 9 months ago

You know what, you're completely right. Thanks for pointing that out, my brain just auto-completed that detail because of how prevalent "AI" is in the news these days.

Honestly though, if it's a more traditional chatbot that they had to program themselves, it's all the more embarrassing for Air Canada that they were trying to weasel themselves out of this.

[–] Drusas@kbin.social 13 points 9 months ago

I am completely certain that's the case. For them, this is more about precedent.

[–] Theharpyeagle@lemmy.world 9 points 9 months ago

Surely they're scared of more people realizing that saving these chats is important. How else will they get away with scummy practices?

[–] Zellith@kbin.social 62 points 9 months ago* (last edited 9 months ago) (2 children)

Air Canada argued that it could not be held liable for information provided by the bot

Lol. Of course they'd say that. Perhaps hire people? Or would they also argue they couldn't be held liable for their mistakes and misinformation?

[–] SlopppyEngineer@lemmy.world 16 points 9 months ago (3 children)

If they get a precedent that they're not responsible for what the AI chatbot says, then that applies to any chatbot on any site, and they all become worthless. Every chatbot gets a disclaimer basically saying "this thing is a dirty liar and nothing it says matters." People will start calling human customer service to confirm what the chatbot said, and the savings in employee costs are gone.

Seems a bad long term strategy.

[–] psvrh@lemmy.ca 10 points 9 months ago

Seems a bad long term strategy.

It's not a long term strategy. The person who made this decision is thinking about their quarterly or yearly bonus. By the time the problems hit, they've long since cashed out.

[–] meat_popsicle@sh.itjust.works 7 points 9 months ago

CS will be a multi-modal chatbot too, just with a voice. I don’t think they want any human support at all. To a business, the only reason overhead exists is to cut it, and support has always been overhead.

[–] Kichae@lemmy.ca 6 points 9 months ago

Way, way fewer people will call CS than will just ignore the warning.

Once we become acclimated to things like this, we stop complaining, and let the greedy fuckers win.

[–] xmunk@sh.itjust.works 16 points 9 months ago

They're trying to cheap out on real human support personnel - Chat bots are clearly not a suitable replacement.

Fuck'em.

[–] Skullgrid@lemmy.world 49 points 9 months ago (1 children)

Air Canada argued that it could not be held liable for information provided by the bot.

the (probably legally required) system we set up just straight up lied, not our fault.

[–] Drusas@kbin.social 16 points 9 months ago (2 children)

Are chatbots really legally required?

[–] Skullgrid@lemmy.world 8 points 9 months ago

I am assuming the customer should legally have a way to contact a company.

Companies try to make this obligation cost less and less by using automation and self service.

Source : worked on the customer service platform for a fortune 500 company.

[–] Evkob@lemmy.ca 8 points 9 months ago (1 children)

Yeah I'm also confused as to what they mean by this.

[–] Skullgrid@lemmy.world 6 points 9 months ago

https://lemmy.world/comment/7546839

I am assuming the customer should legally have a way to contact a company.

Companies try to make this obligation cost less and less by using automation and self service.

Source : worked on the customer service platform for a fortune 500 company.

[–] SamuelRJankis@lemmy.world 43 points 9 months ago (2 children)

It's amazing that a 7 billion dollar company goes to court to fight someone over $800, quite aside from obviously being in the wrong.

...awarding $650.88 in damages for negligent misrepresentation.

$36.14 in pre-judgment interest and $125 in fees

[–] nova_ad_vitum@lemmy.ca 40 points 9 months ago* (last edited 9 months ago) (1 children)

They're not fighting for the $800. They're fighting for the right to continue to use their shitty chatbot to reduce their support staff costs while not being liable for any bullshit it tells people.

There will be cases like this in every jurisdiction.

[–] CanadianCorhen@lemmy.ca 8 points 9 months ago (1 children)

Exactly.

If the court had found any other way, then any time the chatbot made a mistake, they could just wash their hands of it and let the consumer take the hit.

This means they are responsible for what the chatbot says, which is at least moderately sane.

[–] nova_ad_vitum@lemmy.ca 4 points 9 months ago

If the court had found any other way, then any time the chatbot made a mistake, they could just wash their hands of it and let the consumer take the hit.

It would have been just a matter of time before the chatbot started making "mistakes" that financially benefited the company more and more.

This means they are responsible for what the chatbot says, which is at least moderately sane.

Does this decision carry any precedent? It was a tribunal, not a court.

[–] ahal@lemmy.ca 16 points 9 months ago

Nothing to do with the money and everything to do with the precedent. Glad it didn't work out for them.

[–] umbrella@lemmy.ml 35 points 9 months ago (1 children)
[–] jasep@lemmy.world 19 points 9 months ago* (last edited 9 months ago) (2 children)

Chatbot = basically free

Yearly employee = probably about $100k incl benefits

Should they hire a person? Absolutely. Will they if they can get away with it? Aww hell naw.

[–] umbrella@lemmy.ml 7 points 9 months ago (1 children)

Yeah, I want to see who will buy the garbage they are selling when no one has money.

[–] Drusas@kbin.social 5 points 9 months ago

Doesn't matter to their shareholders. They'll have already made their money.

[–] RootBeerGuy@discuss.tchncs.de 7 points 9 months ago (3 children)

You are pretty optimistic they would pay $100k for this job. It's probably far less, which makes this even worse: they are not really saving that much money.

[–] xmunk@sh.itjust.works 4 points 9 months ago

It costs your employer roughly 30% more to employ you than what you earn, so hiring someone at $75k will usually cost a company somewhere around $100k.

[–] Deceptichum@kbin.social 3 points 9 months ago* (last edited 9 months ago) (1 children)

It’s probably half that, but a chatbot can serve thousands of users whereas an employee can manage a few at a time.

[–] Mossheart@lemmy.ca 5 points 9 months ago

Confirmed. As someone who has led customer operations at large companies, the scale at which chatbots can address a userbase is absurd. Companies are more than willing to take the hit to their reputation and customer goodwill in exchange for not needing to hire as much staff, train them, manage their schedules, or deal with benefits and performance reviews. Cutting all that cost is an instaboner to execs and a nightmare to support managers who actually care about quality.

The number of $700 judgements that Air Canada would need to be hit with to make replacing humans with chatbots a losing proposition is too high. It'll never happen.

Sadly, in my decade of experience, I've yet to see any bots able to reliably handle much beyond 'where's my order?'.

[–] jasep@lemmy.world 3 points 9 months ago (1 children)

I didn't say they would pay $100k. I said it would cost probably $100k including benefits – a full-time employee's cost isn't just their salary or hourly wage; there's a lot of overhead to employing a person at a large company. Also keep in mind this is for a Canadian, so don't be thinking in USD. In CAD in the Toronto area (for example), it isn't unreasonable to think even a first-line, phone-based customer service agent's salary would be $65-70k, with the employment costs on top of that.

[–] baggins@lemmy.ca 2 points 9 months ago (1 children)

LOL customer service agents do NOT get paid $65-70k

[–] jasep@lemmy.world 0 points 9 months ago

Well, you're just wrong about that. I worked for a company that employed front-line customer service in Toronto and paid $80k... in 2017! I'm not saying Air Canada does (I have no idea what AC wages are), but if their agents are part of a union, it's definitely possible.

But my point stands - if anyone thinks these companies are going to 'do the right thing' and hire real people when these AI chatbots exist and are so cheap, it's just not going to happen.

[–] Yerbouti@lemmy.ml 21 points 9 months ago (2 children)

Worst airline ever. You realize that once you try any other.

[–] ikidd@lemmy.world 5 points 9 months ago

And they'll cancel a flight on the slightest pretext. I've gone back to the hotel and waited three days while watching from the room window as flight after flight from other carriers took off.

[–] Templa@beehaw.org 1 points 9 months ago

I'm curious to know more about your experiences. At home we try to only travel with Air Canada and have never had any issues. I've had so many terrible experiences with American, United, and the like that if I can choose Air Canada, I'll always do it.

[–] Templa@beehaw.org 1 points 9 months ago

I love that now the precedent exists and companies will think twice before adding chatbots like these.