this post was submitted on 15 Feb 2024
268 points (98.6% liked)

Canada

Jake Moffatt was booking a flight to Toronto and asked the bot about the airline's bereavement rates – reduced fares provided in the event someone needs to travel due to the death of an immediate family member.

Moffatt said he was told that these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and submitted a screenshot of his conversation with the bot as evidence supporting this claim.

The airline refused the refund, saying its policy was that bereavement fares could not, in fact, be claimed retroactively.

Air Canada argued that it could not be held liable for information provided by the bot.

[–] Zellith@kbin.social 62 points 9 months ago* (last edited 9 months ago) (2 children)

Air Canada argued that it could not be held liable for information provided by the bot

Lol. Of course they'd say that. Perhaps hire people? Or would they also argue they couldn't be held liable for their mistakes and misinformation?

[–] SlopppyEngineer@lemmy.world 16 points 9 months ago (3 children)

If they set the precedent that they're not responsible for what the AI chat bot says, then this goes for any chat bot on any site and they all become worthless. Any chat bot gets a disclaimer basically saying "this thing is a dirty liar and nothing it says matters." People will start to call human customer service to confirm what the chat bot said, and the savings in employee costs are gone.

Seems a bad long-term strategy.

[–] psvrh@lemmy.ca 10 points 9 months ago

Seems a bad long-term strategy.

It's not a long-term strategy. The person who made this decision is thinking about their quarterly or yearly bonus. By the time the problems hit, they've long since cashed out.

[–] meat_popsicle@sh.itjust.works 7 points 9 months ago

CS will be a multimodal chatbot too, just with a voice. I don't think they want any human support at all. To a business, the only reason overhead exists is to cut it, and support has always been overhead.

[–] Kichae@lemmy.ca 6 points 9 months ago

Way, way fewer people will call CS than will just ignore the warning.

Once we become acclimated to things like this, we stop complaining and let the greedy fuckers win.

[–] xmunk@sh.itjust.works 16 points 9 months ago

They're trying to cheap out on real human support personnel; chat bots are clearly not a suitable replacement.

Fuck'em.