this post was submitted on 17 Feb 2024
234 points (97.6% liked)


Air Canada must honor refund policy invented by airline’s chatbot
Air Canada appears to have quietly killed its costly chatbot support.

[–] MoonManKipper@lemmy.world 19 points 7 months ago (3 children)
[–] SorteKanin@feddit.dk 6 points 7 months ago

Ironically, an LLM would probably summarize it better.

[–] PapaStevesy@midwest.social 4 points 7 months ago* (last edited 7 months ago)

It was at 0 votes and I felt bad for the bot, upvoted it, then started reading it. I quickly changed it to a downvote. Bad bot.

[–] SatanicNotMessianic@lemmy.ml 3 points 7 months ago (1 children)

The summarizer could do better by just copying over the entire text of the article. This was incoherent. Its only utility is for people who can’t or don’t click through.

You know how they say an infinite number of monkeys with an infinite amount of time could produce the works of Shakespeare?

This is five monkeys in fifteen minutes.

[–] Womble@lemmy.world 3 points 7 months ago (1 children)

The irony is that this is one of the things LLMs are actually good at. You could run a small, local, CPU-only model and get a far better summary than this bot produces.

[–] Speculater@lemmy.world 1 points 7 months ago

My local model gave the summary below. Granted, I didn't shape the prompt well, but at least we learn why he went to court:

After months of resistance, Air Canada was forced to partially refund a grieving passenger named Jake Moffatt who was misled by the airline's chatbot regarding their bereavement travel policy. The chatbot incorrectly stated that Moffatt could request a refund within 90 days after booking his flight to attend his grandmother's funeral. In reality, Air Canada's policy explicitly stated that refunds would not be granted for such travel once the ticket was purchased. Despite trying for months to convince the airline of their mistake, Moffatt filed a small claims complaint in Canada's Civil Resolution Tribunal. The tribunal ruled in favor of Moffatt, ordering Air Canada to pay him $650.88 CAD (about $482 USD) and additional damages for interest on the fare and tribunal fees. As of Friday, there appeared to be no chatbot support available on Air Canada's website, suggesting that the airline has disabled the chatbot following this incident.