this post was submitted on 28 Aug 2023
388 points (97.8% liked)


Tesla braces for its first trial involving Autopilot fatality: Tesla Inc is set to defend itself for the first time at trial against allegations that the failure of its Autopilot driver-assistance feature led to a death, in what will likely be a major test of Chief Executive Elon Musk's assertions about the technology.

[–] RecallMadness@lemmy.nz 6 points 1 year ago* (last edited 1 year ago) (2 children)

And when Autopilot is at fault for an accident or fatality, who should be held responsible?

Just because it’s better shouldn’t absolve them of responsibility when it fails.

[–] severien@lemmy.world 4 points 1 year ago

It's an interesting question. But I would be disappointed if self-driving were basically killed off by the legal questions, since it has huge potential to save lives.

[–] excel@lemmy.megumin.org 2 points 1 year ago (2 children)

The driver is always responsible for using the tools within the car correctly and maintaining control of the vehicle at all times.

Either way the driver would be at fault. However, the driver might be able to make a (completely separate) case that the car’s defects made control impossible, but since the driver always had the option to disable self-driving, I doubt that would go anywhere.

Just like you don’t get off the hook if your cruise control causes an accident. And it doesn’t matter how much Tesla lied about what the system may or may not be capable of, because at the end of the day it’s always the driver’s responsibility to know the limitations of the vehicle, disable the feature, and take control when necessary.

[–] RecallMadness@lemmy.nz 6 points 1 year ago* (last edited 1 year ago) (1 children)

Which is exactly what this case is claiming, that the software is defective.

And what happens when we progress beyond Level 2 or 3 automation? Then the car is making choices for the driver, choices the driver may not have any say in or realistically be capable of reacting to in an emergency?

Deferring responsibility to the driver under any scenario is a cop-out. We have a long history of engineering qualifications and regulations to ensure the safety of the populace: engineers and architects design structures to be safe, plumbers have to plumb to code, heck, even cars themselves have a mile-long list of compliance requirements. All to ensure the things that companies build aren’t killing people, and when they do, someone is responsible.

Yet as soon as we start talking about software, it’s “not my problem, dawg.”

[–] tony@lemmy.hoyle.me.uk 2 points 1 year ago (1 children)

This is a guy who was using a glorified cruise control (which is all AP is) at high speed whilst watching a DVD instead of looking at the road.

The software can only help so much. There's a reason there are laws requiring attentiveness checks now: people are reckless.

[–] Honytawk@lemmy.zip 1 points 1 year ago

People are only reckless because they believe Tesla's false marketing claims.

The car doesn't "just drive itself"; it isn't even close to "just driving itself". The advertising claiming it does is far more at fault than the driver watching a movie.

[–] theneverfox@pawb.social 1 points 1 year ago

So you're correct to call it a tool, and with this level of automation the driver is ultimately the operator. But you're missing something:

Did you misuse the tool, did they sell you a bad tool, or did their instructions cause the tool to be misused?

The first is as you said - if I make and sell you a circular saw and you cut your finger off being an idiot, that's on you.

If the thing flew apart under normal use, that's on me - it's likely my responsibility, and possibly negligence.

If the box or user manual said it is for wood and metal use, and it's actually entirely unsafe for metal use, that's probably negligence on my part.

Cruise control doesn't unexpectedly jerk your wheel to the side. If it did, and you could prove you were using it reasonably and in the recommended way, you'd almost certainly get off the hook.