this post was submitted on 14 Aug 2023
443 points (97.0% liked)

New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times: Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[–] r00ty@kbin.life 17 points 1 year ago (2 children)

I'm not so sure that disengaging autopilot because the driver's hands are off the wheel, while on a highway, is the best option. Engage the hazard lights, remain in lane (or, if able, move to the slowest lane), and come to a stop. Surely that's the better way?
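Sketched out, the fallback I'm describing is just a small state machine: warn first, then stop in lane rather than silently handing back control. This is purely illustrative; the state names, the 10-second timeout, and the policy itself are my assumptions, not anything Tesla actually implements.

```python
from enum import Enum, auto

class FallbackState(Enum):
    NORMAL = auto()           # driver engaged, autopilot operating normally
    WARN_DRIVER = auto()      # hands off wheel: escalate alerts
    CONTROLLED_STOP = auto()  # hazards on, decelerate gently in lane
    STOPPED = auto()          # vehicle at rest, hazards still on

def next_state(state, hands_on_wheel, seconds_unresponsive, speed_mps):
    """Hypothetical policy: instead of simply disengaging, warn the
    driver, and if they stay unresponsive, bring the car to a stop."""
    if hands_on_wheel:
        return FallbackState.NORMAL
    if state is FallbackState.NORMAL:
        return FallbackState.WARN_DRIVER
    if state is FallbackState.WARN_DRIVER and seconds_unresponsive > 10:
        return FallbackState.CONTROLLED_STOP
    if state is FallbackState.CONTROLLED_STOP and speed_mps == 0:
        return FallbackState.STOPPED
    return state
```

The key design point is that no transition ever drops control back on an unresponsive driver; the car degrades to a stop instead.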

Just disengaging the autopilot seems like such a cop-out to me. The fact that it disengaged right at the end, so that "the driver was in control at the moment of the crash", again feels like bad "self" driving. Especially when the so-called self-driving is able to come to a stop as part of its software in other situations.

Also, if the system cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the unusually bright emergency lights saturating the image it was trying to analyse), that's again a sign it shouldn't be released to the public. It's clearly just not ready.

Not taking any responsibility away from the human driver here. I just don't think the behaviour was good enough for software controlling a car used by the public.

Not to mention, of course, that the officers aren't suing Tesla because they think Tesla is more liable. They're suing because Tesla is the party they can actually get some money from.

[–] I_LOVE_VEKOMA_SLC@sh.itjust.works 0 points 1 year ago (1 children)

The video is very thorough and covers the hazy footage caused by the flashing lights as one of the issues.

[–] r00ty@kbin.life 2 points 1 year ago

The question here is: could you see that there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze, the autopilot must be able to as well.
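To put that 3 seconds in perspective, here's the distance covered in that time at a constant speed. The 55 mph figure is just an example highway speed I picked; the actual speed in the incident isn't stated here.

```python
def distance_covered(speed_mph: float, seconds: float) -> float:
    """Distance travelled in metres at a constant speed given in mph."""
    metres_per_second = speed_mph * 1609.344 / 3600
    return metres_per_second * seconds

# At 55 mph, braking 3 seconds earlier buys roughly 74 metres.
print(round(distance_covered(55, 3)))  # -> 74
```

That's most of a football field of extra stopping distance given up by reacting 3 seconds late.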

Moreover, it now needs to be extra good at spotting vehicles in bad lighting conditions, because the other sensors have been removed on newer Teslas. It only has cameras to go on.