this post was submitted on 04 Jan 2024
715 points (97.4% liked)

[–] psud@lemmy.world 2 points 11 months ago* (last edited 11 months ago) (2 children)

That's all the people who were asleep on the highway or driving at very high speed in town

~~The recent versions don't allow either of those behaviours now, so those crashes aren't happening anymore.~~

Full self driving doesn't do that

And the deaths I'm interested in are the ones being caused by FSD, not by lane keeping and cruise control. Loads of brands do lane keeping and cruise control and implement it no better than Tesla.

[–] Zink@programming.dev 2 points 11 months ago (1 children)

But does FSD change the logic for the lane keeping and the speed & distance?

Isn’t one of the features “navigate on autopilot”?

[–] psud@lemmy.world 2 points 11 months ago (1 children)

It is quite different. Navigate on autopilot is lane keeping, cruise control, and automatic highway exits. FSD tries to do all driving tasks - turns at stop signs, at lights, keeping to the correct side on roads with no centre line, negotiating with oncoming traffic on narrow roads...

[–] Zink@programming.dev 1 points 11 months ago (1 children)

Yeah it adds more capabilities for sure. But if you are on a moderate to high speed road where autopilot works fine, then is the control logic any different?

Obviously there are various types of accidents that autopilot would never get the chance to cause, like maybe turning right at an intersection and hitting a pedestrian. But do they act differently on a main road, where Teslas have done things like run into tractor trailers?

[–] psud@lemmy.world 2 points 11 months ago* (last edited 11 months ago)

The one that hit a tractor trailer was years ago. They are far better now; specifically, they see low-contrast stuff now, and that's on autopilot. The biggest difference to the user will be the ability to have hands off the controls.

It isn't the same though. FSD is written completely differently to autopilot. It's a different program.

Other accidents it won't have on those roads include falling asleep and running off the road, or being surprised by someone braking ahead and running into them

I'm sure it will be worse than humans around animals on the road. I wonder if it will see a wombat before it hits it.

[–] NotMyOldRedditName@lemmy.world 2 points 11 months ago (1 children)

Just keep in mind that FSD is only as safe as they claim because it's supervised.

I would hope that even a reasonably working system would be better with a human vigilantly watching it than a human driving regularly.

The system would have to be really bad to be worse than that.