Tesla knew Autopilot caused death, but didn't fix it: Software's alleged inability to handle cross traffic central to court battle after two road deaths

[–] Zink@programming.dev 2 points 1 year ago (1 children)

I did not know that about Mercedes, so I had to go read about it. Level 3 is huge because that’s when the system is approved to not have constant human monitoring. It’s the difference between being able to read a book or use your phone on a boring trip, even if it might not get you fully door to door on many trips.

It can’t drive you home drunk, and you can’t sleep in your car (you have to be available to take over when requested), but it’s a huge jump for most practical use.

[–] renohren@partizle.com 2 points 1 year ago

Realistically, I think FSD has the potential to be certified Level 3, and some other carmakers probably have the tech to do it too, BUT in the EU, if a car offers Level 3 autonomous driving, the carmaker becomes legally responsible for accidents whenever the required driving conditions are met (most EU states limit it to highways). For the time being, only Mercedes has had the courage to try it (probably because they have ample experience with driving assistance from their truck production).