This post was submitted on 18 Oct 2024 · 771 points (98.4% liked)


The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday with the company reporting four crashes after Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

Jrockwar@feddit.uk · 1 point · 1 day ago (last edited 1 day ago)

You can't measure this, because there are drivers behind the wheel. Even if the system made three "pedestrian-killing" mistakes every 10 miles, chances are the driver would catch almost every one and prevent the crash, so the recorded rate could look as low as one incident per 10,000 miles.

But on the other hand, if we were to measure every time the driver takes over, the number would be artificially high, because we can't predict the future: drivers are likely to be overcautious and take over even in circumstances that would have turned out fine.
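
As a toy illustration of both distortions (every number here is hypothetical, reusing the figures from the paragraphs above):

```python
# Toy model of how driver supervision distorts both metrics.
# Every number here is hypothetical, taken from the argument above.

true_mistakes_per_mile = 3 / 10     # "three pedestrian-killing mistakes every 10 miles"
driver_catch_rate = 0.99999         # drivers catch nearly every mistake in time

# Effect 1: crashes undercount, because the driver masks almost all mistakes.
observed_crashes = true_mistakes_per_mile * (1 - driver_catch_rate) * 10_000
print(f"True mistakes per 10,000 miles:    {true_mistakes_per_mile * 10_000:,.0f}")  # 3,000
print(f"Observed crashes per 10,000 miles: {observed_crashes:.2f}")                  # 0.03

# Effect 2: raw takeovers overcount, because many are merely precautionary.
takeovers_per_mile = 0.05           # hypothetical: one takeover every 20 miles
truly_needed_fraction = 0.4         # hypothetical: only 40% were actually necessary
print(f"Recorded takeovers per 10,000 miles:  {takeovers_per_mile * 10_000:,.0f}")   # 500
print(f"Necessary takeovers per 10,000 miles: "
      f"{takeovers_per_mile * truly_needed_fraction * 10_000:,.0f}")                 # 200
```

A system that errs constantly can still look nearly flawless in crash statistics, while raw takeover counts overstate its failures.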

The only way to do this IMO is by:

  • measuring every driver intervention
  • only letting it be driverless, and marketable as self-driving, once it achieves a very low intervention rate (< 1 per 10,000 miles? see the sketch after this list)
  • in the meantime, marketing it as "driver assist", having the responsibility fall on the driver, and treating it like the "somewhat advanced" cruise control that it is.
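
A minimal sketch of that gating rule, assuming fleet-wide telemetry of miles and interventions (the threshold, function name, and labels are my own assumptions, not anything a manufacturer or regulator actually uses):

```python
# Sketch: decide how a system may be marketed from fleet intervention data.
# Threshold and labels follow the proposal above and are purely hypothetical.

INTERVENTION_THRESHOLD = 1 / 10_000  # at most 1 intervention per 10,000 miles

def marketing_tier(total_miles: float, interventions: int) -> str:
    """Return the allowed marketing label given fleet-wide telemetry."""
    if total_miles <= 0:
        raise ValueError("no mileage recorded")
    rate = interventions / total_miles
    if rate < INTERVENTION_THRESHOLD:
        return "self-driving"        # low enough to permit driverless operation
    return "driver assist"           # responsibility stays with the driver

# 42 interventions over 2 million miles = 0.21 per 10,000 miles
print(marketing_tier(2_000_000, 42))  # -> "self-driving"
# 50 interventions over 100,000 miles = 5 per 10,000 miles
print(marketing_tier(100_000, 50))    # -> "driver assist"
```

Measuring at the fleet level rather than per car is deliberate here: a single vehicle would need implausibly many miles before its intervention rate meant anything.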