this post was submitted on 18 Oct 2024
782 points (98.4% liked)

The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday with the company reporting four crashes after Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

[–] jlh@lemmy.jlh.name 62 points 2 months ago (5 children)

Humans know to drive more carefully in low visibility, and/or to take actions to improve visibility. Muskboxes don't.

[–] hannesh93@feddit.org 48 points 2 months ago (1 children)

They also decided to rely only on cameras and visual cues for driving, instead of also using radar, thermal cameras, or similar sensors.

It's designed to be launched ASAP, not to be safe.

[–] mindaika@lemmy.dbzer0.com 12 points 2 months ago

I mean, that’s just good economics. I’m willing to bet someone at Tesla has done the calcs on how many people they can kill before it becomes unprofitable

[–] WheelcharArtist@lemmy.world 19 points 2 months ago

Muskboxes

like that

[–] sugar_in_your_tea@sh.itjust.works 8 points 2 months ago (2 children)

I'm not so sure. Whenever there are crappy weather conditions, I see a ton of accidents because so many people assume they can safely drive at the posted speed limit. In fact, I tend to avoid the highway altogether for the first week or two of snow in my area because so many people get into accidents (the rest of the winter is generally fine).

So this is likely closer to what a human would do than not.

[–] nyan@lemmy.cafe 3 points 2 months ago (2 children)

The question is, is Tesla FSD's record better, worse, or about the same on average as a human driver under the same conditions? If it's worse than the average human, it needs to be taken off the road. There are some accident statistics available, but you have to practically use a decoder ring to make sure you're comparing like to like even when whoever's providing the numbers has no incentive to fudge them. And I trust Tesla about as far as I could throw a Model 3.

On the other hand, the average human driver sucks too.

Yeah, I honestly don't know. My point is merely that we should hold FSD to the same standard as human driving, at least initially, because these systems have far more potential for improvement than human drivers do. If we set the bar too high, we'll just delay safer transportation.
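
The like-to-like comparison discussed above can be sketched in a few lines: crash counts only mean anything once they're normalized by exposure (miles driven). All numbers below are made up for illustration, not real Tesla or NHTSA figures, and a fair comparison would also need to match road type, weather, and crash severity.

```python
# Illustration of normalizing crash counts by exposure (miles driven).
# ALL numbers here are made up -- not real Tesla or NHTSA data.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Crash rate per million miles of driving."""
    return crashes / (miles / 1_000_000)

# Hypothetical fleet totals: FSD-engaged miles vs. human-driven miles.
fsd_rate = crashes_per_million_miles(crashes=5, miles=10_000_000)
human_rate = crashes_per_million_miles(crashes=200, miles=500_000_000)

print(f"FSD:   {fsd_rate:.2f} crashes per million miles")
print(f"Human: {human_rate:.2f} crashes per million miles")
```

With these made-up inputs, the raw counts (5 vs. 200) make FSD look far safer, while the normalized rates (0.50 vs. 0.40 per million miles) point the other way. That gap is exactly why the "decoder ring" is needed when reading published accident statistics.
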

[–] Jrockwar@feddit.uk 1 points 2 months ago* (last edited 2 months ago)

You can't measure this, because it has drivers behind the wheel. Even if it made three "pedestrian-killing" mistakes every 10 miles, chances are the driver would catch almost every one of them and not let it crash.

But on the other hand, if we were to measure every time the driver takes over the number would be artificially high - because we can't predict the future and drivers are likely to be overcautious and take over even in circumstances that would have turned out OK.

The only way to do this IMO is by

  • measuring every driver intervention
  • only letting it drive unsupervised, and be marketed as self-driving, once it achieves a very low intervention rate (< 1 per 10,000 miles?)
  • in the meantime, marketing it as "driver assist," having the responsibility fall on the driver, and treating it like the somewhat advanced cruise control that it is.
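
The gate proposed in the list above amounts to a simple threshold check. The threshold is the commenter's suggested bar, and the fleet numbers are invented for illustration; nothing here reflects data Tesla actually publishes.

```python
# Sketch of the proposed intervention-rate gate.
# THRESHOLD is the commenter's suggested bar (< 1 intervention per
# 10,000 miles); all fleet numbers below are hypothetical.

THRESHOLD_MILES_PER_INTERVENTION = 10_000

def miles_per_intervention(total_miles: float, interventions: int) -> float:
    """Average miles driven between driver takeovers."""
    if interventions == 0:
        return float("inf")  # no takeovers observed in the sample
    return total_miles / interventions

def ready_for_driverless(total_miles: float, interventions: int) -> bool:
    """True only if the fleet averages fewer than one driver
    intervention per 10,000 miles."""
    return miles_per_intervention(total_miles, interventions) > THRESHOLD_MILES_PER_INTERVENTION

# Made-up fleet log: 2,500,000 miles with 400 driver takeovers.
print(ready_for_driverless(2_500_000, 400))   # 6,250 mi/intervention -> False
print(ready_for_driverless(2_500_000, 100))   # 25,000 mi/intervention -> True
```

Note the second point from the thread: because drivers are likely overcautious, logged takeovers overcount real failures, so a rate computed this way is a conservative (pessimistic) estimate of the system's actual capability.
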
[–] III@lemmy.world 3 points 2 months ago (1 children)

low visibility, including sun glare, fog and airborne dust

I also see a ton of accidents when the sun is in the sky or if it is dusty out. /s

[–] sugar_in_your_tea@sh.itjust.works 1 points 2 months ago* (last edited 2 months ago)

Yup, especially right after daylight saving time changes, when the sun's position in the sky during your commute shifts abruptly.

Cameras are probably worse here, but they may be able to make up for it with parallel processing the poor data they get.

[–] _bcron@midwest.social 2 points 2 months ago* (last edited 2 months ago)

The median driver sure, but the bottom couple percent never miss their exit and tend to do boneheaded shit like swerving into the next lane when there's a stopped car at a crosswalk. >40,000 US fatalities in 2023. There are probably half a dozen fatalities in the US on any given day by the time the clock strikes 12:01AM on the west coast.

Edit: some more food for thought as I've been pondering:

FSD may or may not be better than the median driver (maybe this investigation will add to our knowledge), but it's likely better than the worst drivers. The worst drivers, though, are the most likely to vastly overestimate their own competence, which might lead them to actively avoid any such aids, despite being the ones who would benefit from them the most. We might be forever stuck with boneheaded drivers doing boneheaded shit.