
A fan of Tesla might think that the automaker just can't catch a break when it comes to its autonomous driving tech. It's already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week we can add another to the list, this one covering around 2.4 million Tesla vehicles. This time, regulators are assessing the cars' performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.

The National Highway Traffic Safety Administration (NHTSA) says the new probe is looking at instances where FSD was engaged in low-visibility conditions: fog, airborne dust, or glare from the sun blinding the car's cameras.

What the car can "see" is the big issue here. It's also what Tesla bet its future on.

[–] jqubed@lemmy.world 5 points 2 weeks ago (4 children)

Unlike the vast majority of its competitors, which are giving their cars with autonomous driving capabilities more ways to “see” their surroundings, Tesla removed ultrasonic and other types of sensors in favor of a camera-only approach in 2022.

This means there isn’t really any redundancy in the system, so if a Tesla with FSD enabled drives through dense fog, it may not have an easy time keeping track of where the road is and staying on it. Vehicles that have not only cameras but also radar and lidar can make better sense of their environment even in dense fog, although those systems are affected by the elements too. Inclement weather sometimes seems to make FSD go rogue.

I didn’t realize they were using other sensors in the past and dropped them on newer models.

Older Teslas had a combination of radar and cameras for Autopilot and other driver-assistance systems. After Tesla went down the "Pure Vision" route, newer software versions disabled those sensors even in older cars that had them from the factory. So even if you have FSD enabled in an older Tesla with more than just cameras, only the cameras are used when the car is driving itself.

🤦‍♂️

Didn’t want to develop two different versions of the software, I guess?

[–] XeroxCool@lemmy.world 3 points 2 weeks ago

I thought they canceled a contract for an outsourced system.

[–] notfromhere@lemmy.ml 1 points 2 weeks ago (1 children)

Aren’t vision cameras the only sensors we have that can recognize lane markings? The article is bunk in making it seem like that’s not the industry standard. Radar can’t see paint on the road, and my understanding is that lidar can’t either, at least not well enough for real-time lane detection at highway speeds.
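
To illustrate the point: painted lines are high-contrast by design, which is exactly what cameras are good at. Here’s a toy lane-line detector in Python with OpenCV (purely illustrative, with made-up names; nothing like a production perception stack):

```python
import cv2
import numpy as np

def detect_lane_lines(frame):
    """Toy lane-line finder: paint/asphalt contrast -> edges -> line segments."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blur, 50, 150)  # painted lines pop out as strong edges

    # Keep only a rough trapezoid ahead of the car; ignore the rest.
    h, w = edges.shape
    mask = np.zeros_like(edges)
    roi = np.array([[(0, h), (w // 2 - 50, h // 2 + 50),
                     (w // 2 + 50, h // 2 + 50), (w, h)]], dtype=np.int32)
    cv2.fillPoly(mask, roi, 255)

    # Fit straight segments to whatever edges survive the mask.
    return cv2.HoughLinesP(cv2.bitwise_and(edges, mask),
                           rho=2, theta=np.pi / 180, threshold=50,
                           minLineLength=40, maxLineGap=100)
```

Production systems use learned models rather than a Hough transform, but the input is still a camera image.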

[–] elbarto777@lemmy.world 1 points 4 days ago

It's not only about seeing the markings. It's also about recognizing objects you might collide with in less-than-ideal conditions.

[–] Oderus@lemmy.world 1 points 2 weeks ago

If FSD notices poor weather conditions, it will prompt you to take over; it will not just drive you off the road.

[–] Bell@lemmy.world -1 points 2 weeks ago (1 children)

The problem was that the different sensors could sometimes disagree. Like, vision sees an obstacle but radar isn't picking it up... which one does the software believe?

And if you think vision has problems with things like rain and fog, try radar or lidar!

Not mentioning the downsides of the other sensors always makes me suspicious of an article.

The key argument for going vision-only is that it's what humans do every day. Articles that leave that out also disappoint me.

[–] bladerunnerspider@lemmy.world 1 points 2 weeks ago

It's called consensus: have three sensors and give each a vote. Typically these sensors are identical, so the system can detect a failure or an incorrect reading from any one of them. The same idea is used in IT for data backups and RAID configurations, and in aviation. And personally, I would just favor the radar. If vision says go and radar says stop... stop, and avoid hitting that firetruck parked on the highway. Or that motorcyclist. Or causing any of the other bizarre vision-only fatal crashes this system has wrought.
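
Roughly, in code, it's something like this toy 2-of-3 voter (purely illustrative and the names are made up; real fusion stacks weigh confidence scores and track history, not bare booleans):

```python
def obstacle_ahead(camera: bool, radar: bool, lidar: bool) -> bool:
    """Toy 2-of-3 consensus: any two agreeing sensors win."""
    return sum([camera, radar, lidar]) >= 2

def obstacle_ahead_conservative(camera: bool, radar: bool, lidar: bool) -> bool:
    """Fail-safe variant favoring radar: a radar hit alone means brake."""
    return radar or (camera and lidar)

# The disagreement case from upthread: the camera misses a parked truck
# that radar and lidar both return hits on. Both policies say brake.
assert obstacle_ahead(camera=False, radar=True, lidar=True)
assert obstacle_ahead_conservative(camera=False, radar=True, lidar=True)
```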

Also humans can hear things. So, not just vision.