this post was submitted on 22 Oct 2024

Electric Vehicles

A fan of Tesla might think that the automaker just can't catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list involving around 2.4 million Tesla vehicles. This time, regulators are assessing the cars' performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.

The National Highway Traffic Safety Administration (NHTSA) says the new probe is examining instances when FSD was engaged in reduced visibility, such as fog, heavy airborne dust, or sun glare blinding the car's cameras, and whether that contributed to a problem.

What the car can "see" is the big issue here. It's also what Tesla bet its future on.

[–] NutWrench@lemmy.ml 11 points 2 weeks ago (2 children)

Maybe Tesla shouldn't be allowed to call their enhanced cruise control "autopilot." Everyone knows how "autopilots" are supposed to work.

[–] Tarquinn2049@lemmy.world 5 points 2 weeks ago

Well, actually, that's kind of the problem: it already does more than a real autopilot does. Autopilot in a plane can't keep the plane from hitting moving objects; it's not context-aware at all. It just flies a pre-programmed route and executes pre-programmed maneuvers. Even the very first release of Tesla's system was already better than what autopilot really is.

Planes are only safe because there is never supposed to be anything else anywhere near them, which makes autopilot super easy. That's why planes have had it since long before we had any context-aware machines.

Also, if "roadspace" were treated the same as "airspace", with the amount of training and practice pilots have, as well as "road traffic controllers" like air traffic controllers, self-driving would have had no trouble right from the get-go: pre-programmed routes, someone making sure there's a generous specified minimum space between each vehicle, and any violation being immediately and harshly punished...

Autopilot is relatively easy compared to self-driving; if anything, calling it "autopilot" was underambitious.

[–] FlowVoid@lemmy.world 0 points 2 weeks ago* (last edited 2 weeks ago)

Everyone thinks they know.

But the autopilot on an aircraft or ship is often just a cruise control that maintains a constant heading, speed, and (for aircraft) altitude. The pilot or skipper remains 100% responsible for course changes and collision avoidance.