this post was submitted on 28 Apr 2024
289 points (99.3% liked)


Without paywall: https://archive.ph/NGkbf

[–] rsuri@lemmy.world 93 points 6 months ago (14 children)

Autopilot “is not a self-driving technology and does not replace the driver,” Tesla said in response to a 2020 case filed in Florida. “The driver can and must still brake, accelerate and steer just as if the system is not engaged.”

Tesla's terminology is so confusing. If "Autopilot" isn't self-driving technology, does that mean it's different from "Full Self Driving"? And if so, is "Full Self Driving" also not a self-driving technology?

[–] Buffalox@lemmy.world 66 points 6 months ago* (last edited 6 months ago) (4 children)

I heard Elon Musk call it "assisted full self driving," which doesn't make any sense. LOL

[–] ChaoticEntropy@feddit.uk 46 points 6 months ago (1 children)

"It's called whatever will make the stock price go up."

[–] baggins@lemmy.ca 6 points 6 months ago (1 children)

The self in this equation is you. You're driving your self around. Full self driving 😉

[–] anlumo@lemmy.world 10 points 6 months ago* (last edited 6 months ago) (5 children)

The term autopilot comes from aviation, where the only kind of problem resolution an autopilot performs is turning itself off.

Other than that, it just flies from waypoint to waypoint.
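
To make the aviation comparison concrete, here is a toy Python sketch of that behavior (every name is made up; this is an illustration, not real avionics code): the system does nothing but hold a course toward the next waypoint, and its entire fault-handling strategy is to disengage and hand control back to the pilot.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float

class ToyAutopilot:
    """Aviation-style toy autopilot: holds a course toward the next
    waypoint; its only response to trouble is to disengage."""

    def __init__(self, route: list[Waypoint]):
        self.route = list(route)
        self.engaged = True

    def step(self, sensors_ok: bool, within_limits: bool) -> str:
        # Anything the autopilot can't handle -> disengage, alert the pilot.
        if not (sensors_ok and within_limits):
            self.engaged = False
            return "DISENGAGED: pilot has control"
        if not self.route:
            self.engaged = False
            return "DISENGAGED: route complete"
        wp = self.route[0]
        return f"holding course to waypoint ({wp.lat}, {wp.lon})"

ap = ToyAutopilot([Waypoint(40.64, -73.78), Waypoint(51.47, -0.45)])
print(ap.step(sensors_ok=True, within_limits=True))   # holding course...
print(ap.step(sensors_ok=False, within_limits=True))  # DISENGAGED
```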

[–] machinin@lemmy.world 4 points 6 months ago (1 children)

If only we could implement testing protocols similar to aviation's to validate its safety!

[–] anlumo@lemmy.world 7 points 6 months ago

A full NTSB investigation for every single crash? I'm all for it!

[–] istanbullu@lemmy.ml 55 points 6 months ago (1 children)

You can't call something Full Self Driving or Autopilot and then blame the driver. If you want to blame the driver, then call it driver assist.

[–] KingThrillgore@lemmy.ml 23 points 6 months ago (1 children)

Right! That's why FSD hands control back to the driver the moment a crash is unavoidable, to make the driver liable.

[–] pyre@lemmy.world 7 points 6 months ago

"at the time of the crash, the driver was in full control"

(but not a couple seconds before)
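
That is exactly why crash attribution usually looks at a window before impact rather than the instant of impact; NHTSA's standing crash-reporting order, for instance, counts a crash as ADAS-involved if the system was in use within 30 seconds of the crash. A minimal sketch of that lookback idea (function and parameter names are invented for illustration):

```python
def automation_involved(disengage_time_s: float | None,
                        crash_time_s: float,
                        lookback_s: float = 30.0) -> bool:
    """True if the driver-assistance system was engaged at any point
    within `lookback_s` seconds before the crash, not just at impact.

    disengage_time_s: when the system handed control back, in seconds
    on the same clock as crash_time_s (None = still engaged at impact).
    """
    if disengage_time_s is None:
        return True  # engaged at the moment of the crash
    return crash_time_s - disengage_time_s <= lookback_s

# Handing control back two seconds before impact still counts:
assert automation_involved(disengage_time_s=98.0, crash_time_s=100.0)
# A disengagement five minutes earlier does not:
assert not automation_involved(disengage_time_s=0.0, crash_time_s=300.0)
```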

[–] 0x0@programming.dev 21 points 6 months ago* (last edited 6 months ago) (4 children)

I think Tesla should rename Auto Pilot to Darwin Award Mode.

And improve motorcycle detection, as well as use LIDAR.

[–] machinin@lemmy.world 14 points 6 months ago (2 children)

It's not that Teslas are killing their owners. Teslas are killing first responders at road accidents, kids getting off buses, and motorcyclists. We're all exposed to the problems caused by Musk cutting testing to save some money.

[–] Threeme2189@lemmy.world 7 points 6 months ago (1 children)

The customers pay extra in order to be beta testers. Best deal ever!

[–] laurelraven@lemmy.blahaj.zone 5 points 6 months ago

And pay more than my first two cars cost me combined, at that.

[–] billwashere@lemmy.world 9 points 6 months ago

I like calling it cruise control with extra fatalities.

[–] laurelraven@lemmy.blahaj.zone 8 points 6 months ago

Heck, even using the same sonar/radar/whatever sensors normal cars use, rather than just cameras, would be a huge improvement.
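
The engineering argument for that is redundancy: cameras, radar, and ultrasonic sensors fail in different conditions, so combining them catches obstacles any single sensor misses. A minimal fail-safe fusion sketch, assuming independent per-sensor detections (all names invented):

```python
def obstacle_ahead(camera_sees: bool, radar_sees: bool,
                   ultrasonic_sees: bool) -> bool:
    """Fail-safe fusion: treat an obstacle as real if ANY sensor reports it.

    A camera-only stack shares one failure mode across the whole system
    (e.g. low contrast at night); radar and ultrasonics fail differently,
    so OR-ing detections trades some false positives for far fewer
    missed obstacles.
    """
    return camera_sees or radar_sees or ultrasonic_sees

# A motorcycle the camera misses at night is still caught by radar:
assert obstacle_ahead(camera_sees=False, radar_sees=True, ultrasonic_sees=False)
```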

[–] Mango@lemmy.world 15 points 6 months ago (3 children)

You're also responsible for what you do when you're drunk! Guess what. You cannot purchase ethical excuses. That's YOUR Tesla. You own it. You're in charge of it regardless of whether or not Tesla makes it impossible to access the controls.

Buyer beware. Stop buying proprietary garbage, ya idiots.

[–] jabjoe@feddit.uk 5 points 6 months ago (6 children)

Unfortunately there is no car that isn't proprietary, and even ones without "autopilot" have things like collision detection that can slam on the brakes for you.

[–] autotldr@lemmings.world 8 points 6 months ago (1 children)

This is the best summary I could come up with:


SAN FRANCISCO — As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company’s most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public.

Evidence emerging in the cases — including dash-cam video obtained by The Washington Post — offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla.

Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall failed to significantly curb misuse of the technology and that drivers are misled into thinking the “automation has greater capabilities than it does.”

The company’s decision to settle with Huang’s family — along with a ruling from a Florida judge concluding that Tesla had “knowledge” that its technology was “flawed” under certain conditions — is giving fresh momentum to cases once seen as long shots, legal experts said.

In Riverside, Calif., last year, a jury heard the case of Micah Lee, 37, who was allegedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 65 mph, crashed into a palm tree and burst into flames.

Last year, Florida Circuit Judge Reid Scott upheld a plaintiff’s request to seek punitive damages in a case concerning a fatal 2019 crash in Delray Beach, Fla., in which Jeremy Banner’s Tesla, with Autopilot engaged, failed to register a semi truck crossing its path.


The original article contains 1,850 words, the summary contains 263 words. Saved 86%. I'm a bot and I'm open source!

[–] NeoNachtwaechter@lemmy.world 20 points 6 months ago (10 children)

Even when the driver is fully responsible, the assistance software must work properly in every situation, and it must be tested thoroughly.

If the software makes a severe mistake without warning, a normal driver may have no chance to regain control. Normal drivers are not trained test drivers.
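
One way to read "tested fully" is exhaustively sweeping the scenario space and requiring safe behavior in every cell, not just the common ones. A toy sketch of that idea, with a hypothetical plan_action() standing in for the planner under test (all names invented):

```python
import itertools

def plan_action(lighting: str, obstacle: str, speed_kmh: int) -> str:
    # Hypothetical stand-in for the assistance planner under test.
    return "brake" if obstacle != "none" else "cruise"

def test_every_scenario():
    lightings = ["day", "night", "glare", "fog"]
    obstacles = ["none", "car", "motorcycle", "pedestrian", "stopped_firetruck"]
    speeds_kmh = [30, 65, 110]
    # Safe behavior is required in every cell of the grid, not on average.
    for lighting, obstacle, speed in itertools.product(lightings, obstacles, speeds_kmh):
        if obstacle != "none":
            assert plan_action(lighting, obstacle, speed) == "brake", \
                f"unsafe in scenario: {lighting}, {obstacle}, {speed} km/h"

test_every_scenario()
```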
