this post was submitted on 30 Oct 2024
647 points (89.1% liked)

OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.

How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?

top 50 comments
[–] Hubi@feddit.org 228 points 1 month ago (54 children)

The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.

How are these people always such pathetic suckers?

[–] teft@lemmy.world 146 points 1 month ago (5 children)

I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.

[–] spankmonkey@lemmy.world 51 points 1 month ago (3 children)

Same in Kansas. I was in a car that hit one in the '80s, and I see them often enough that I had to avoid one crossing a busy interstate highway last week.

Deer are the opposite of an edge case in the majority of the US.

[–] leftytighty@slrpnk.net 22 points 1 month ago* (last edited 1 month ago) (1 children)

Putting these valid points aside, we're also all just taking for granted that the software would have properly identified a human under the same circumstances... This could very easily have been a much more chilling outcome.

[–] leftytighty@slrpnk.net 35 points 1 month ago (1 children)

Being a run-of-the-mill fascist (rather than one of those in power) is actually an incredibly submissive position; they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a "snowflake liberal" by comparison.

[–] NeoNachtwaechter@lemmy.world 20 points 1 month ago

Edge cases (NOT features) are what keep them from reaching higher levels of autonomy. The difference between the levels is roughly "most circumstances", "nearly all circumstances", "really all circumstances".

Since Tesla cares so much more about features, they will remain at Level 2 for a very long time.
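
For reference, here is a rough, unofficial paraphrase of the SAE J3016 levels the comment is alluding to (summaries abbreviated, just to show where the jumps in "circumstances covered" happen):

```python
# Rough, unofficial paraphrase of the SAE J3016 driving-automation levels,
# abbreviated to show the jump in "circumstances covered" between levels.
SAE_LEVELS = {
    0: "No automation: the driver does everything",
    1: "Driver assistance: steering OR speed assist, driver supervises",
    2: "Partial automation: steering AND speed assist, driver must supervise at all times",
    3: "Conditional automation: system drives in limited conditions, driver is the fallback",
    4: "High automation: no driver needed within a defined operating domain",
    5: "Full automation: drives anywhere a human could, no fallback driver",
}

for level, summary in SAE_LEVELS.items():
    print(f"Level {level}: {summary}")
```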

[–] independantiste@sh.itjust.works 88 points 1 month ago* (last edited 1 month ago) (2 children)

Only keeping the regular cameras was a genius move to hold back their full autonomy plans

[–] cm0002@lemmy.world 50 points 1 month ago (2 children)

The day he said that "ReGULAr CAmErAs aRe ALl YoU NeEd" was the day I lost all trust in their implementation. And I'm someone who's completely ready to turn over all my driving to an autopilot lol

[–] w3dd1e@lemm.ee 47 points 1 month ago (23 children)

Deer aren’t edge cases. If you are in a rural community or the suburbs, deer are a daily way of life.

As more and more of their forests are destroyed, deer are becoming a daily part of city life too. I live in the middle of a large midwestern city, in a neighborhood with houses crowded together, and I see deer on my lawn regularly.

[–] bluGill@fedia.io 39 points 1 month ago (10 children)

Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.

The real question isn't whether Tesla is better or worse in any one situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If a Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where it's worse.

Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
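
As a back-of-the-envelope illustration of the comparison this comment is asking for, here is a minimal sketch with entirely made-up numbers; the real figures would have to come from regulators, which is exactly the problem:

```python
# Toy comparison of crash rates per million miles. ALL numbers below are
# placeholders, not real Tesla or NHTSA data. The comparison is only fair if
# miles and crashes are counted the same way for both groups (same road
# types, weather, and reporting rules), which is why regulator data matters.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / (miles / 1_000_000)

human_rate = crashes_per_million_miles(crashes=5_000_000, miles=3_000_000_000_000)
fsd_rate = crashes_per_million_miles(crashes=1_500, miles=1_000_000_000)

print(f"Human baseline (hypothetical): {human_rate:.2f} crashes per million miles")
print(f"FSD (hypothetical):            {fsd_rate:.2f} crashes per million miles")
```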

[–] spankmonkey@lemmy.world 23 points 1 month ago (3 children)

Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.

https://www.reuters.com/business/autos-transportation/nhtsa-opens-probe-into-24-mln-tesla-vehicles-over-full-self-driving-collisions-2024-10-18/

The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.

Between this and being threatened with fines last year for not providing the data, it sure seems like they aren't being very forthcoming. That makes me suspect they still aren't telling the truth.

[–] Emerald@lemmy.world 38 points 1 month ago* (last edited 1 month ago) (3 children)

I notice nobody has commented on the fact that the driver should've reacted to the deer. It's not Tesla's responsibility to emergency brake, even if that is a feature in the system. Drivers are responsible for their vehicle's movements at the end of the day.

[–] chaogomu@lemmy.world 37 points 1 month ago (5 children)

Then it's not "Full self driving". It's at best lane assistance, but I wouldn't trust that either.

Elon needs to shut the fuck up about self driving and maybe issue a full recall, because he's going to get people killed.

[–] rsuri@lemmy.world 22 points 1 month ago

True, but if Tesla keeps acting like they're on the verge of an unsupervised, steering-wheel-free system... this is more evidence that they're not. I doubt we'll see a Cybercab with no controls in the next 10 years if the current tech is still ignoring large, highly predictable objects in the road.

[–] inclementimmigrant@lemmy.world 19 points 1 month ago

That would be lovely if it wasn't called and marketed as Full Self-Driving.

If you sell vaporware/incomplete software and release it into the wild, then you are responsible for all the chaos it brings.

[–] JamesStallion@sh.itjust.works 37 points 1 month ago (1 children)

All cars are death machines

[–] homesnatch@lemm.ee 32 points 1 month ago (2 children)

I watched the whole video.. Mowed down like 90 deer in a row.

[–] sem@lemmy.blahaj.zone 31 points 1 month ago (1 children)

Why does this read like an ad for Cybertrucks aimed at people who want to run over deer?

[–] Madnessx9@lemmy.world 30 points 1 month ago (6 children)

Full speed in the dark, I think most people would have failed to avoid that. What's concerning is that it does not stop afterwards.

[–] jj4211@lemmy.world 39 points 1 month ago (2 children)

Note that part of the discussion is that we shouldn't settle for human limitations when we don't have to. Notably, things like LIDAR are considered to give these systems superhuman vision. However, Tesla said 'eyes are good enough for folks, so just cameras'.

The rest of the industry decided LIDAR is important and focused on trying to make it more practical.
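
To illustrate why LIDAR gets described as superhuman here: it measures range and closing speed directly instead of inferring them from pixels, which is exactly what gets hard with a dark animal on a dark road. A minimal sketch follows; all numbers and the threshold are made up and this is not any vendor's actual logic:

```python
# Illustrative only: with a direct range sensor (lidar/radar), the braking
# decision reduces to simple geometry. A camera-only stack must first infer
# distance and closing speed from images, which degrades at night.
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither object changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def should_brake(range_m: float, closing_speed_mps: float, reaction_budget_s: float = 2.0) -> bool:
    return time_to_collision(range_m, closing_speed_mps) < reaction_budget_s

# Deer measured 50 m ahead while closing at 30 m/s (~67 mph):
print(should_brake(50.0, 30.0))  # True -> about 1.7 s to impact, brake now
```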

[–] iAvicenna@lemmy.world 22 points 1 month ago (1 children)

Isn't Elon advertising AI as having orders-of-magnitude better reaction time and being much less error-prone than a human, though?

[–] fatalError@lemmy.sdf.org 18 points 1 month ago (7 children)

I think LIDAR and other sensors are supposed to work in the infrared and see in the dark.

[–] alsimoneau@lemmy.ca 19 points 1 month ago

Sensors that the Tesla famously doesn't have (afaik, didn't check) because Elon is a dumbass.

[–] Turbonics@lemmy.sdf.org 30 points 1 month ago (3 children)

The autopilot knows deer can't sue

[–] blady_blah@lemmy.world 29 points 1 month ago
  1. The vehicle needed LIDAR
  2. The vehicle should have a collision-detection indicator for anomalous collisions and random mechanical problems
[–] NutWrench@lemmy.world 26 points 1 month ago (4 children)

For the 1000th time, Tesla: don't call it "autopilot" when it's nothing more than cruise control that needs constant attention.

[–] brbposting@sh.itjust.works 24 points 1 month ago (2 children)

Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.

[–] Kbobabob@lemmy.world 24 points 1 month ago (5 children)

Is there video that actually shows it "keeps going"? The way that video loops, I can't tell what happens immediately after.

[–] Gammelfisch@lemmy.world 21 points 1 month ago* (last edited 1 month ago) (4 children)

So, a kid on a bicycle or scooter is an edge case? Fuck the Muskrat and strip him of US citizenship for illegally working in the USA. Another question: WTF was the driver doing?

[–] nimble@lemmy.blahaj.zone 21 points 1 month ago (2 children)

Friendly reminder that Tesla Autopilot is an AI that trains on live data. If it hasn't seen something enough times, it won't know to stop. This is how you get a Tesla running full speed into an overturned semi, and many, many other accidents.
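
A minimal sketch of the failure mode described above (this is not Tesla's actual pipeline; the class names and threshold are invented for illustration): a detector that only triggers braking for objects it recognizes with enough confidence will sail past anything it was rarely trained on.

```python
# Hypothetical illustration, not Tesla's real code: braking gated on a
# recognized class plus a confidence threshold means rare, poorly-learned
# objects (an overturned trailer, debris) may never trigger the brake branch.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float  # 0.0 - 1.0 score from the (hypothetical) vision model

BRAKE_CLASSES = {"pedestrian", "vehicle", "cyclist", "deer"}
BRAKE_THRESHOLD = 0.6

def plan(detections: list[Detection]) -> str:
    for d in detections:
        if d.label in BRAKE_CLASSES and d.confidence >= BRAKE_THRESHOLD:
            return "BRAKE"
    return "CONTINUE"

print(plan([Detection("unknown_obstacle", 0.35)]))  # CONTINUE (rare object, low confidence)
print(plan([Detection("deer", 0.82)]))              # BRAKE
```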

[–] Diplomjodler3@lemmy.world 20 points 1 month ago (2 children)

That deer was pushing the woke agenda!

[–] iAvicenna@lemmy.world 20 points 1 month ago

AI: 1% chance it's a human, keep going like nothing happened

[–] Sam_Bass@lemmy.world 18 points 1 month ago (3 children)

The deer is not blameless. Those bastards will race you to try and cross in front of you.
