this post was submitted on 19 Mar 2025
1504 points (98.3% liked)

Not The Onion

15364 readers
1001 users here now

Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

founded 2 years ago

In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

[–] comfy@lemmy.ml 141 points 6 days ago (9 children)

I hope some of you actually skimmed the article and got to the "disengaging" part.

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

[–] endeavor@sopuli.xyz 40 points 5 days ago* (last edited 5 days ago)

It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

That's like writing that Musk "made an awkward, confused gesture" at a time and place a few people might call questionable.

[–] cortex7979@lemm.ee 33 points 5 days ago

That's so wrong holy shit

[–] LemmyFeed@lemmy.dbzer0.com 28 points 6 days ago (7 children)

Don't get me wrong, autopilot turning itself off right before a crash is sus and I wouldn't put it past Tesla to do something like that (I mean, come on, why don't they use lidar?), but maybe it's so the car doesn't try to power the wheels after impact, which could potentially make the event worse.

On the other hand, they're POS cars and the autopilot probably just shuts off cause of poor assembly, standards, and design resulting from cutting corners.

[–] FiskFisk33@startrek.website 32 points 5 days ago (7 children)

If it can actually sense that a crash is imminent, why wouldn't it be programmed to slam the brakes instead of just turning off?

Do they have a problem with false positives?

[–] madcaesar@lemmy.world 162 points 6 days ago (59 children)

My $500 robot vacuum has LiDAR; meanwhile these $50k pieces of shit don't 😂

[–] rbm4444@lemmy.world 32 points 6 days ago

Holy shit, I knew I'd heard this word before. My Chinese robot vacuum cleaner has more technology than a Tesla hahahahaha

[–] FuglyDuck@lemmy.world 289 points 6 days ago (25 children)

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

This has been known.

They do it so they can evade liability for the crash.

[–] bazzzzzzz@lemm.ee 46 points 6 days ago (3 children)

Not sure how that helps in evading liability.

Every Tesla driver would need superhuman reaction speeds to respond within 17 frames, i.e. 680 ms (I didn't check the recording framerate, but 25 fps is the slowest reasonable), less than a second.
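For what it's worth, the frame-to-time arithmetic in that comment checks out. A quick sketch (the 25 fps framerate is the commenter's assumption, not a confirmed value from the recording):

```python
def frames_to_ms(frames: int, fps: float) -> float:
    """Convert a frame count at a given framerate to milliseconds."""
    return frames / fps * 1000.0

# The comment's figure: 17 frames at an assumed 25 fps
print(frames_to_ms(17, 25))  # 680.0
```

At 680 ms, the handover window is well under the roughly 1 to 1.5 seconds typically cited for human brake reaction time, which is the commenter's point.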

[–] FuglyDuck@lemmy.world 67 points 6 days ago

It's not likely to work, but their swapping to human control after the system has determined a crash is going to happen isn't accidental.

Anything they can do to mire the proceedings they will do. It’s like how corporations file stupid junk motions to force plaintiffs to give up.

[–] orcrist@lemm.ee 52 points 6 days ago (5 children)

They're talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

And then that creates a discussion about how much time the human driver actually has to solve the problem, or gray areas about who exactly controls what and when, and it complicates the situation enough that maybe Tesla can pay less money for the deaths they are obviously responsible for.

[–] sober_monk@lemmy.world 29 points 6 days ago

The self-driving equivalent of "Jesus take the wheel!"

[–] fibojoly@sh.itjust.works 29 points 6 days ago* (last edited 6 days ago) (2 children)

That makes so little sense... It detects it's about to crash, then gives up and lets you sort it out?
That's the opposite of my Audi, which detects that I'm about to hit something and either gives me a warning or actively hits the brakes if I don't have time to react.
If this is true, it's so fucking evil it's kind of amazing it ever got anywhere near prod.

[–] Red_October@lemmy.world 27 points 6 days ago

The point is that they can say "Autopilot wasn't active during the crash." They can leave out that autopilot was active right up until the moment before, or that autopilot directly contributed to it. They're just purely leaning into the technical truth that it wasn't on during the crash. Whether it's a courtroom defense or their own next published set of data, "Autopilot was not active during any recorded Tesla crashes."

[–] eugenevdebs@lemmy.dbzer0.com 130 points 6 days ago (5 children)

"Dipshit Nazis mad at facts bursting their bubble of unreality" is another way of reading this headline.

[–] ABetterTomorrow@lemm.ee 20 points 5 days ago (6 children)

I can't wait for all this brand loyalty and fan-people culture to end. Why is this even a thing? Like talking about box office results, companies' financials and stocks... If you're not one of their investors, just stop. It sounds like you're working for them for free.

[–] buddascrayon@lemmy.world 46 points 6 days ago (5 children)

It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

So, who's the YouTuber that's gonna test this out? Since Elmo has pushed his way into the government in order to quash any investigation into it.

[–] mrodri89@lemmy.zip 61 points 6 days ago (3 children)

If you get strong emotions about material shit when someone makes a video... you have zero of my respect. Period.

[–] Critical_Thinker@lemm.ee 15 points 5 days ago (3 children)

Of course it disengages self driving modes before an impact. Why would they want to be liable for absolutely anything?

[–] Banana@sh.itjust.works 41 points 6 days ago (22 children)

And the president is driving one of these?

Maybe we should be purchasing lots of paint and cement blockades...

[–] LeninOnAPrayer@lemm.ee 22 points 6 days ago* (last edited 6 days ago)

When he was in the Tesla asking if he should go for a ride I was screaming "Yes! Yes Mr. President! Please! Elon, show him full self driving on the interstate! Show him full self driving mode!"

[–] yarr@feddit.nl 18 points 5 days ago (5 children)

Does anyone else get the heebies with Mark Rober? There's something a little off about his smile and overall presence.

[–] FurryMemesAccount@lemmy.blahaj.zone 20 points 5 days ago* (last edited 5 days ago) (1 children)

Yeah, he's over-positive, it's unnerving.

Still, that video is good anti-musk press.

[–] Soleos@lemmy.world 10 points 5 days ago

The hyper-positivity and enthusiasm is because his content is aimed at kids as much as at adults. A lot of the kid-oriented science content I remember, from TV shows and documentaries to guest speakers to science-centre guides, had that affect.

[–] JokklMaster@lemmy.world 15 points 5 days ago* (last edited 5 days ago)

I believe he's one of the very many YouTubers who's a Mormon.

Edit: https://youtu.be/3Bcn0TFAi6E

[–] boaratio@lemmy.world 15 points 5 days ago

Did you know he used to work at NASA? He very rarely mentions it. /s

[–] get_the_reference_@midwest.social 25 points 6 days ago (4 children)

E. Lon Musk. Supah. Geenius.

[–] wabafee@lemmy.world 60 points 6 days ago* (last edited 6 days ago) (16 children)

I bet the real reason he doesn't want LiDAR in the car is that he thinks it looks aesthetically ugly.

[–] AngryCommieKender@lemmy.world 91 points 6 days ago* (last edited 6 days ago) (14 children)

It costs too much. It's also why you have to worry about panels falling off the swastitruck if you park next to one. They also apparently lack any sort of rollover frame.

He doesn't want to pay for anything, including NHTSA crash tests.

It's literally what Drumpf would have created if he owned a car company: cut all costs, disregard all regulations, and make the public the alpha testers.

[–] Ulrich_the_Old@lemmy.ca 30 points 6 days ago

If you own a Tesla or a Cybertruck, you deserve it.

[–] pineapplelover@lemm.ee 24 points 6 days ago* (last edited 5 days ago) (11 children)

To be fair, if you were to construct a wall and paint it to look exactly like the road, people would run into it as well. That being said, Tesla shouldn't rely on cameras alone.

Edit: having just watched the video, that was a very obvious fake wall. You can see its outlines pretty well. I'm also surprised it failed the other tests when not on Autopilot; that seems pretty fucking dangerous.

[–] FuglyDuck@lemmy.world 29 points 6 days ago* (last edited 6 days ago) (3 children)

To be fair, if you were to construct a wall and paint it exactly like the road, people will run into it as well.

This isn't being fair. It's being compared to other, better autopilot systems that use LIDAR and radar in addition to daylight and infrared optics to sense the world around them.

Teslas use only daylight and infrared. LIDAR and radar systems would not have been deceived.

[–] echodot@feddit.uk 14 points 5 days ago (2 children)

Watch the video; it's extremely obvious to a human driver that there is something wrong with the view ahead. It's even pointed out in the video that humans use additional visual cues when a situation is ambiguous.

The cars don't have deduction and reasoning capabilities, so they need additional sensors to compensate for their lack of brains. That's why it's not really sensible to compare self-driving systems to humans: humans have limited sensory input, but it's compensated for by reasoning ability; self-driving cars lack reasoning ability, but that's compensated for by enhanced sensory input.
