this post was submitted on 03 Sep 2023
194 points (88.8% liked)

Technology

[–] skymtf@lemmy.blahaj.zone 99 points 1 year ago (9 children)

I feel like the NTSB needs to draft a minimum spec for self-driving cars, plus a testing course that covers some of the worst circumstances, before approval. I feel like all self-driving cars should have to have lidar and other sensors. Computer vision really isn't working out.

[–] echo64@lemmy.world 57 points 1 year ago (4 children)

You build a benchmark and Tesla will train on that benchmark. It says nothing about real-world use, but it gets them signed off.
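The benchmark-gaming worry is essentially Goodhart's law, and a toy sketch makes it concrete. All scenario names below are hypothetical, not from any real certification suite:

```python
# Toy illustration of "training on the benchmark": a model that simply
# memorizes the certification scenarios aces the test but generalizes
# poorly. Scenario names are invented for illustration.

benchmark = {
    "stopped_firetruck": "brake",
    "jaywalker_at_night": "brake",
    "construction_cones": "slow",
}

def memorizing_model(scenario):
    # Looks up the exact certification case; falls back to "continue"
    # for anything it has never seen.
    return benchmark.get(scenario, "continue")

# Perfect score on the official test...
assert all(memorizing_model(s) == a for s, a in benchmark.items())

# ...but a slightly different real-world variant falls through to the
# default behavior.
print(memorizing_model("stopped_firetruck_in_fog"))  # continue
```

The point is that a fixed, published test certifies performance on the test, not on the open-ended distribution of real roads.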

But yes, western society is currently in a hellscape of refusing to do even basic regulation of any new technology, so it'll probably be a good 20 years of murder robots on the streets before anything gets written down.

[–] FoxBJK@midwest.social 39 points 1 year ago (1 children)

By “western society” do you mean the US? Because the EU doesn’t seem to have any qualms about regulating new technologies. That seems to be a uniquely American thing.

[–] DarthBueller@lemmy.world 7 points 1 year ago

Which somehow means that Europeans suddenly have headlights that make sense while we're over here dying from aftermarket HIDs that should be treated like the VA Highway Patrol treats radar detectors (rip 'em out and smash them with a sledgehammer on the side of the road).

[–] ayaya@lemdro.id 17 points 1 year ago* (last edited 1 year ago) (1 children)

To be fair we already have giant metal murder boxes zooming around on the streets. If AI kills even a single person everyone flips out even though over 40,000 people die every year in the US from car accidents. And that is just the deaths, not including injuries. Yet I don't really see anyone calling for more regulations on driving tests for humans.

People want AI to somehow be perfect when in reality as long as AI is even 1% better than humans that's saving over 400 lives per year. AI doesn't get sleepy, distracted, drunk, etc. so it probably already is at least 1% better in most situations. Humans are horrible drivers.
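The arithmetic behind that claim is easy to check; a quick sketch using the ~40,000 figure cited in the comment:

```python
# Back-of-the-envelope check of the claim above. The 40,000 figure is
# the annual US road fatality count cited in the comment.
annual_us_road_deaths = 40_000
improvement = 0.01  # an AI fleet just 1% safer than human drivers

lives_saved = annual_us_road_deaths * improvement
print(lives_saved)  # 400.0
```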


But yes western society is currently in a hellscape of refusing to do even basic regulation

US regulations are only written in blood or money. The United States was built on the backs of slaves, and then wage-slaves. Literal graveyards filled with workers.

I'm not disagreeing with you, I just found this comically at odds with history... i.e., it's always been a regulation hellscape.

[–] NeoNachtwaechter@lemmy.world 5 points 1 year ago (2 children)

But yes western society is currently in a hellscape of refusing to do even basic regulation

Only the American country.

We Europeans have been scratching our heads for a very long time: why are they letting these guys do whatever they want?

[–] echo64@lemmy.world 3 points 1 year ago

Not really. The EU does more than most western nations, but it generally regulates things ten years too late, and only a tiny amount compared to what society actually needs. So: better, yes, but massively lax compared to the need, and to other periods.

[–] nxfsi@lemmy.world 36 points 1 year ago (2 children)

I don't think mandating lidar specifically by name is right, seeing as computer vision is definitely a software problem. Instead they should mandate some method to detect objects in any light condition + a performance standard, which in practice during certification could mean lidar. Regulations should be as minimal and specific as possible.
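A performance standard of that kind could be as simple as a per-condition detection floor. A minimal sketch, where the 0.99 threshold and the trial numbers are purely illustrative:

```python
# Ends-based certification sketch: require a minimum object-detection
# recall in every lighting condition, without mandating any particular
# sensor. The floor and the trial figures are invented for illustration.
MIN_RECALL = 0.99

def certify(results):
    """results maps condition -> (objects detected, objects present)."""
    recalls = {cond: det / total for cond, (det, total) in results.items()}
    passed = all(r >= MIN_RECALL for r in recalls.values())
    return passed, recalls

trial = {"daylight": (995, 1000), "night": (992, 1000), "fog": (978, 1000)}
passed, recalls = certify(trial)
print(passed)  # False: fog recall is 0.978, below the floor
```

In practice, hitting a floor like that in fog and darkness may well require lidar or radar, but the rule itself stays sensor-agnostic.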

[–] GenderNeutralBro@lemmy.sdf.org 26 points 1 year ago

Good point. Mandate the ends rather than the means. If they get better functionality with some new tech in a few years, we don't want outdated regulations holding the industry back.

[–] NeoNachtwaechter@lemmy.world 6 points 1 year ago* (last edited 1 year ago) (1 children)

computer vision is definitely a software problem.

No, it isn't.

If it were only software, don't you think Tesla should be the best of them all, being the pure software shop they are?

But it is a real-world problem: recognizing real objects in real-world conditions like weather, natural and artificial light, temperatures (want some ice on your camera?), winds and storms, all kinds of unforeseen circumstances, other bad drivers, police and firemen...

And that's why that pure software shop is so bad at it, while all the real carmakers shrug... they are used to it since forever.

[–] zurohki@aussie.zone 3 points 1 year ago (3 children)

You can be the best in the world and still not be good enough.

Driving a car around using a dozen cameras pointing in every direction isn't something that's fundamentally impossible. We just can't do it yet.

[–] CmdrShepard@lemmy.one 4 points 1 year ago (1 children)

And don't forget vision is what humans use for navigation as well.

And a lot of them are not good at it

[–] SuperSleuth@lemm.ee 11 points 1 year ago (9 children)

Should a self-driving car face more rigorous tests than actual human drivers? Honest question.

[–] IphtashuFitz@lemmy.world 14 points 1 year ago (2 children)

Yes. A human brain can handle edge cases it’s never encountered before. Can a self driving car?

  • Ever stop at a red light only to have a police officer wave you through?

  • Ever encounter a car driving the wrong way down a one way street?

  • Ever come across a flooded-out stretch of road? (If the road has no lines and the water is still, it can look very deceptive.)

These are a tiny number of things I’ve encountered over the past few years. I’m sure plenty of other drivers can provide other good examples. I’d want to know how a self driving car would handle itself in situations like these.
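A regulator could encode exactly those situations as a scenario suite. A hypothetical sketch, where the scenarios, expected behaviors, and the stand-in planner are all invented (a real test would run in simulation or on a closed course):

```python
# Hypothetical scenario-based test harness for edge cases like those
# listed above. The "planner" here is a deliberately naive stand-in.

EDGE_CASES = [
    ("officer_waves_through_red", "proceed"),   # obey the officer
    ("wrong_way_driver", "pull_over"),          # get out of the way
    ("flooded_road", "stop"),                   # don't drive into water
]

def naive_planner(scenario):
    # A planner that only knows "red means stop" handles none of these.
    return "stop" if "red" in scenario else "proceed"

failures = [(s, want) for s, want in EDGE_CASES
            if naive_planner(s) != want]
print(len(failures))  # 3: the naive planner fails every edge case
```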

[–] FoxBJK@midwest.social 13 points 1 year ago (1 children)

Human drivers should be facing more rigorous testing regardless. It’s horrifically easy to get a license… and then they never test you again for the rest of your life. That’s just insane when you think about it. My test was in 2002. Feels like I should have to retake it at some point.

[–] TenderfootGungi@lemmy.world 5 points 1 year ago

And take them away for bad driving. But we don’t because our entire transportation infrastructure, outside of a few cities namely NY, is built around everyone driving a car.

[–] snooggums@kbin.social 8 points 1 year ago

Yes because each person must learn on their own and have limited experience relative to the general public as a whole.

Self driving cars can 'learn' from all self driving cars and don't get tired, forget, or anything like that. While they shouldn't be held to perfection, they should absolutely be held to a higher standard than a human.

[–] NeoNachtwaechter@lemmy.world 4 points 1 year ago

Should a self-driving car face more rigorous tests than actual human drivers? Honest question

First: none of these automated cars would pass a German driver's license test. By far.

Second: of course you cannot compare tests for humans with tests for machines.

[–] Cheers@sh.itjust.works 5 points 1 year ago

Throw in some potholes, a child pedestrian crossing the street, etc., and they'd even come out with a powerful marketing ad.

[–] tony@lemmy.hoyle.me.uk 3 points 1 year ago

Pretty much what the UNECE did... there are standards for these things. Tesla doesn't meet them, which is why FSD "beta" is still "seeking regulatory approval" in the rest of the world.

[–] madcaesar@lemmy.world 76 points 1 year ago (5 children)

When I found out teslas don't have LiDAR I nearly shat myself.

[–] jonne@infosec.pub 39 points 1 year ago

Yeah, fun stuff happens when the AI tries to interpret camera footage with sunlight shining straight into the lens.

[–] TigrisMorte@kbin.social 20 points 1 year ago (2 children)

Don't you know those things cost money!?

Sadly, cost-cutting MBAs seldom concern themselves with silly things like function or necessity.

[–] Mamertine@lemmy.world 23 points 1 year ago (1 children)

That wasn't an educated MBA who cut them; it was a stupid CEO who felt it was an unnecessary "crutch" (his words).

[–] dinckelman@lemmy.world 18 points 1 year ago (1 children)

Have you seen how much these cars cost? For that amount of money it should personally serve me breakfast in bed, let alone have a scanner.

[–] wagoner@infosec.pub 12 points 1 year ago (1 children)

They did, but then Musk had the genius idea to stop installing them. I still have it in my older model, but they changed the software to stop using them. Like I said, genius...

[–] notfromhere@lemmy.one 5 points 1 year ago (1 children)

Which car do you have that has LiDAR in it? Maybe you’re thinking of RADAR which is different.

[–] wagoner@infosec.pub 3 points 1 year ago

Oh, you're right, it's radar, sorry.

[–] fluxion@lemmy.world 72 points 1 year ago (1 children)

Still pushing Full Self-Driving even while they dodge lawsuits by claiming it's just highway cruise control that customers are abusing.

[–] CmdrShepard@lemmy.one 13 points 1 year ago

People commonly confuse Autopilot and the FSD beta. One is advanced cruise control and comes on all models, while the other is supposed to be autonomous driving and costs $15k extra.

[–] arefx@lemmy.ml 58 points 1 year ago (3 children)

Yeah I'm not getting in a car driven by AI. The tech ain't where it needs to be for that.

[–] CoderKat@lemm.ee 28 points 1 year ago

It needs to be regulated to hold manufacturers responsible when their software isn't good enough. My understanding is that there already probably is enough regulation and government agencies just need to hold Tesla accountable.

Personally, I'm all for cars driven by AI iff it's better and safer than a human driver. Human drivers make a lot of mistakes and driving is the most dangerous everyday activity many people do. But if the AI isn't better than a human, that's a problem. I don't need AI drivers to be flawless, as that's an unrealistic bar. I just need them to be undeniably better than humans. Everything I'm hearing about Tesla's self driving is that they aren't.
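"Undeniably better" is a statistical claim, not an anecdotal one. A rough sketch of the comparison that would settle it, with every figure invented for illustration:

```python
# Compare crash rates per million miles rather than raw counts or
# headlines. All numbers below are made up for illustration only.

def crash_rate(crashes, million_miles):
    return crashes / million_miles

human = crash_rate(crashes=2_000, million_miles=1_000)  # 2.0 per M miles
ai = crash_rate(crashes=15, million_miles=10)           # 1.5 per M miles

print(ai < human)  # True here, but 10 million miles is far too small a
                   # sample to call the difference "undeniable"
```

The hard part is that a single dramatic AI crash is news, while 2,000 routine human crashes are background noise, which is exactly the asymmetry the comment above describes.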

[–] DSX@lemm.ee 18 points 1 year ago (1 children)

Especially Tesla. I am very into computer vision research but I would never trust a vehicle that relies on only that with 0 LIDAR or other sensing technologies in place.

[–] mriguy@lemmy.world 14 points 1 year ago (1 children)

Unfortunately you can still be hit by cars that idiots let the AI drive.

[–] uriel238@lemmy.blahaj.zone 37 points 1 year ago (9 children)

When a service is willing to take responsibility for collisions and driving violations, then we'll know it works. If the guy asleep at the wheel (which he allegedly can do in an autonomous car) is still the one held responsible, then we're not there yet.

That said end-to-end AI totally sounds like equivocal marketing buzz.

[–] danhab99@programming.dev 3 points 1 year ago (1 children)

When a service is willing to take responsibility for collisions and driving violations

Devil's advocate: it's kinda hard to pin the responsibility on Tesla when at the end of the day there was a person driving and the driver's always responsible.

I'm not disagreeing with you, I'm on team ban-human-drivers

[–] agitatedpotato@lemmy.dbzer0.com 28 points 1 year ago (1 children)

You wouldn't let chatGPT drive a car would you?

[–] gens@programming.dev 7 points 1 year ago

DriveGPT. Written by ChatGPT, proompted by ChatGPT. Powered by Nvidia™©®.

[–] 1bluepixel@lemmy.world 25 points 1 year ago (2 children)

Gonna take the concept of AI hallucinations to a whole new level.

[–] douglasg14b@lemmy.world 4 points 1 year ago

You mean LLM hallucinations?

Typical ML models are not hallucinating in the same manner.

[–] chimasterflex@lemmy.world 3 points 1 year ago

"There was a ghost! This is ectoplasm!"

[–] twhite@lemmy.ml 22 points 1 year ago* (last edited 1 year ago) (1 children)

Not a fan of Tesla or Musk, but can we differentiate the broad public understanding of the term AI from machine learned control systems? People anthropomorphize the situation into thinking there is an I, Robot style driver enough as it is.

Counterpoint, though, maybe doing so encourages skepticism of Tesla's capabilities.

[–] ThatWeirdGuy1001@lemmy.world 6 points 1 year ago

It's not AI it's VI.

Virtual intelligence is not artificial intelligence.

[–] BeautifulMind@lemmy.world 13 points 1 year ago* (last edited 1 year ago)

As a cyclist I really do look forward to the day where good AI is consistently better than the average-to-worst drivers out there; the bar is depressingly low and the stakes are high.

I write (and test) software for a living and my experience with Tesla as a consumer device is that it's many generations away from being something I would trust.

Also, I've seen what happens to product quality when management overrides its engineers the way Elon does: we get pre-alpha quality out in the wild, being tested on a public that didn't sign up for that shit.

[–] Chozo@kbin.social 8 points 1 year ago

That title and thumbnail are pure poetry.

[–] 1984@lemmy.today 3 points 1 year ago

Felon Musk speaks again.
