this post was submitted on 01 Nov 2025
505 points (97.4% liked)
Not The Onion
At some point we have to accept vehicular deaths given how car-centric our society is and how distracted and unsafe a lot of drivers have become.
Normal taxi drivers kill people.
Normal truck drivers kill people.
Normal commuters kill people.
If a robotic taxi can lower the taxi category of accidents by 91% across the board, including death rates, then that's a positive improvement to society any way you slice it. Not saying it isn't a horrifying dystopian world we're potentially building, but at the moment, given the numbers, it would be 91% safer in that category.
The ultimate solution is to shift towards more public transit options in general, and away from individual vehicular transport. Not only is it a massive burden to the environment, but it's a massive cost burden to the individuals and society as a whole.
I agree, the consequences should be severe.
With that said, airlines kill people, and chiefly it results in nothing more than a fine paid out to the families.
You need to prove this number. Looking at the behavior of current driverless cars, the software is still shit, and nothing has reached Level 5 Autonomous Driving. There are too many edge cases and conflicting behaviors. Navigating a world of humans driving in different ways on complex urban and rural streets is a very, very messy affair.
Hell, nobody in the space can even answer this simple question correctly: If the speed limit is 55 MPH on the highway, and everybody is going 65 MPH, and we know that the delta of speed is what kills people in highway car accidents, what speed does the driverless car use?
(Hint: the correct answer is not 55.)
i think people are much worse drivers than you think they are… you just hear about every self driving accident because it’s newsworthy right now
apparently
https://financebuzz.com/self-driving-car-statistics-2025
not a primary source, but their data seems to be from the NHTSA
yeah… very much public health attitude
The "if" in this sentence is a load bearing word.
With today's crew running the policy, I don't think anyone will prevent corporations from unleashing completely unsafe robotic taxis on the public that perform far worse than regular ones. I really wish people would stop making this argument to the corporations' benefit until we have some data backing it up.
I get that there's a theoretical possibility that still imperfect robotic taxis could outperform humans, but that's just theoretical.
With the way corporate accountability is handled nowadays (i.e., corporations aren't held accountable), I just don't see robotic taxis as much more than an accountability sink, and at this point I'd prefer taking regular taxis because at least there is someone to fucking hold accountable when things go wrong.
also, who's getting the most injured? pedestrians, or occupants?
if the net rate of injuries increases among a vulnerable group, that is not okay
Watch “Upload” on Prime. It's literally about this.
Except there's a difference between a machine killing somebody because it was programmed to and a person killing somebody by accident. One of those involves decision-makers who are not going to be held responsible.
The other problem is that it creates openings for malicious actors: if your government (or Saudi Arabia, or Israel, for instance) wanted to kill a political dissident, they could add a self-erasing line of code to a car to run over a specific person.
This is why self-driving laws need to be explicit about how they're approaching this; otherwise you're inviting a lot of suspicious behavior by amoral companies. There need to be safeguards on how and by whom self-driving code can be accessed.
I would say that the source code for any self-driving or autonomous machine on a public street should be held by insurance companies or a third party that performs regular validation checks on vehicle code (which could be read and validated at charging stations or gas stations), and it should only be edited by publicly licensed software engineers whose licenses can be revoked for bad behavior.
Anything less is inviting a series of predictable public safety fiascos.
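To make the validation idea above concrete: at its simplest, a third-party check like this amounts to comparing a cryptographic digest of the software actually running on the vehicle against a digest on file with the validator. This is only an illustrative sketch; the registry name, build IDs, and function here are hypothetical, not any real AV or insurer API.

```python
import hashlib

# Hypothetical registry held by an insurer or independent third party:
# build ID -> SHA-256 digest of the approved software image.
APPROVED_BUILDS = {
    "av-stack-4.2.1": "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9",
}

def validate_vehicle_software(build_id: str, image: bytes) -> bool:
    """Return True only if the presented image matches the digest on file."""
    expected = APPROVED_BUILDS.get(build_id)
    if expected is None:
        # Unregistered build: fail closed rather than trust it.
        return False
    actual = hashlib.sha256(image).hexdigest()
    return actual == expected

# A tampered image (e.g. one with an extra "self-erasing" line slipped in)
# produces a different digest and fails the check.
```

A real scheme would use signed attestations from the vehicle rather than raw image bytes, but the principle is the same: any edit not made through the licensed, audited channel changes the digest and is detectable at the next check.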