this post was submitted on 20 Dec 2023
689 points (97.8% liked)
Technology
How dare a company try to work with governments to create a new safety feature!
How is this a safety feature though? Are they saying we have to be extra careful around self-driving cars? If so then the car shouldn't be considered to be self-driving. If not, then what's the use?
I see a lot of people in this thread saying a car that needs any kind of indication of self-driving isn't safe enough to be on the road, but that implies a single answer to questions like "is it safe enough?" In reality, different people will answer that question differently and their answer will change over time. I see it as a good thing to try to accommodate people who view self-driving cars as unsafe even when they are street-legal. So it's not really a safety feature from all perspectives, but it is from the perspective of people who want to be extra cautious around those cars.
Personally, I see an argument for self-driving cars that aren't yet as safe as an average human driver. It's basically the same reason you sometimes see cars with warning signs about student drivers: we wouldn't consider student drivers safe enough to drive, except that it's a necessary part of producing safe drivers. Self-driving cars are the same, except that instead of individual drivers, it's the self-driving technology that we expect to improve and eventually become safer than human drivers.
Another way to look at it is that there are a lot of drivers who are below average in their driving safety for a variety of reasons, but we still consider them safe enough to drive. Think of people who are tired, emotional, distracted, ill, etc. It would be nice to have the same warning lights for those drivers too, but since that's not practical, having them only for self-driving cars is better than nothing.
Different regulations apply to the driver depending on whether the car is driving autonomously or being controlled by a human.
These lights don't indicate driver assists like Tesla's Autopilot, but full autonomy at SAE Level 3 and above. At Level 3, for example, Mercedes - not the driver - is liable for damages resulting from accidents.
Also, at Level 3 the driver may legally use their phone, which would normally be illegal and earn them a ticket.
So there IS a legal need to be able to tell a car's autonomy level from the outside.