darkmugglet

joined 1 year ago
 

Look, I get it. Docker started the whole movement. But if you're an OSS software vendor, do your users a solid: don't use Docker Hub for image hosting. Between ghcr.io (GitHub), Quay, and others, there are plenty of free choices that don't impose rate limits on users. Unless you want to drive subscription money to Docker, FOSS projects should publish on registries that don't rate limit.
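For projects that want to move, mirroring an existing image to ghcr.io is just a retag and push. This is a minimal sketch; `myorg/myproject` is a placeholder name, and it assumes you've already authenticated with `docker login ghcr.io` using a GitHub personal access token with the `write:packages` scope.

```shell
# Retag the locally built image for GitHub Container Registry
# (myorg/myproject is a hypothetical image name)
docker tag myproject:latest ghcr.io/myorg/myproject:latest

# Push it; users then pull from ghcr.io with no Docker Hub rate limits
docker push ghcr.io/myorg/myproject:latest
```

In CI, GitHub Actions can push to ghcr.io using the built-in `GITHUB_TOKEN`, so no extra credentials are needed.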

[–] darkmugglet@lemm.ee 1 points 1 year ago

You're missing the point -- with a human driver there is accountability. If I, as a human, cause an accident, I have either criminal or civil liability. With automation, the question of "who is at fault" gets murky. Then you have the fact that Tesla is not obligated to report the crashes. And the failure modes of automated driving are very different from human error.

I don't think anyone is suggesting that we ban autonomous driving. But it needs better oversight and accountability.

[–] darkmugglet@lemm.ee 5 points 1 year ago* (last edited 1 year ago)

IMO, this is the problem. Any normal person doing this would be in prison. Something like automated driving should be strictly regulated. I own a Mach-E, and while its self-driving features are limited, it errs so far on the side of caution that you have no choice but to pay attention to the road. As it should be.

[–] darkmugglet@lemm.ee 3 points 1 year ago (1 children)

For me, the problem is one of justice. If I, as a meat sack, kill someone, I am liable and most likely criminally liable for it. When AI commits manslaughter, then what? The company keeps the financial incentive but very little of the legal exposure, because liability is outsourced to the owner. Effectively, the human operator trusting Evil Corp gets a raw deal.

IMO, each version of the software should have to earn a license, the way a human driver does.

42
submitted 1 year ago* (last edited 1 year ago) by darkmugglet@lemm.ee to c/technology@beehaw.org
 

More or less, Tesla's Autopilot is not as safe as Tesla would have you believe.