@LittleLordLimerick
@lemm.ee
I don't think we should prosecute the programmer, but I do think the manufacturing company should be liable.
I believe the manufacturer should be liable for damage caused by their product due to manufacturing defects and faulty software. This incentivizes manufacturers to make the safest product possible to reduce their liability. If it turns out that it's not possible for manufacturers to make these cars safe enough to be profitable, then so be it.
I am not saying we should exempt autonomous vehicle manufacturers from regulation. I'm actually saying the opposite: that we need to base any decision on a rigorous analysis of safety data for these vehicles, which means the manufacturers should be required to provide said data to regulatory agencies.
So then you've just circled back around to what I originally said: is it actually true that you're at more risk near a Tesla than you are near a human driver? Do you have any evidence for this assertion? Random anecdotes about a Tesla running a light don't mean anything because humans also run red lights all the time. Human drivers are a constant unknown. I have never and will never trust a human driver.
I'll be honest here, I hate cars and the car-centered culture of the USA. I care way more about the victims of bad/careless/drunk/distracted drivers than I do about the bad/careless/drunk/distracted drivers themselves.
If me being in a self-driving car means other people around me are more safe, then it's not even a question.
Your concern seems to be for the pilot of the car that causes the accident. What about the victims? They don't care if the car was being driven by a person or a computer, only that they were struck by it.
A car is a giant metal death machine, and by choosing to drive one, you are responsible not only for yourself, but also the people around you. If self-driving cars can substantially reduce the number of victims, then as a potential victim, I don't care if you feel safer as the driver. I want to feel less threatened by the cars around me.
That's not the argument for self-driving cars at all. The argument for self-driving cars is that people hate driving because it's a huge stressful time sink. An additional benefit of self-driving cars is that computers have better reaction times than humans and don't stare at a phone screen while flying down the freeway at 70 mph.
If we find that SDCs get into, say, 50% fewer serious accidents per 100 miles than human drivers, that would mean tens of thousands fewer deaths and hundreds of thousands fewer injuries. Your objection is that it's not good enough because you demand zero serious accidents? That's preposterous.
When we're talking about public safety, it should be entirely about statistics. Basing public safety policy on feelings and emotions is how you get 3-hour-long TSA checkpoints at airports that have prevented exactly zero attempted hijackings in the last 22 years.
That was true 20 years ago. Things evolve. No one wants to download and install ten million individual apps for every single thing they do on the internet.
So I hate Elon Musk and I think Tesla is way overhyped, but I do want to point out that singular anecdotes like this don't mean anything.
Human drivers run red lights and crash cars all the time. It's not a question of whether a self-driving car runs a light or gets in a crash, it's whether they do it more often than a human driver. What are the statistics for red lights run per mile driven?