How Safe Are Self-Driving Cars After the First Fatal Accident?

  • Thread starter Dr. Courtney
In summary, a self-driving Uber car struck and killed a pedestrian in Arizona. The small experimental installed base of self-driving cars raises concerns about the technology, and the tragedy will be scrutinized more closely than any previous autonomous-vehicle incident.
  • #211
StoneTemplePython said:
A much stricter question is whether it is criminally negligent to instruct the 'driver' of a quasi-autonomous vehicle not to look at the road.

I think the case can be made that the entire testing protocol is criminal. In Arizona, one can be charged with up to Murder 2 if "Under circumstances manifesting extreme indifference to human life, a person recklessly engages in conduct which creates a grave risk of death and thereby causes the death of another person."

I hope we can agree that if Uber calculated that the risk of killing someone over the course of their testing was 99%, that is criminal behavior. I also hope that we can agree that if they didn't calculate this risk, began testing anyway, and killed someone, this is also criminal behavior. Given the facts, it is hard to believe that they calculated the risk: it's hard to believe that disabling safety systems and giving the "safety driver" tasks that would preclude being a safety driver could run an acceptable risk.

Waymo is much more forthcoming about their safety data, and can make a compelling case that they have correctly evaluated the risks and that they are acceptable. Uber has not been as forthcoming. Oh, and they plan to resume testing this summer - just not in Arizona.
 
  • Like
Likes russ_watters and Bystander
  • #212
Vanadium 50 said:
Waymo is much more forthcoming about their safety data, and can make a compelling case that they have correctly evaluated the risks and that they are acceptable. Uber has not been as forthcoming. Oh, and they plan to resume testing this summer - just not in Arizona.

You are closer to the facts than me on this. It sounds about right -- and concerning.
Vanadium 50 said:
I hope we can agree that if Uber calculated that the risk of killing someone over the course of their testing was 99%, that is criminal behavior. I also hope that we can agree that if they didn't calculate this risk, began testing anyway, and killed someone, this is also criminal behavior. Given the facts, it is hard to believe that they calculated the risk: it's hard to believe that disabling safety systems and giving the "safety driver" tasks that would preclude being a safety driver could run an acceptable risk.

Well, here's something that may not have been said before:

Your point requires more thought on my end, but unfortunately I need to go bowling now. (And watch Rockets game.)

I'll revert tomorrow.
 
  • #213
StoneTemplePython said:
unfortunately I need to go bowling now. (And watch Rockets game.)
Gutterballs on both, sorry. :smile:
 
  • #214
russ_watters said:
In this incident, the accident occurred primarily because the driver wasn't watching the road.
No driver watches the road 100% of the time, so labelling what she was doing a distraction then necessarily implicates each and every other driver as being distracted when they read road signs, check the instrument panel, and do all sorts of other things such as talking to passengers or adjusting the climate control. The NTSB's finding - from the video and verbal evidence - that the driver was not watching the road at the time of the incident does not lead to the conclusion that the driver was a primary cause of the incident, or that the incident could have been avoided had the driver been attentive to the roadway, in which case she may have decided not to glance left in the moments before the incident.
 
  • #215
russ_watters said:
they are labeled "distracted" because they aren't looking at the road
Distracted from driving, which entails more than just looking at the road.

Vanadium 50 said:
Uber has not been as forthcoming
Such is the corporate culture at Uber; at the least, they have acquired a reputation for attempting to resist and/or circumvent local laws in running their taxi service.
 
  • #216
Vanadium 50 said:
I hope we can agree that if Uber calculated that the risk of killing someone over the course of their testing was 99%, that is criminal behavior.
I disagree. Every car manufacturer knows that over the course of their product lifetime one of their cars will be involved in a fatal accident with basically 100% probability. Does it make producing these cars criminal behavior? Clearly not. But we can make an even stronger statement: There are cars that do not have an automatic braking capability. I'm sure you can calculate that including such a feature will save at least one life with near certainty. Does it make producing cars without automated braking capability a crime? I doubt that.

I see criminal behavior if the risk of killing someone is significantly increased compared to driving normally. A safety driver who mainly looks at a screen, combined with flawed software*, might lead to such a situation - but we don't know that. How many fatal accidents would human drivers have made over the course of Uber's test? Do you know that number? I don't.

*the car recognized an object on the street 6 seconds in advance. More than enough time to slow down, especially if the object cannot be identified clearly. The disabled emergency braking was not necessary until 1.3 seconds before the impact.
Vanadium 50 said:
I also hope that we can agree that if they didn't calculate this risk, began testing anyway, and killed someone, this is also criminal behavior.
Do you calculate the risk of killing someone every time you drive a car? If not, would that automatically mean you behaved criminally if you got involved in a fatal accident?
 
  • Like
Likes StoneTemplePython
  • #217
mfb said:
Do you calculate the risk of killing someone every time you drive a car? If not, would that automatically mean you behaved criminally if you got involved in a fatal accident?

No, but the government(s) who issued me my driver's license(s) have. They have evaluated my driving and driving history and determined that under specified circumstances the risk is legally acceptable. If I deviate from these circumstances - e.g. turn my headlights off at night - and kill someone, you bet I can be charged with criminal behavior.

mfb said:
I disagree. Every car manufacturer knows that over the course of their product lifetime one of their cars will be involved in a fatal accident with basically 100% probability. Does it make producing these cars criminal behavior? Clearly not.

I disagree, and the courts are on my side. See State of Indiana v. Ford (1980), where Ford Motor Company was charged criminally because of a cost/benefit analysis it did, leading to the determination that it was cheaper to settle with the estimated 360 victims and their families than to spend the $11 per vehicle to prevent these injuries. It is true that Ford was found not guilty, and in the US jurors do not have to explain their reasoning, but post-trial interviews suggest that they were swayed by Ford's attempts to recall the vehicles before the collision in question and by defense claims that this particular accident was unsurvivable in any vehicle.

Under the Indiana precedent, Uber can absolutely be charged. Will they be convicted? Nobody can predict a jury, but the elements that apparently led to a not guilty verdict there seem to be absent here.
 
  • #218
Vanadium 50 said:
No, but the government(s) who issued me my driver's license(s) have. They have evaluated my driving and driving history and determined that under specified circumstances the risk is legally acceptable. If I deviate from these circumstances - e.g. turn my headlights off at night - and kill someone, you bet I can be charged with criminal behavior.
Did Uber violate government regulations?
Vanadium 50 said:
I disagree, and the courts are on my side. See State of Indiana v. Ford (1980), where Ford Motor Company was charged criminally because of a cost/benefit analysis it did, leading to the determination that it was cheaper to settle with the estimated 360 victims and their families than to spend the $11 per vehicle to prevent these injuries. It is true that Ford was found not guilty, and in the US jurors do not have to explain their reasoning, but post-trial interviews suggest that they were swayed by Ford's attempts to recall the vehicles before the collision in question and by defense claims that this particular accident was unsurvivable in any vehicle.
Even if Ford had lost (and it didn't), I don't see how this would be relevant here. Cars kill people. The more cars you make, the more people will be killed by these cars. That is an obvious truth. "There is a high chance it will be involved in at least one fatal accident" is not an argument on its own. You have to consider the relative rate.
 
  • Like
Likes StoneTemplePython
  • #219
mfb said:
relative rate.
Damned high.
 
  • #220
Vanadium 50 said:
I hope we can agree that if Uber calculated that the risk of killing someone over the course of their testing was 99%, that is criminal behavior. I also hope that we can agree that if they didn't calculate this risk, began testing anyway, and killed someone, this is also criminal behavior. Given the facts, it is hard to believe that they calculated the risk: it's hard to believe that disabling safety systems and giving the "safety driver" tasks that would preclude being a safety driver could run an acceptable risk.

I was thinking about this a bit yesterday. I believe @mfb spotted the flaw here.

I find the statement troubling on basic qualitative grounds. For a sufficiently large number of trials (or amount of time), we can get the probability of a fatality arbitrarily close to 1, assuming (loose) independence and a probability of failure in the kth 'trial' given by ##p_k##, where all failure probabilities have a hard lower bound greater than zero. (The conventional case is to assume iid trials, which simplifies things a lot.)

Put differently, your statement here is scale invariant (with respect to the number of 'trials', 'car hours', or 'car miles'), so it can't be meaningful. Working backwards from here, we need a ruler to measure 'too dangerous' or 'reckless' by, and that ruler is given by the probability of regular people causing fatalities 'on accident' while driving (per trial, 'car hour', or 'car mile').
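Here's a minimal numerical sketch of that point in Python. The per-trial failure probability used as the lower bound is a made-up illustrative number, not a real accident rate:

```python
# Sketch: if every 'trial' (mile, hour, trip) has a failure probability bounded
# below by some eps > 0, then P(at least one fatality) -> 1 as trials accumulate.
# The value of eps is illustrative only, not an actual accident rate.

def p_at_least_one(per_trial_probs):
    """P(at least one failure) = 1 - prod_k (1 - p_k)."""
    survival = 1.0
    for p in per_trial_probs:
        survival *= 1.0 - p
    return 1.0 - survival

eps = 1e-6  # assumed hard lower bound on every p_k
for n in (10_000, 100_000, 1_000_000, 10_000_000):
    # worst case for the argument: every trial sits exactly at the lower bound
    print(n, round(p_at_least_one([eps] * n), 5))
```

The printed probability climbs toward 1 as the number of trials grows, which is exactly why "the chance of eventually killing someone is ~100%" holds for any driver, human or machine, and is uninformative without a per-mile or per-hour rate to compare against.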

The public may have different tolerances for fatalities caused by self-driving cars vs. human-driven cars. All the more reason to get the rules, guidelines, safe harbor provisions, etc. on the books via the legislative process.
 
Last edited:
  • #221
Another interesting point is the question of how to evaluate the risk of self-driving cars without self-driving cars on actual roads. Sure, you can simulate millions of scenarios, but so far it looks like the accidents come from the unknown unknowns - scenarios not anticipated in the programming, and correspondingly scenarios a simulation could miss.
I don't think that is a reason to ban all tests. When someone gets a new driver's license the government doesn't have an accurate estimate of the risk from that driver either. The tests done with driverless systems are much more thorough than a 30-minute test drive.
 
  • #222
mfb said:
You have to consider the relative rate.
Even if the relative rate is lower for self-driving cars, I think there is a major flaw that will turn people against such vehicles: there isn't a soul to blame if something goes wrong.

As I see it, finding somebody to blame is quite frequently more important than a lower probability of error. People just don't cope well with this kind of helplessness.
 
  • #223
We have many types of accidents that are not clearly the fault of some person. I don't think one more type will be a big issue.
 
  • #224
Rive said:
Even if the relative rate is lower for self-driving cars, I think there is a major flaw that will turn people against such vehicles: there isn't a soul to blame if something goes wrong.

Once the bugs are worked out, insurance companies will probably lower your rates if you own a self-driving car?
 
  • #225
CWatters said:
Nsaspook... I think that video answers my question.

Looks like it just followed the brightest white line.

Another accident where auto-pilot seems to have followed the white line.
https://twitter.com/LBPD_PIO_45/sta...shes-into-laguna-beach-police-patrol-vehicle/

  • #226
Oops!
 
  • #227
StoneTemplePython said:
If you look at the design and analysis of speed limits for instance, you'll see that policy makers know that more people will die with higher speed limits.
This is not in the same class of risk as what we are discussing, for exactly the reason you underlined: it's a known risk. Every driver, every other driver, every pedestrian, politician, automotive engineer and automotive executive knows that excessive speed causes accidents. And every one of them can take assertive steps to mitigate that risk or choose to accept it as is.

Self-driving cars present *unknown* risks. In many cases nobody knows what the risks are. In this particular incident, however, there was a clear risk that should have been addressed by Uber. But failing that, there was absolutely no way for this particular pedestrian to have known that this particular car/driver carried a vastly higher risk of hitting her than an average car/driver. And while I agree with @Vanadium 50 that a detailed risk analysis should be carried out for these cars/tests (if such testing is even allowed), this particular risk is far too obvious to have needed such an analysis to identify. This specific risk is literally the first two rules of safe driving on the first link I clicked looking for such rules:
https://www.nationwide.com/driving-safety-tips.jsp
  • Keep 100% of your attention on driving at all times – no multi-tasking.
  • Don’t use your phone or any other electronic device while driving.
And just to circle back to your example, the third on the list:
  • Slow down. Speeding gives you less time to react and increases the severity of an accident.
So again: it is in my opinion totally unreasonable - and therefore in my opinion grossly negligent - that Uber did not take appropriate steps to mitigate this risk. And it is even worse for being a risk that Uber forced on an unknowing public.

And really, this should not be debatable. I can't fathom that someone would think it should be acceptable for Uber to be violating basic safe driving rules - and again, it disturbs me that I think I'm seeing people arguing that what Uber did is acceptable. And on just how bad it is, we don't have to argue the minutiae of levels of negligence: people *do* get arrested and charged with forms of homicide for killing people when driving while distracted. It's a real thing.
 
  • Like
Likes Vanadium 50 and berkeman
  • #228
256bits said:
No driver watches the road 100% of the time, so labelling what she was doing a distraction then necessarily implicates each and every other driver as being distracted when they read road signs, check the instrument panel, and do all sorts of other things such as talking to passengers or adjusting the climate control.
I'll be more explicit this time: did you read the report? Because this response is just plain absurd. This driver was "distracted" by definition. What the driver was doing is explicitly described as a distracted driving action.
 
  • #229
mfb said:
I disagree. Every car manufacturer knows that over the course of their product lifetime one of their cars will be involved in a fatal accident with basically 100% probability. Does it make producing these cars criminal behavior? Clearly not. But we can make an even stronger statement: There are cars that do not have an automatic braking capability. I'm sure you can calculate that including such a feature will save at least one life with near certainty. Does it make producing cars without automated braking capability a crime? I doubt that.
Like @256bits, you aren't dealing with the scenario as it actually was: you are not describing what actually happened. What Uber did is *explicitly* illegal in most jurisdictions; in Arizona it just happens not to be on the books as an explicit law, so it is merely explicitly against normal safe driving practice.
mfb said:
I see criminal behavior if the risk of killing someone is significantly increased compared to driving normally. A safety driver who mainly looks at a screen, combined with flawed software*, might lead to such a situation - but we don't know that.
What do you mean? How do we not know it? It's explicitly illegal in most jurisdictions and it's rules #1 & 2 of safe driving! Why? Because it is a significantly higher risk. How much higher? We could probably calculate it, but I'll guess for a start that it's 20,000% riskier. That's based on average drivers being able to avoid this accident 999 times out of 1000, while - based on the fraction of time this driver was looking away from the road - this driver would only have been able to avoid this accident 4 times out of 5.
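Spelling out the back-of-the-envelope arithmetic behind that guess (both per-encounter probabilities are the rough assumptions stated above, not measured values):

```python
# Rough arithmetic behind the "20,000% riskier" guess above.
# Both inputs are assumptions from the text, not measured data.
p_fail_attentive = 1 / 1000  # assumed: an attentive average driver fails to avoid this crash 1 time in 1000
p_fail_distracted = 1 / 5    # assumed: a driver looking away this much fails 1 time in 5

ratio = p_fail_distracted / p_fail_attentive
print(ratio)                                # 200.0 -> 200x the failure probability
print(f"{(ratio - 1) * 100:.0f}% riskier")  # 19900% riskier, i.e. roughly the quoted 20,000%
```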
mfb said:
How many fatal accidents would human drivers have made over the course of Uber's test? Do you know that number? I don't.
Broad/overall statistics have nothing to do with crimes. You can't argue your way out of a speeding ticket by claiming that most of the time you don't speed, even if it is true!
mfb said:
Do you calculate the risk of killing someone every time you drive a car? If not, would that automatically mean you behaved criminally if you got involved in a fatal accident?
That just isn't how it works and you must know this. Rules exist because other people have calculated the risks for us drivers and determined what is and isn't an acceptable risk. You can't get out of a DUI by saying: "It's ok, officer; I've tested myself and I still drive acceptably safely with a 0.85 BAC."
mfb said:
Cars kill people. The more cars you make, the more people will be killed by these cars. That is an obvious truth. "There is a high chance it will be involved in at least one fatal accident" is not an argument on its own. You have to consider the relative rate.
While you put that in quotes, nobody arguing in favor of criminal charges is making that argument; it's a straw man. Maybe you got that from V50's statement that Uber should be doing risk analysis, but this particular case goes beyond risk analysis: Uber engaged in known unsafe and typically illegal behavior. And someone died because of it.
 
Last edited:
  • Like
Likes Vanadium 50 and berkeman
  • #230
mfb said:
We have many types of accidents that are not clearly the fault of some person. I don't think one more type will be a big issue.
What we have here is - so far - about half a dozen new types that are clearly the fault of some people.

However, the specific case we are discussing doesn't even have anything directly to do with self-driving cars. At its core, what we have is a driver who hit a pedestrian because she was looking at a computer when she should have been braking.
 
  • #231
Here are several cases of charges or convictions for manslaughter or similar offenses due to distracted driving:
http://distracteddriveraccidents.com/distracted-driver-gets-manslaughter/
https://abcnews.go.com/US/story?id=93561&page=1
http://www.kctv5.com/story/17587020/teen-charged-with-manslaughter-in-texting-while-driving-case
http://www.foxnews.com/us/2017/08/03/woman-indicted-in-deadly-texting-and-driving-case.html

In some of these cases the distracted act is explicitly illegal and in some cases it is not (all of these involved talking or texting). Based on what I'm seeing, including legal advice such as the below, such charges are pretty much the standard outcome of similar cases:
https://www.pittsburghcriminalattorney.com/can-texting-driving-lead-murder-charge/
"...most cases of death due to distracted driving would be classified as homicide by vehicle..."
 
  • #232
On the TV news just now...

"...accident involved the self-driving mode of the Tesla vehicle..."

http://abc7news.com/tesla-on-autopilot-crashes-into-parked-police-car/3539142/

"Tesla emphasizes that the driver should keep their hands on the steering wheel at all times and pay attention to the road..."

Self driving?

(EDIT -- the quoted text was from the TV news report, not necessarily from the linked news report)
 
  • #233
@russ_watters: What exactly makes you so sure about the higher risk - so sure you put some huge number on it?
Maybe the risk was lower and it was just bad luck? Maybe the risk was lower and it was not bad luck - human drivers would have killed more than one person during the test?

If you are so sure that the risk was higher you must have access to some source I do not. Where did you get that risk estimate from? What is the probability that the car does not correctly brake in such a situation? What is the probability for an average human driver?
You claimed 99.9% for the human driver. Okay, let's go with that. Now we need an estimate for the self-driving car. If you say the risk is 20,000% higher and the human prevents 4 of 5 accidents, you must claim the car would never brake for pedestrians. That claim is blatantly absurd. The car did not perform emergency braking, but no emergency braking was necessary here for several seconds.

russ_watters said:
I can't fathom that someone would think it should be acceptable for Uber to be violating basic safe driving rules - and again, it disturbs me that I think I'm seeing people arguing that what Uber did is acceptable.
If you are a passenger in a car you are not required to pay attention 100% of the time either. I think that is the best analogy here. The car was driving, and the car did pay attention - it just did the wrong action for reasons that are being investigated. So do humans once in a while.
I don't think what Uber did was good - Uber should have *two* entities paying attention (the car and the person in the driver's seat) - but I don't think your arguments do anything to show that.
russ_watters said:
It's explicitly illegal in most jurisdictions and it's rules #1 & 2 of safe driving!
It is illegal if you are the driver. That's the point of driverless cars: You are not the driver any more.
Also, "it is illegal at some other place in the world" doesn't make an action illegal.
russ_watters said:
Cars kill people. The more cars you make, the more people will be killed by these cars. That is an obvious truth. "There is a high chance it will be involved in at least one fatal accident" is not an argument on its own. You have to consider the relative rate.
While you put that in quotes, nobody arguing in favor of criminal charges is making that argument; it's a straw man. Maybe you got that from V50's statement that Uber should be doing risk analysis, but this particular case goes beyond risk analysis: Uber engaged in known unsafe and typically illegal behavior. And someone died because of it.
Vanadium50 was making that argument. Here is the exact statement:
Vanadium 50 said:
I hope we can agree that if Uber calculated that the risk of killing someone over the course of their testing was 99%, that is criminal behavior.
berkeman said:
Self driving?
Teslas are not self-driving.
 
  • Like
Likes StoneTemplePython
  • #234
russ_watters said:
This is not in the same class of risk as what we are discussing, for exactly the reason you underlined: it's a known risk. Every driver, every other driver, every pedestrian, politician, automotive engineer and automotive executive knows that excessive speed causes accidents. And every one of them can take assertive steps to mitigate that risk or choose to accept it as is.

Self-driving cars present *unknown* risks... In many cases nobody knows what the risks are. In this particular incident, however, there was a clear risk that should have been addressed by Uber. But failing that, there was absolutely no way for this particular pedestrian to have known that this particular car/driver carried a vastly higher risk of hitting her than an average car/driver...

Sorry amigo, but this feels like a framing problem, and awfully close to the overly literal perfect-information argument that you occasionally hear from (bad) economists - one that I didn't think you'd buy into...?

It also doesn't hold water. Suppose, for the sake of contradiction, that I actually accept this argument:

I may think, for example, that pedestrians in big cities with dangerously high speed limits may mitigate mortality risks by moving elsewhere, refusing to walk (or even drive, as the speed limits are too high), etc., or "choose to accept it as is". It's all their choice, right? If the individual doesn't agree, he/she could certainly not accept it and move to a different city or state or country.

In this case, same thing: move to one of the numerous places that are known not to allow self-driving cars. Hence there is a contradiction, because the public knows whether self-driving cars are allowed in their city, and if not specifically about their city, then they know whether self-driving cars exist in their state, and if not in their state, then in their country...
russ_watters said:
So again: it is in my opinion totally unreasonable - and therefore in my opinion grossly negligent... And on just how bad it is, we don't have to argue the minutiae of levels of negligence: people *do* get arrested and charged with forms of homicide for killing people when driving while distracted. It's a real thing.

Look, I'm not interested in getting deep into the details of the specific crash. From what I've read from you and Vanadium, I'm not happy about it. My point is that there is a massive gap between almost all forms of negligence and criminal negligence, especially criminally negligent homicide. I've been on more than a couple of conference calls about criminal negligence with very expensive lawyers regarding one of the largest corporate disasters of the last 30 years. And yes, there was a body count greater than this one, and yes, in the US. It's a very high bar to clear. I'm not saying the hurdle is never surmounted. But I don't think you understand the legal issues here, and you are wildly overconfident that this exceeds the hurdle required for criminal negligence. You have a strong opinion on a legal matter, but you don't have a legal opinion... (or anything particularly 'close')
 
  • #235
russ_watters said:
While you put that in quotes, nobody arguing in favor of criminal charges is making that argument; it's a straw man. Maybe you got that from V50's statement that Uber should be doing risk analysis, but this particular case goes beyond risk analysis: Uber engaged in known unsafe and typically illegal behavior. And someone died because of it.
If one looks at the news, there are many instances of products that have caused harm to consumers where no one is charged with a crime.
Examples include choking hazards in children's toys, contaminated lettuce, saws, ladders, and electrocution. Pretty much everything that humans manufacture, build, produce, cultivate, package, service and sell has a casualty risk factor associated with it, either through regular use, defects and/or improper use.
In the natural world, casualties and harm from natural effects are Acts of God, and there is no one to charge or sue for the misfortune.
Once a human has 'touched' a product in any way, shape or form, I agree, there is an assumed responsibility associated with how much human manipulation has gone into the product.
Liability is not criminal if there is no intent to knowingly harm a fellow human, or their possessions.
Liability is still incurred if the product does harm a fellow human or their possessions.
That is where we disagree.
I think the owner of the car should be sued for an unforeseen defect in the vehicle causing harm to a fellow human, but disagree that there was criminal intent.
I say sue the pants off them for the defect. Fix it and make it better.

(I can't wait for when the cars designated as self-driving, which they are not, are put on the market as a taxi service with the same(?) defect and no human supervision. Had this accident not happened in the early (middle, late?) stages of testing, we would be seeing cars driving around without a passenger or driver, being promoted as completely safe. Will we still see that?)
 
  • #236
PS.
If this car does go on the market with the same defect, and labeled self-driving, then I will completely concur - criminal intent.
"Same defect" meaning only slight tweaking, and not a major readjustment of the AI code and sensor interaction.
 
  • #237
mfb said:
Teslas are not self-driving.
Sorry, I'm missing the distinction. What is the difference between self-driving and autopilot?
 
  • #238
mfb said:
What exactly makes you so sure about the higher risk

Cars kill on average 12.5 people per billion miles driven. Uber self-driving cars kill on average 500 people per billion miles driven. You can reject the hypothesis that Uber cars are no more dangerous than human-driven cars at >95% confidence on the numbers we have.

For non-fatal accidents the rate is 6 per million miles driven. Waymo has 25 accidents in 5 million miles total - but if you look at the last three million miles, they have only had one: a rate that is 18x safer.

We can argue about small statistics, but the fact that in one case they are seeing a rate 40x higher and the other 18x lower says something. Uber had a product that is, to the best of our knowledge 720x more dangerous than Waymo's (and yes, 720 might be 500 or 1000), and because they wanted to beat Waymo in the market, tested it on an unsuspecting populace, and sure enough, they killed someone.
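For anyone who wants to check that >95% figure, here's a quick sketch treating fatal crashes as a Poisson process. The ~2 million test miles is inferred from the 500-per-billion figure (one death in about 2 million miles), not from an official Uber disclosure:

```python
import math

# Check of the ">95%" claim, modeling fatal crashes as a Poisson process.
# The ~2 million Uber test miles is inferred from 1 fatality at 500 per billion miles;
# it is an assumption for this sketch, not an official figure.
human_rate = 12.5e-9    # fatalities per mile for human drivers (12.5 per billion miles)
uber_miles = 2_000_000  # assumed autonomous test miles at the time of the crash

expected = human_rate * uber_miles  # expected fatalities if Uber were no worse than human drivers
p_value = 1 - math.exp(-expected)   # Poisson probability of seeing at least one fatality anyway
print(expected, p_value)            # 0.025, ~0.0247 -> below 0.05, rejected at >95% confidence
```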
 
  • Like
Likes atyy
  • #239
As a PS, being legally drunk increases your odds of an accident by a factor of 4-5, depending on what legally drunk is in your jurisdiction. Compare that to a factor of 18.
 
  • #240
In 1921, the first year for which these statistics may have been available, there were 21 fatalities per 100 M vehicle miles, vs. 2016 with 1.18 fatalities per 100 M miles. Is it fair to compare AVs in the early stages of development and public familiarity with vehicles that have been in the public domain for almost a century?

While Uber made a mistake in allowing that vehicle on the road, how do you compare that with current auto manufacturers who allow defective cars to be used while fatalities accumulate? One manufacturer was supposed to have had the policy that it was cheaper to settle lawsuits than repair (recall) the cars affected.
 
  • #241
gleem said:
In 1921, the first year for which these statistics may have been available, there were 21 fatalities per 100 M vehicle miles, vs. 2016 with 1.18 fatalities per 100 M miles. Is it fair to compare AVs in the early stages of development and public familiarity with vehicles that have been in the public domain for almost a century?

Uber presently has a rate 2.4x higher than that.
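Checking that ratio against the figures quoted above (Uber's ~500 fatalities per billion miles from post #238 and the 1921 rate from the quote):

```python
# Quick check of the "2.4x" figure using the rates quoted above.
uber_rate_per_100M = 500 / 10  # 500 fatalities per billion miles = 50 per 100 million miles
rate_1921_per_100M = 21        # fatalities per 100 million vehicle miles in 1921
print(uber_rate_per_100M / rate_1921_per_100M)  # ~2.4
```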

gleem said:
One manufacturer was supposed to have had the policy that it was cheaper to settle lawsuits than repair (recall) the cars affected.

Yes, that was Ford, and they were charged criminally for that, and I was the one who brought it up.
 
  • #242
As a pedestrian in Arizona you are more likely to be killed than in any other state as a percentage of the population: 1.61 fatalities/100 K vs a national average of 0.81. 75% occur in the dark. The four most populous states (NY, CA, TX, FL) have the most fatalities, but AZ, ranked 16th in population, comes in next in fatalities. Together these 5 states account for 43% of national pedestrian fatalities. PA, the fifth most populous state, has only 0.49 fatalities/100K. The county with the most fatalities in the US is Maricopa Co., AZ, which is where Tempe is. Uber could have reduced their exposure to untoward incidents by not testing their cars in AZ.

Governors Highway Safety Report https://www.ghsa.org/sites/default/files/2018-02/pedestrians18.pdf
 
  • Like
Likes atyy
  • #243
gleem said:
Uber could have reduced their exposure to untoward incidents by not testing their cars in AZ.

And yet they chose not to.
 
  • #244
Of course, on the other hand, if they had carried out their tests in Hawaii, this accident would have accounted for 50% of Hawaii's fatalities this year instead of just 0.9% in Arizona.
 
  • Like
Likes atyy
  • #245
berkeman said:
Sorry, I'm missing the distinction. What is the difference between self-driving and autopilot?
A self-driving car is a car that doesn't need a human driver. The Tesla Autopilot controls the speed and steering in some conditions but cannot handle all traffic situations, hence the need for the human to pay attention the whole time. Tesla cars tell you that clearly before you can use the Autopilot. If you don't pay attention as the driver (!) it is your fault.
Vanadium 50 said:
Cars kill on average 12.5 people per billion miles driven. Uber self-driving cars kill on average 500 people per billion miles driven. You can reject the hypothesis that Uber cars are no more dangerous than human-driven cars at >95% confidence on the numbers we have.

For non-fatal accidents the rate is 6 per million miles driven. Waymo has 25 accidents in 5 million miles total - but if you look at the last three million miles, they have only had one: a rate that is 18x safer.

We can argue about small statistics, but the fact that in one case they are seeing a rate 40x higher and the other 18x lower says something. Uber had a product that is, to the best of our knowledge 720x more dangerous than Waymo's (and yes, 720 might be 500 or 1000), and because they wanted to beat Waymo in the market, tested it on an unsuspecting populace, and sure enough, they killed someone.
Numbers! Thanks.
Uber had 2 million miles driven by the end of 2017. Over its own first 2 million miles Waymo had 24 non-fatal accidents, a rate of 12 per million miles, or twice the general rate. If their fatal-accident risk had been elevated by the same factor (25 per billion miles), they had roughly a 5% chance of a fatal accident within those 2 million miles. Maybe they were just luckier. You know how problematic it is to draw statistical conclusions from a single event or the absence of one.

I see two clear conclusions based on the numbers here:
* Waymo reduced its nonfatal accident rate over time, and it is now below the rate of human drivers
* The ratios "Uber fatal accident rate to human fatal accident rate in the first 2 million miles" and "Waymo nonfatal accident rate to human nonfatal accident rate in miles 2 million to 5 million" are significantly different. I'm not sure how much that comparison tells us.

Another thing to note: To demonstrate a lower rate of fatal accidents, Waymo will have to drive ~250 million miles, 50 times their current dataset, assuming no fatal accident.

All this assumes the driving profiles for the cars and for humans are not too different. If the cars drive more or less frequently in the dark, more or less frequently on the highway (lower accident rates, but more of the accidents are fatal), or similar, the numbers might change.
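For reference, here's a sketch of both estimates in this post, treating accidents as a Poisson process (the rate and mileage figures are the ones quoted earlier in the thread; the Poisson model itself is a simplifying assumption):

```python
import math

# Both estimates above, treating accidents as a Poisson process.
# Rate and mileage figures are the ones quoted earlier in the thread.
human_fatal_rate = 12.5e-9      # fatalities per mile, human drivers
human_nonfatal_rate = 6e-6      # non-fatal accidents per mile, human drivers
waymo_first_2M_rate = 24 / 2e6  # Waymo's non-fatal accident rate over its first 2 million miles

factor = waymo_first_2M_rate / human_nonfatal_rate      # ~2x the general non-fatal rate
scaled_fatal_rate = factor * human_fatal_rate           # fatal rate if elevated by the same factor
p_fatal_in_2M = 1 - math.exp(-scaled_fatal_rate * 2e6)  # chance of >= 1 fatality in 2 million miles
print(factor, scaled_fatal_rate * 1e9, round(p_fatal_in_2M, 3))  # 2.0, 25.0 per billion, ~0.049

# Fatality-free miles needed to reject the human fatal-accident rate at 95% confidence:
miles_needed = math.log(20) / human_fatal_rate
print(round(miles_needed / 1e6))  # ~240 million miles, roughly 50x Waymo's dataset so far
```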
 
