How Safe Are Self-Driving Cars After the First Fatal Accident?

  • Thread starter Dr. Courtney
In summary, a self-driving Uber car struck and killed a pedestrian in Arizona. With only a small experimental fleet of self-driving cars on the road, the accident raises hard questions about the technology, and the tragedy will be scrutinized like no other autonomous-vehicle incident before it.
  • #141
OmCheeto said:
I don't have a problem with Jaywalkers. I do have a problem with people who don't look both ways before crossing the street.

I have a major problem with Jaywalkers. What is a 'Jay'?
http://www.todayifoundout.com/index.php/2012/07/origin-of-the-term-jaywalking/
Contrary to popular belief, the term jaywalking does not derive from the shape of the letter “J” (referencing the path a jaywalker might travel when crossing a road). Rather, it comes from the fact that “Jay” used to be a generic term for someone who was an idiot, dull, rube, unsophisticated, poor, or simpleton. More precisely, it was once a common term for “country bumpkins” or “hicks”, usually seen incorrectly as inherently stupid by “city” folk.
 

  • #142
russ_watters said:
the "safety driver" and car should have easily been able to avoid thi

I don't think they had a safety driver. To really be a backup, she needs to be driving all the time, with the inputs disabled, and she needs to be able to reengage at a moment's notice. If she's sitting in the front seat doing other duties, she's not a safety driver.

dipole said:
was a convicted felon who spent time in jail for armed robbery...

Sounds like they fit the corporate culture perfectly. :wink:
 
  • #143
nsaspook said:
I have a major problem with Jaywalkers. What is a 'Jay'?
...

...it comes from the fact that “Jay” used to be a generic term for someone who was an idiot,
...

Ok. You got me there. I was trying to be polite.
 
  • #144

[Image attachment: grim reaper]
  • #145
Some people have claimed the "notasafetydriver" was looking down, and was therefore part of the problem.
I think it would be an interesting experiment for everyone to place a camera pointing at their face while driving, even for a moderate distance.

I came up with this notion yesterday, when I drove the 2000 feet to my corner convenience store to pick up some essentials.
At least twice I had a panic attack, realizing I had been distracted by things to my side (I still don't have a cell phone), taking my eyes off the road for at least a second, while some "Jay"walker might have been in front of me!

According to the Wikipedia article on "Braking distance": "A perception-reaction time of 1.5 seconds, and a coefficient of kinetic friction of 0.7 are standard for the purpose of determining a bare baseline for accident reconstruction and judicial notice; most people can stop slightly sooner under ideal conditions."

As far as I can tell, had I been the driver in the Uber car, I would probably also have smooshed that lady, even in non-autonomous mode.
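Plugging the quoted baseline numbers into the standard stopping-distance formula, d = v·t_r + v²/(2μg), puts some distances on this. A minimal sketch in Python; the 38 mph entry is the speed reportedly involved in the Uber crash and is included purely for illustration:

```python
# Back-of-the-envelope stopping distance: d = v*t_r + v^2 / (2*mu*g),
# using the baseline figures quoted above from Wikipedia.
G = 9.81        # gravitational acceleration, m/s^2
T_REACT = 1.5   # perception-reaction time, s (Wikipedia baseline)
MU = 0.7        # coefficient of kinetic friction (Wikipedia baseline)

def stopping_distance_m(speed_mph):
    v = speed_mph * 0.44704              # mph -> m/s
    reaction = v * T_REACT               # distance covered before braking starts
    braking = v**2 / (2 * MU * G)        # distance covered under full braking
    return reaction + braking

for mph in (25, 38, 55):                 # 38 mph: speed reported for the Uber car
    print(f"{mph} mph -> {stopping_distance_m(mph):.0f} m to stop")
```

At 38 mph that works out to roughly 46 m total, more than half of it covered during the 1.5 s reaction time alone.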
 
  • #146
That said, I am Trevor Noah's mom.

My younger brother once claimed that I drove like a paranoid schizophrenic.
Last time I rode with him, I decided he was a Jay-driver.
 
  • #147
OmCheeto said:
Some people have claimed the "notasafetydriver" was looking down, and was therefore part of the problem.

This GM car won't need any of that.
http://media.chevrolet.com/media/us...Pages/news/us/en/2018/jan/0112-cruise-av.html
General Motors filed a Safety Petition with the Department of Transportation for its fourth-generation self-driving Cruise AV, the first production-ready vehicle built from the start to operate safely on its own, with no driver, steering wheel, pedals or manual controls.
 

  • #148
nsaspook said:
This GM car won't need any of that.
As an old person, I find that kind of uncomfortably "freaky".

Though, having lived through:
1960's: Computers will be cool
through
2010's: Computers are incomprehensibly cool

and having sat in the seat while my sister, another Jaydriver, drove me to the coast. :bugeye::oldsurprised:

I think I would be more comfortable getting in the GM car.
 
  • #149
OmCheeto said:
My younger brother once claimed that I drove like a paranoid schizophrenic.
Last time I rode with him, I decided he was a Jay-driver.

Big deal.
A friend of mine said I drive like Mario Andretti!
 
  • #150
BillTre said:
Big deal.
A friend of mine said I drive like Mario Andretti!

I hope not. :wideeyed:
 
  • #151
BillTre said:
Big deal.
A friend of mine said I drive like Mario Andretti!
Was his name "Karl"?
If so, I know him.

My Karl: a bus rider, not used to being in a car.
 
  • #152
OmCheeto said:
Was his name "Karl"?
If so, I know him.

My Karl: a bus rider, not used to being in a car.
Mani, a guy from India.
He enjoyed my driving.
 
  • #153
I heard on a news report this morning while driving to work that the Uber car was not in self-driving mode at the time of the crash, but I haven't been able to find any reference to that now that I'm at work on my PC. Has anybody else seen anything about this? It was CBS news radio...
 
  • #154
berkeman said:
I heard on a news report this morning while driving to work that the Uber car was not in self-driving mode at the time of the crash

If that came from Uber, that's the same company that blamed (and fired) the safety drivers for running red lights in San Francisco, when in fact it was their software.
 
  • #155
berkeman said:
I heard on a news report this morning while driving to work that the Uber car was not in self-driving mode at the time of the crash, but I haven't been able to find any reference to that now that I'm at work on my PC. Has anybody else seen anything about this? It was CBS news radio...

I've seen nothing about it, but is anyone really surprised that the 'safety' driver was not 100% on the visual task? I'm pretty shocked by the amount of abuse this person has been getting online and in the media over their past history and employment by Uber. We are good drivers when we're vigilant. But we're terrible at being vigilant.
http://journals.sagepub.com/doi/pdf/10.1080/17470214808416738
The General Problem. The deterioration in human performance resulting from adverse working conditions has naturally been one of the most widely studied of all psychological problems. Amongst other possibilities, the stress arising from an unusual environment may be due either to physico-chemical abnormalities in the surroundings or to an undue prolongation of the task itself. This paper is concerned with the latter form of stress, as it has been found to occur in one particular type of visual situation; a later publication will more fully discuss the implications of these and other visual and auditory experiments (Mackworth, 1948).
 
  • #156
If the problem is the result of a LiDAR malfunction, remember why this might have happened to Uber.

https://medium.com/waymo/a-note-on-our-lawsuit-against-otto-and-uber-86f4f98902a1
One of the most powerful parts of our self-driving technology is our custom-built LiDAR — or “Light Detection and Ranging.” LiDAR works by bouncing millions of laser beams off surrounding objects and measuring how long it takes for the light to reflect, painting a 3D picture of the world. LiDAR is critical to detecting and measuring the shape, speed and movement of objects like cyclists, vehicles and pedestrians.

Hundreds of Waymo engineers have spent thousands of hours, and our company has invested millions of dollars to design a highly specialized and unique LiDAR system. Waymo engineers have driven down the cost of LiDAR dramatically even as we’ve improved the quality and reliability of its performance. The configuration and specifications of our LiDAR sensors are unique to Waymo. Misappropriating this technology is akin to stealing a secret recipe from a beverage company.
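(As an aside on the quoted description: LiDAR ranging is plain time-of-flight arithmetic, range = c·Δt/2. A minimal sketch, with the return time invented for illustration:)

```python
# Time-of-flight ranging, the principle behind the LiDAR described above:
# a pulse goes out, reflects, and comes back; range = c * round_trip / 2.
C = 299_792_458.0  # speed of light, m/s

def lidar_range_m(round_trip_s):
    return C * round_trip_s / 2.0

# A return arriving 200 ns after the pulse left means an object ~30 m away.
print(f"{lidar_range_m(200e-9):.1f} m")
```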

https://www.theregister.co.uk/2018/02/09/waymo_uber_settlement/
We have reached an agreement with Uber that we believe will protect Waymo’s intellectual property now and into the future. We are committed to working with Uber to make sure that each company develops its own technology. This includes an agreement to ensure that any Waymo confidential information is not being incorporated in Uber Advanced Technologies Group hardware and software. We have always believed competition should be fueled by innovation in the labs and on the roads and we look forward to bringing fully self-driving cars to the world.

https://www.reuters.com/article/us-...er-self-driving-incident-safely-idUSKBN1H1006
LAS VEGAS (Reuters) - The head of Alphabet Inc’s autonomous driving unit, Waymo, said on Saturday that the company’s technology would have safely handled the situation confronting an Uber self-driving vehicle last week when it struck a pedestrian, killing her.
 
  • #157
HAYAO said:
My condolences to the family who lost one of their important members.

Or simply an unavoidable accident (for example the pedestrian tripped over something and fell on the road)?

She didn't trip. I saw the dash-cam footage. She came out from the shadows, and the car didn't even seem to try to brake, so clearly it didn't identify the lady. The driver was looking down somewhere, not watching ahead, so he missed her too. And it happened very quickly. From the video it seemed that even if the driver had seen her and braked immediately, he could not have stopped the car before it hit her; it was traveling too fast to stop in such a short distance. BUT the impact would have been slower, which might have saved her life, though with severe injuries. As it was, the driver was not attentive and the car didn't see her, so it seems no braking happened before the car hit the lady.
 
  • #158
https://www.theverge.com/2018/3/27/17168606/nvidia-suspends-self-driving-test-uber
Uber has been using Nvidia’s self-driving technology in its autonomous test cars for a while, though the companies only just started to talk about it earlier this year. Uber has said it would use Nvidia’s tech in its eventual self-driving fleets of Volvos as well as the company’s autonomous trucks. But Uber has also halted its AV testing in all the cities in which it operates, and the governor of Arizona suspended the company from testing its self-driving cars in the state “indefinitely.”

https://www.engadget.com/2018/03/27/nvidia-self-driving-virtual-simulation/
 
  • #161
Grands said:
That's kind of off-topic for this thread. This thread is about the failure of the self-driving car to see the pedestrian that it hit at night. The links you posted are for a completely different accident (that happened about 10 miles from me) where the driver hit a center divider. You can start a separate thread about that accident if you like -- it's not clear that the car was in self-driving mode, IIRC.
 
  • #162
One question, I think, is what we mean when we ask how safe self-driving cars are.

I think deaths per mile isn't a very meaningful measure. For example, suppose that in the normal use case of day-to-day driving, an SDC has accidents at a rate 50% lower than a human's, and let's define an accident as any incident in which people or property are damaged. This certainly seems "safe".

However, suppose that in rare events, such as a pedestrian's trajectory intersecting that of the car, the SDC has an accident rate approaching 90% - meaning that in 90% of such rare events, the car will have an accident. And let's assume that for humans this rate is low, comparable to the overall rate of accidents.

If these events are rare enough, then the total accidents per unit mile of an SDC may well be less than that of human drivers. However, if SDCs have known flaws where they will likely fail, such as consistently plowing into pedestrians, then in those particular instances they should be considered extremely unsafe.

Does an SDC need to outperform humans in all possible cases to be safe? If, on occasion, an SDC will plow down pedestrians (including children), but in the most common scenarios it performs better than humans, resulting in overall fewer deaths per mile, is it safe or not?
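To put toy numbers on the mixture arithmetic above (every figure below is invented purely for illustration):

```python
# dipole's point in numbers: the overall accident rate is a mixture of the
# everyday rate and the rare-event contribution, so an SDC can look safer
# per mile while failing badly in a rare event class. All numbers invented.
RARE_EVENTS_PER_MILE = 1e-6   # hypothetical pedestrian-conflict frequency

def accidents_per_mile(base_rate, rare_fail_prob):
    """Everyday accidents/mile plus the rare-event contribution."""
    return base_rate + RARE_EVENTS_PER_MILE * rare_fail_prob

human = accidents_per_mile(base_rate=2e-6, rare_fail_prob=0.02)
sdc = accidents_per_mile(base_rate=1e-6, rare_fail_prob=0.90)

print(f"human: {human:.2e} accidents/mile")  # 2.02e-06
print(f"SDC:   {sdc:.2e} accidents/mile")    # 1.90e-06: better per mile,
                                             # despite failing 90% of rare events
```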
 
  • #163
OmCheeto said:
My younger brother once claimed that I drove like a paranoid schizophrenic
I am surprised that a paranoid schizophrenic would have any chance of passing a driving test.
 
  • #164
rootone said:
I am surprised that a paranoid schizophrenic would have any chance of passing a driving test.

Why? I don't think "inability to parallel park" is listed as a symptom in DSM-5.
 
  • #165
While - a few pages ago I noted that there are 15 pedestrian deaths every day - I was also thinking today that; many cars today have a "breaking override" functionality as part of their safety package where the vehicle applies the brakes on it's own, without driver interaction. This exact technology IS applicable to self-driving - and we are not discussing how many lives have (possibly) been saved by that... I have seen no one promote the data regarding lives saved by this.

Furthermore we have no way of comparing this case to if it had been a human driver...

Taking one (tragic) failure and speculating on the issues based on the video, which we get to stop and analyze, Monday morning quarterdeck style and none of us have direct access to all of the details is counter productive - we are letting our personal opinion and emotion overrule good critical analysis. It falls into populist mindset that results in flat Earth and anti-vaxer movements.

My opinion is based largely on industrial robotics - a MAJOR benefit is Safety - honestly.. humans are AWFUL when it comes to process consistency and reliability - you can not re-train them effectively - they are fatigued, they are distracted, they just do not care... just because YOU think you are better then the average driver... IMO I really do not believe it...

And again - I am NOT for preventing you from driving, mostly because if you are enjoying it your are engaged, but 99% of the driving is just a utilitarian pursuit - get me from point A to point B...
 
  • #166
Vanadium 50 said:
Why? I don't think "inability to parallel park" is listed as a symptom in DSM-5.
Yes, but attempts to avoid collisions with imaginary objects would be.
 
  • #167
dipole said:
Does an SDC need to outperform humans in all possible cases to be safe?
Requiring this would kill thousands or tens of thousands of people.
Imagine the opposite direction: if driverless cars were the default, would we switch to human drivers if they proved better at recognizing polar bears on wet roads but performed worse in all other cases?
 
  • #168
mfb said:
Requiring this would kill thousands or tens of thousands of people.
Imagine the opposite direction: if driverless cars were the default, would we switch to human drivers if they proved better at recognizing polar bears on wet roads but performed worse in all other cases?

Yes, safety is not perfection, but what counts as safe in these cases?
A 'safe' human driver could hit and kill a pedestrian under exactly these conditions and not be charged with a crime or even given a traffic ticket.

https://www.usatoday.com/story/opin...-needs-set-safety-standards-column/460114002/
The risk of premature deployment is well-understood: Public welfare is harmed by the deployment of unsafe technology. However, delayed deployment would deprive society of the improved safety benefits this technology promises. Without guidance, companies will inevitably err both in being too aggressive and deploying prematurely, or in being too cautious and continuing to privately develop the technology even once it is safe enough to benefit the public. Neither scenario is in the public interest.

For the driver safety aspect.
https://www.tesla.com/en_EU/blog/update-last-week’s-accident?redirect=no
In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.
...
Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.
 
  • #169
berkeman said:
I heard on the news report this morning driving to work that the Uber car was not in self-driving mode at the time of the crash, but I haven't been able to find any reference to that now that I'm at work on my PC. Has anybody else seen anything about this? It was CBS news radio...

Maybe this is what you heard. It's logical that the Volvo XC90's built-in driver-assistance system would be disabled so that Uber's system had full control during testing.
https://www.bloomberg.com/news/arti...-suv-s-standard-safety-system-before-fatality
“We don’t want people to be confused or think it was a failure of the technology that we supply for Volvo, because that’s not the case,” Aptiv PLC spokesman Zach Peterson told Bloomberg. “The Volvo XC90’s standard advanced driver-assistance system ‘has nothing to do’ with the Uber test vehicle’s autonomous driving system.”
...
Mobileye, which produces the sensor chips in the safety systems supplied to Aptiv PLC, told Bloomberg it tested the software Monday following the crash by watching the footage of the accident. The company said the software “was able to detect Herzberg one second before impact in its internal tests” despite the video’s poor visual quality.
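For scale, a quick check of what "one second before impact" buys; the ~38 mph speed and the μ = 0.7 braking figure are assumptions carried over from the braking-distance discussion earlier in the thread, not from the article:

```python
# What does one second of warning buy at ~38 mph? (Speed and friction
# coefficient are assumptions from earlier in the thread, for illustration.)
v = 38 * 0.44704                        # m/s, about 17.0
warning_m = v * 1.0                     # distance covered during 1 s of warning
full_brake_m = v**2 / (2 * 0.7 * 9.81)  # distance needed to stop at mu = 0.7

print(f"covered in 1 s of warning: {warning_m:.1f} m")     # ~17 m
print(f"needed for a full stop:    {full_brake_m:.1f} m")  # ~21 m
```

So even a detection one full second out leaves less room than a full stop requires, though braking would have reduced the impact speed considerably.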
 
  • #170
How do people feel about autonomous cars being deliberately allowed to hit things? I'm not talking about people, but things like overhanging tree branches. In parts of the UK there are many single-track roads with passing places. It's quite common for cars to brush overhanging branches on such roads. Is this something that's going to be easy for AI to accommodate? What happens when two autonomous cars meet on such a road? Is anyone testing this, or is it all being done in American cities?
 
  • #171
CWatters said:
How do people feel about autonomous cars being deliberately allowed to hit things? I'm not talking about people, but things like overhanging tree branches. In parts of the UK there are many single-track roads with passing places. It's quite common for cars to brush overhanging branches on such roads. Is this something that's going to be easy for AI to accommodate? What happens when two autonomous cars meet on such a road? Is anyone testing this, or is it all being done in American cities?

That will be tricky for the classifier part of object recognition to integrate into the driving plan, like 'seeing' a flying bird or a floating trash bag in front of the car. I'd be more worried about how quickly human drivers and pedestrians will learn to bully self-driving cars in traffic, to make them give way in just about any situation.

https://spectrum.ieee.org/transport...e-big-problem-with-selfdriving-cars-is-people
It’s not hard to see how this could lead to real contempt for cars with level-4 and level-5 autonomy. It will come from pedestrians and human drivers in urban areas. And people will not be shy about expressing that contempt. In private conversations with me, at least one manufacturer is afraid that human drivers will bully self-driving cars operating with level-2 autonomy, so the engineers are taking care that their level-3 test cars look the same as conventional models.

Bullying can go both ways, of course. The flip side of socially clueless autonomous cars is the owners of such cars taking the opportunity to be antisocial themselves.
[Image from the linked IEEE Spectrum article]


One person walking like this could slow traffic to walking speed, because would you risk a possible collision with a human, if you had designed the automation to always be safe and never take the risks people take every day while driving in traffic?
 

  • #172
+1

In such situations it might be something simple, such as making eye contact with the person or a small gesture, that makes the difference between having to wait and being able to pass.
 
  • #173
CWatters said:
+1

In such situations it might be something simple, such as making eye contact with the person or a small gesture, that makes the difference between having to wait and being able to pass.

I have my own solution for pedestrian bullies.
 
  • #174
I can think of some easy and devious ways to fool driving AI systems. Tricking self-driving cars could become a national pastime, like wearing a cycling jersey with a full-sized STOP sign on the back.

https://arxiv.org/pdf/1802.08195.pdf
Machine learning models are vulnerable to adversarial examples: small changes to images can cause computer vision models to make mistakes such as identifying a school bus as an ostrich. However, it is still an open question whether humans are prone to similar mistakes. Here, we create the first adversarial examples designed to fool humans, by leveraging recent techniques that transfer adversarial examples from computer vision models with known parameters and architecture to other models with unknown parameters and architecture, and by modifying models to more closely match the initial processing of the human visual system. We find that adversarial examples that strongly transfer across computer vision models influence the classifications made by time-limited human observers.

Now that researchers have found trivial ways to hack deep-learned vision systems, they are turning their attention to humans.
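For the curious, here is a minimal toy sketch of the fast gradient sign method (FGSM), one standard recipe for generating adversarial perturbations like those in the paper above. The three-weight logistic 'classifier' is invented for illustration; real attacks target trained deep networks:

```python
# Toy fast-gradient-sign-method (FGSM) demo on an invented logistic model.
# Nothing here is a real vision system; it only shows the mechanism:
# nudge the input in the direction that most increases the model's loss.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([2.0, -1.5, 0.5])   # hypothetical trained weights
b = 0.1                          # hypothetical trained bias

x = np.array([0.4, 0.2, 0.8])    # a benign input ("three pixels")
y = 1.0                          # its true label

# For cross-entropy loss, the gradient w.r.t. the input is (p - y) * w.
p = sigmoid(w @ x + b)
grad_x = (p - y) * w

eps = 0.25                       # small perturbation budget
x_adv = x + eps * np.sign(grad_x)

print(f"clean:       p(y=1) = {sigmoid(w @ x + b):.3f}")      # ~0.73
print(f"adversarial: p(y=1) = {sigmoid(w @ x_adv + b):.3f}")  # ~0.50
```

A perturbation of 0.25 per "pixel" is enough to drop the toy model's confidence from 0.73 to chance, which is the whole point of the attack.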
 
  • #175
You can mess with human drivers in many ways as well. Many of these ways are illegal, and some ways specific to autonomous cars might become illegal too, but overall this is not a big issue today, and I don't see how it would become one in the future.
 
