Autonomous cars aren't perfect, but how safe must they be?

March 17, 2016, by Tom Krisher and Justin Pritchard
In this Wednesday, May 13, 2015, file photo, Google's self-driving Lexus car drives along a street during a demonstration at the Google campus in Mountain View, Calif. As Google cars encounter more and more of the obstacles and conditions that befuddle human drivers, the autonomous vehicles are likely to cause more accidents, such as a recent low-speed collision with a bus. (AP Photo/Tony Avelar, File)

As autonomous car technology rapidly progresses, makers of the cars face the difficult question of how safe they must be before they're ready to move people on highways and city streets.

Right now, companies such as Google, Audi, and Mercedes-Benz are testing the cars in a small number of cities to demonstrate they can be safer than human drivers. They also must figure out what level of risk is acceptable to both government regulators and a potentially skeptical public.

Government statistics show that human mistakes are responsible for 94 percent of the 33,000 traffic fatalities each year. Autonomous cars won't get drowsy, distracted or drunk, so in theory they could eliminate those mistakes and save an estimated 31,000 lives a year.

But as a Valentine's Day fender-bender involving a Google autonomous Lexus and a public bus shows, cars that drive themselves can make mistakes.

"We cannot expect any technology, any solution to be perfect all the time," says Raj Rajkumar, a computer engineering professor at Carnegie Mellon University who has led autonomous vehicle research for 15 years. "We live in a very uncertain world where lots of things happen."

Given that, regulators and would-be passengers may have to accept that the cars will cause a limited number of crashes, including deadly ones, if overall they save thousands of lives.

"We should be concerned about automated vehicles," says Bryant Walker Smith, a University of South Carolina law professor who studies the technology. "But we should be terrified about today's drivers."

Google is testing a fleet of 56 autonomous cars on the streets of Mountain View, California; Austin, Texas; and Kirkland, Washington. The cars have driven themselves almost 1.5 million miles, with a person as backup in the driver seat. The company also uses a simulator to test the cars in a variety of scenarios. Other companies such as Nissan, software firm Cruise Automation and parts suppliers Bosch and Delphi also are testing on public roads. Test cities also include San Francisco, Las Vegas and Pittsburgh.

Chris Urmson, head of Google's self-driving car program, wrote in a January blog post that during the past two years, backup drivers took control 13 times when its cars likely would have hit something. He noted that the rate of human intervention is dropping and he expects it to keep falling.

In the bus crash, Google for the first time admitted its car was at least partly responsible. The computer and human driver assumed the bus would yield as the car moved around sandbags. Instead, the bus kept going and the car hit its side. Google has updated its software.

In about a dozen other crashes on city streets, Google blamed the human driver of the other vehicle.

Google wants to make cars available to the public around the end of 2019, assuming its data shows the time is right for deployment.

A Virginia Tech University study commissioned by Google found that the company's autonomous cars crashed 3.2 times per million miles compared with 4.2 times for human drivers. But the study had limitations. The human figures were increased to include an estimate of minor crashes that weren't reported to police. All autonomous car crashes in California, however, must be reported. The study also didn't include potential crashes that were avoided when human backup drivers took control.

U.S. traffic deaths have declined steadily for most of the past decade, from 43,510 in 2005 to 32,675 in 2014. But estimates show they spiked 9 percent in the first nine months of last year due mainly to increased miles traveled, texting and other distractions.

If autonomous cars are the answer to sharply reducing those figures, they'll first have to gain the public's trust. A January poll by AAA found that three-quarters of U.S. drivers are "afraid" to ride in an autonomous car. A University of Michigan poll found similar results in Japan, China, India and elsewhere.

Unlike Google, which wants to test cars without human drivers, automakers and parts companies are rolling out autonomous features as they are ready.

The AAA poll found that drivers are somewhat comfortable with the individual features of autonomous driving such as automatic emergency braking. Separate studies have shown those features can cut crashes. The Insurance Institute for Highway Safety says autonomous braking alone would prevent 700,000 rear-end crashes per year if installed on all cars.

On Thursday, transportation officials and automakers said they agreed to make automatic braking standard in nearly all cars within the next six years.

General Motors plans to begin testing autonomous cars soon by carrying employees around a technical center near Detroit. Still, John Capp, director of global safety and vehicle programs, sees humans behind the wheel for the foreseeable future.

"We can't afford to tarnish safety by doing experimentation on the roads," he says.

The burden of proof will be on companies to show that the technology is safe, Adam Jonas, a Morgan Stanley auto analyst, told investors recently.

But even with that evidence, some governments may still be reluctant. "No mayor wants to be the first elected official blamed for the death of a citizen by a robot," Jonas wrote.

Yet cities, seeing potential job growth and safety benefits, already are competing for wider use of the cars, he wrote.

In Austin, Mayor Steve Adler says Google cars in his city haven't caused any crashes, and he believes they bring safety benefits. "We don't get perfection with regular drivers either," he says.

Adler envisions allowing the cars in small, controlled areas that expand as the cars prove themselves safe.

Colby Huff, a radio host from Springfield, Illinois, wouldn't ride in one. While others would welcome the car handling a daily commute, Huff doesn't think programmers are infallible.

"There's just too much that can go wrong in something that weighs a ton or so," he says. "It's not worth my family's safety to trust a machine."

Comments

freeiam
2.5 / 5 (2) Mar 17, 2016
"Government statistics show that human mistakes are responsible for 94 percent of the 33,000 traffic fatalities each year."
What else can be the cause?
But seriously, the number of fatal accidents is suggestive and totally irrelevant; it can only mean something when measured against the total number of miles driven.
It could be that self-driving cars perform worse (even in the far future) when the traffic throughput is the same.
But I have the answer to your question (it isn't 42): self-driving cars must pass every engineering test thinkable, and above that each car must acquire a driver's license and thus pass the driver Turing test.
This means for example that the car must be able to respond to instructions the way a human does and cannot drive in a slow or erratic or uncertain way or hit the brake at a random moment.
freeiam
5 / 5 (1) Mar 17, 2016
A much better option, totally ignored in this article is that by far the safest option is to let humans be assisted by this new technology.
Humans are far superior in parsing images and all kinds of other sensory clues, but fail (for example) when they fall asleep. It would be nice if the car assisted the driver when needed, so that in this case it doesn't hit the tree or the car in the other lane.
antialias_physorg
5 / 5 (1) Mar 17, 2016
year."
What else can be the cause?

Mechanical failure. Possibly external electronic failure (all traffic lights at an intersection show green). Wildlife accidents. I'm sure there's a few other possible circumstances that can#t be attrivuted to human failure.

it can only mean something when measured against the total number of miles driven.

Assumption being that with autonomous cars the average travelled miles per person will be roughly the same. Which, I think, is a fair first assumption as you travel for a reason. And the reasons don't change with the introduction of autonomous cars.

This means for example that the car must be able to respond to instructions the way a human does

This makes no sense. How often do you have to actually respond to instructions in your daily commute (or ever) when behind the wheel?
antialias_physorg
5 / 5 (2) Mar 17, 2016
A much better option, totally ignored in this article is that by far the safest option is to let humans be assisted by this new technology.

This has been tried but decreases safety in some instances (e.g. acceleration/deceleration assistants for going onto/off highways). Humans aren't capable of discerning when the car is acting within its specified envelope and when it's not. The additional second until you realize something is *way* off is already too much. Better to leave the driving to the car entirely.
That way the engineers don't have the cop-out of saying "in a difficult situation the user will intervene anyhow".
If that were the case then driving an autonomous car would be no less stressful than driving a regular vehicle - as you would have to watch the road with equal concentration in both cases.

Humans are far superior in parsing images and all kinds of other sensory clues

Not really. Especially since cars can have many senses humans don't (IR, LIDAR, ultrasound).
Scottingham
5 / 5 (3) Mar 17, 2016
People are irrational and stupid. All arguments against (mature) automated driving can be applied against people 4x over.

Personally, I think the 'active safety' features will become standard first, then 'highway autopilot mode', then 'point-to-point' mode. It'll all happen so gradually that even the most scared people will use it more than they expect to now.
julianpenrod
1 / 5 (3) Mar 17, 2016
When it was a matter of shilling for big business that wanted to force people to buy as much expensive garbage as possible, let greedy government collect on fines and fees and protect stoned drivers from the repercussions of their actions and permit the spread of addiction in society, no amount of safety was "enough". Seat belts; then seat harnesses; baby seats; pet seats; specific positioning for safety seats; air bags in the front, then in the sides, which also made up for the fact cars were being built cheaper and less safely; then they started building cars with illegally narrow rear windows, but they "compensated" with expensive rear view closed circuit television. All based on the idea that the worst possible accident is the only kind that will happen. Now, to promote the swindle of "autonomous cars", they're willing to let the public be exposed to what they would have called inexcusable risk before!
antigoracle
1 / 5 (2) Mar 17, 2016
Absolutely no mention of hacking, which we not only know is doable but will happen.
kochevnik
not rated yet Mar 18, 2016
I rode in an Uber shuttle and the girl had four near-accidents in 30 km
antigoracle
not rated yet Mar 18, 2016
I rode in an Uber shuttle and the girl had four near-accidents in 30 km

Congratulations. Did you pay extra for the thrill or was it free?
rgw
1 / 5 (1) Mar 19, 2016
Self-driving cars NEVER! Meteorites will still be unavoidable. For those who doubt autonomous vehicle safety, watch the insurance rates as these safer cars are introduced. If you like paying $400 per month for coverage, drive yourself. If you prefer $40+/- per month, then allow the computer system to take control.
Eikka
not rated yet Mar 20, 2016
This makes no sense. How often do you have to actually respond to instructions in your daily commute (or ever) when behind the wheel?


That's hardly the point. Most drivers have at some point, and being able to respond to instructions demonstrates a level of understanding required to perceive the situation and navigate real traffic in a changing environment.

It's actually an incredibly difficult task for the car's computer to respond to "turn left over there", because it would need to independently assess what "there" is without having someone draw a virtual intersection in its internal navigator and plot a line through it in advance.

That's because the current crop of self-driving cars is like slot cars running on virtual rails. They know almost -nothing- about their environment that isn't explicitly mapped beforehand. They see but they don't perceive, because they don't understand or reason, so they get in trouble when the environment doesn't match their internal map.
Eikka
not rated yet Mar 20, 2016
The self-driving car must be safer than the median driver.

If it's only as safe as the average driver, then it will actually be less safe for most people. The number of accidents might not change, but the number of victims would.

That's because the number and severity of accidents doesn't divide equally among drivers. There are a small number of repeat offenders and bad drivers who are responsible for most of the accidents - people who drive drunk or on drugs, recklessly, with broken vehicles, or are just incompetent, too old, etc. - which puts the average down for the rest.

That's why comparing accident rates per mile driven over entire fleets is also meaningless: most drivers don't have that many accidents per mile.
Duude
not rated yet Mar 20, 2016
How much liability is Google, or whichever company, willing to assume? That's the answer to how safe must they be?
Duude
not rated yet Mar 20, 2016
The self-driving car must be safer than the median driver.



I wouldn't get in a car with the median driver at the wheel. That's a standard for poor drivers. Spare us all and just take the bus.
Eikka
not rated yet Mar 20, 2016
I wouldn't get in a car with the median driver at the wheel. That's a standard for poor drivers. Spare us all and just take the bus.


You're unlikely to be any better than that yourself.

I wasn't talking about the people who actually drive on the median, but the statistical median, or the half-way point in the group of best to worst of drivers.

The median driver is better than the average driver for a very simple reason I already detailed above.

But if you still don't get it, consider the following. Suppose we have ten drivers who get a driving score from 0-10. Suppose the scores are the following:

10,9,9,9,8,8,7,5,3,2

The average driver of this group gets a score of 7, while the median driver gets a score of 8.
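
A minimal Python sketch of that arithmetic, using the hypothetical 0-10 scores from the example above (the list and the variable name are just the commenter's illustration, not real data):

    from statistics import mean, median

    # Hypothetical 0-10 driving scores from the example above
    scores = [10, 9, 9, 9, 8, 8, 7, 5, 3, 2]

    print(mean(scores))    # prints 7   -> the "average driver"
    print(median(scores))  # prints 8.0 -> the "median driver" (midpoint of the sorted scores)

Because the few very low scores drag the mean below the median, a fleet that merely matches the mean is still worse than most individual drivers.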
Eikka
not rated yet Mar 20, 2016
Point being that the car industry, Google, etc. are going to argue that self-driving cars are safer far sooner than they actually are, because they will be counting safety by the mean instead of the median, because that's the lower standard.

Of course it's a moot question because we let completely awful people drive anyways.

It's just something to expect out of corporate business and wishful people, like how the electric car industry started off by measuring range using the least stressful driving cycles possible, which resulted in cars that could technically go 100 miles - at a running pace around a level parking lot.
adam_russell_9615
not rated yet Mar 20, 2016
Insurance companies have a better handle on risk management than anyone. Have any of them committed to insure driverless cars?
Eikka
not rated yet Mar 21, 2016
Insurance companies have a better handle on risk management than anyone. Have any of them committed to insure driverless cars?


Insurance is like a warranty: you pay extra for it, and they'll do everything they can to exempt themselves from paying after the fact.

The risk management they take is not for your risk, but for their risk of losing money.
