Google self-driving car strikes bus on California street

February 29, 2016 by Justin Pritchard
In this May 13, 2015, file photo, Google's self-driving Lexus car drives along a street during a demonstration at the Google campus in Mountain View, Calif. A self-driving car being tested by Google struck a public bus on a city street, a fender-bender that appears to be the first time one of the tech company's vehicles caused an accident. The collision occurred on Valentine's Day and Google reported it to California's Department of Motor Vehicles in an accident report that the agency posted Monday, Feb. 29. (AP Photo/Tony Avelar, File)

A self-driving car being tested by Google struck a public bus on a Silicon Valley street, a fender-bender that appears to be the first time one of the tech company's vehicles caused a crash during testing.

Google accepted at least some responsibility for the collision, which occurred on Valentine's Day when one of the Lexus SUVs it has outfitted with sensors and cameras hit the side of the bus near the company's headquarters in Mountain View, California.

No one was injured, according to an accident report Google wrote and submitted to the California Department of Motor Vehicles. It was posted online Monday.

According to the report, Google's car intended to turn right off a major boulevard when it detected sandbags around a storm drain at the intersection.

The right lane was wide enough to let some cars turn and others go straight, but the Lexus needed to slide to its left within the right lane to get around the obstruction.

The Lexus was going 2 mph when it made the move and its left front struck the right side of the bus, which was going straight at 15 mph.

The car's test driver—who under state law must be in the front seat to grab the wheel when needed—thought the bus would yield and did not have control before the collision, Google said.

While the report does not address fault, Google said in a written statement, "We clearly bear some responsibility, because if our car hadn't moved there wouldn't have been a collision."

Chris Urmson, the head of Google's self-driving car project, said in a brief interview that he believes the Lexus was moving before the bus started to pass.

"We saw the bus, we tracked the bus, we thought the bus was going to slow down, we started to pull out, there was some momentum involved," Urmson told The Associated Press.

He acknowledged that Google's car did have some responsibility but said it was "not black and white."

The Santa Clara Valley Transportation Authority said none of the 15 passengers or the driver of the bus was injured.

The transit agency is reviewing the incident and hasn't reached any conclusions about liability, spokeswoman Stacey Hendler Ross said in a written statement.

There may never be a legal decision on fault, especially if damage was negligible—as both sides indicated it was—and neither Google nor the transit authority pushes the case.

Still, the collision could be the first time a Google car in autonomous mode caused a crash.

Google cars have been involved in nearly a dozen collisions in or around Mountain View since starting to test on city streets in the spring of 2014. In most cases, Google's cars were rear-ended. No one has been seriously injured.

Google's written statement called the Feb. 14 collision "a classic example of the negotiation that's a normal part of driving—we're all trying to predict each other's movements."

Google said its computers have reviewed the incident and engineers changed the software that governs the cars to understand that buses may not be as inclined to yield as other vehicles.
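Google has not published the details of that software change, so the following is only a minimal sketch of the general idea: a hypothetical planner that weighs a class-conditional yield probability before committing to a lane change. The vehicle classes, probabilities, threshold, and the should_merge helper are all illustrative assumptions, not Google's actual code or values.

```python
# Illustrative sketch only: a toy merge decision that conditions an assumed
# yield probability on the type of approaching vehicle. The class names,
# probabilities, and threshold below are hypothetical.

YIELD_PROBABILITY = {
    "car": 0.90,    # assumed: most passenger cars give way to a signaled merge
    "bus": 0.50,    # assumed: buses and other large vehicles yield less often
    "truck": 0.55,
}

MERGE_THRESHOLD = 0.80  # only merge if sufficiently confident the other vehicle will yield


def should_merge(approaching_vehicle_type: str, gap_seconds: float) -> bool:
    """Decide whether to pull into the adjacent lane.

    A large time gap makes the merge safe regardless of the other driver's
    intent; otherwise rely on the (assumed) class-conditional yield odds.
    """
    if gap_seconds > 4.0:  # plenty of room: merge is safe on its own
        return True
    p_yield = YIELD_PROBABILITY.get(approaching_vehicle_type, 0.5)
    return p_yield >= MERGE_THRESHOLD


# With these illustrative numbers, the planner proceeds ahead of a car
# but waits for a bus in an otherwise identical gap.
print(should_merge("car", gap_seconds=2.0))   # True
print(should_merge("bus", gap_seconds=2.0))   # False
```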

Jessica Gonzalez, a spokeswoman for California's DMV, which regulates Google's testing of about two dozen Lexus SUVs in the state, said agency officials spoke Monday with Google but would have no comment. Under state law, Google must retain data from the moments before and after any collision.

"As far as he-said she-said, there shouldn't be any of that. It's all there," said Robert W. Peterson, an insurance law expert at Santa Clara University who has studied self-driving cars.

A critic of Google's self-driving car efforts said the collision shows the tech giant should be barred from putting onto public streets self-driving prototypes it built without a steering wheel or pedals.

Google sees that as the next natural step for the technology, and has pressed California's DMV and federal regulators to authorize cars in which humans have limited means of intervening.

"Clearly Google's robot cars can't reliably cope with everyday driving situations," said John M. Simpson of the nonprofit Consumer Watchdog. "There needs to be a licensed driver who can takeover, even if in this case the test driver failed to step in as he should have."


25 comments


SamB
not rated yet Feb 29, 2016
OMG! This is horrible!
It is time to ban these dangerous unmanned behemoths.
We need legislation to force reliable, safe, and efficient humans back behind the wheel!
PPihkala
not rated yet Mar 01, 2016
Maybe the road authority should also be held partly responsible here. When cars do not fit into their lanes, there will be situations like this that are difficult for any controller, be it human or machine. I think the car should have activated its left turn signal to indicate that it would move into the lane on its left, and then waited for traffic in that lane to stop and make that possible.
bluehigh
not rated yet Mar 01, 2016
* think the car should have activated left turning signal *

> The car did indicate.

* And then waited ... *

> How long?

The speed differential was around 13 mph. The car did not merge safely because it failed to accelerate to the flow speed of traffic. So the bus driver reasonably chose to continue straight, as the bus had the right of way.

The collision avoidance system failed.

If all it takes is a sandbag on the road to overwhelm these cars, then they are worthless toys.

Maybe the bus sounded its horn as a warning. Are you listening, Google?

Eikka
5 / 5 (1) Mar 01, 2016
Google said its computers have reviewed the incident and engineers changed the software that governs the cars to understand that buses may not be as inclined to yield as other vehicles.


Bus drivers don't yield. There's some sort of training or job requirement to be completely mental to keep to the schedule. I can count dozens of times when the driver throws the signal right when I'm at the back wheel of the bus, and pulls in front of me without waiting for me to pass, forcing me to beeline around the front of the bus that's already jutting into the road.

It happens so much that whenever I see a bus at a stop by the road, I slow down and wait to see if maybe they would like to perhaps join the traffic, signal or not, because they will do it. Doesn't matter if it's a 20 or 40 mph zone, or whose right of way it is - they just go.

Eikka
5 / 5 (2) Mar 01, 2016
If all it takes is for a sand bag on the road to overwhelm these cars then they are worthless toys.


The problem is rather that the car doesn't know to distrust other drivers and second-guess them, because it can't read the situation. It assumes that giving a signal in advance and starting to turn means other people will give way, because those are the rules.

They might, but more often than not they won't, and a human driver trying to merge would understand that, because they know how they themselves would behave if the situation were reversed. The oncoming driver, too, understands how the merging driver thinks, and is able to read very subtle behavioural signals to discern whether it's safe to drive on.

The AI lacks this ability to empathize and predict the other drivers, so the actions of other drivers appear random and nonsensical.
Eikka
5 / 5 (1) Mar 01, 2016
Worse still, when it comes to empathy and reading other drivers, the Google Car is sending mixed signals.

For example, the test driver may be looking back and, with that behaviour, signaling to the bus driver, "I see you, I will yield to you," while the car is doing the complete opposite, because it assumes that giving the signal and starting to move tells the bus, "I'm not yielding, slow down."

So the bus driver assumes the human is driving, and will behave according to human psychology, but the AI is driving and disregarding human psychology. Result: the bus clips the car.
antigoracle
not rated yet Mar 01, 2016
Google sounds like a parent making excuses for their teenager. I don't know about the US, but where I am, sandbags don't force SUVs to slide to the left. Perhaps they need to put some Google glasses on that car.
Protoplasmix
5 / 5 (1) Mar 01, 2016
I think it's a safe bet that, had the bus been similarly equipped and operating in self-driving mode, it would have yielded and there would have been no accident. Buses, the record shows, too often don't even yield to pedestrians...
Cave_Man
not rated yet Mar 01, 2016
So why is it that whenever I get into an accident, blame is leveled either squarely on me, regardless of whose fault it was or whether the accident was actually avoidable, or on everyone involved, because the police are too fucking lazy to care?

From every cop I've ever talked to: "Someone has to be at fault in an accident!"
rgw
5 / 5 (3) Mar 02, 2016
The Google car was certainly NOT texting, phoning or eating as the cause of this earth shattering impact.
antialias_physorg
5 / 5 (5) Mar 02, 2016
The thing a lot of you guys are missing is this part:
Google said its computers have reviewed the incident and engineers changed the software that governs the cars to understand that buses may not be as inclined to yield as other vehicles.

When you make a mistake on the road, you learn (or maybe not even that). When this type of car learns from a mistake, EVERY other autonomous car out there learns it as well.

Add to that that what was learned doesn't disappear ... unlike when a human driver hands in his driving license, either because he's too old, dies, has to because of too many infractions, or just doesn't want to or can't afford to drive anymore.

That's a pretty substantial advantage.

Certainly, when these cars become ubiquitous, obstructions like the one mentioned here will be marked as a matter of course (or put in an accessible database), so that cars can route around the problem early.

KBK
5 / 5 (1) Mar 03, 2016
That's the thing about Volvo station wagons and Honda civics.

The majority of Volvo wagons get rear ended due to the cautiousness and slowness of their almost always mature or elderly drivers.

The majority of Honda civic sedans suffer front end accident damage due to being predominantly driven by careless and overly speedy teenagers.

The classic accident is the Volvo wagon rear ended by the civic sedan.

Which is kinda what we have here, in its own way.
Captain Stumpy
5 / 5 (1) Mar 03, 2016
engineers changed the software that governs the cars to understand that buses may not be as inclined to yield as other vehicles
@AA_P
besides being heavier and slower to react (physics is a bitch), the bus already had the right of way... and is mostly NOT ABLE to react like a car

now add the fact that most humans use the same excuses on the road when they violate the right of way to a larger vehicle and get hit (the BUS/Truck/Semi/Tank/Steam-roller etc should have yielded to ME)

why would anyone attempt to argue that this is a fault of the computer program made by humans?

just wondering, really...

From every cop i've ever talked to
@cave_man
it is required for there to be a fault on paper (proving it is what courts and labs do)
when a cop blames everyone, it is because of the lack of clear evidence of a singular violator, and thus it is left to the lab/court or to you to contest the ticket or fault (which you should do if it is not your fault)

antigoracle
not rated yet Mar 03, 2016
The car's test driver—who under state law must be in the front seat to grab the wheel when needed—thought the bus would yield and did not have control before the collision

Failure was due to a faulty HPU.
Eikka
5 / 5 (2) Mar 04, 2016
When this type of car learns from a mistake then EVERY other autonomous car out there learns it as well.


That's a question of what is actually being learned, and whether there is anything to be learned.

If the reason for the accident was an interplay between the two vehicles in a context that is beyond the comprehension of the AI, then there's no hard rule it could "learn" in order to avoid the same situation later, and attempts to codify such a rule will be counter-productive. In this case, the car would become even more hesitant to merge in general, and that would cause traffic jams.

It's like the story of the monkey trainer: whenever the monkey made its business indoors, the owner would push its nose into the mess and throw it out the window as a punishment. Eventually, the monkey would take a dump on the floor, poke it with his nose, and jump out of the window all by itself.
Eikka
5 / 5 (2) Mar 04, 2016
why would anyone attempt to argue that this is a fault of the computer program made by humans?


Because it is?

Of course we are -actually- blaming the designers of the program for not making it good enough, because the overall point is that the programmers don't have a good enough solution to the problem for us to apply it.

We don't have a general AI that could actually drive a car. We have just a specialist program that applies the least possible complexity to the problem, and whenever the programmers find it lacking they add an exception or special amendment like "If you see a bus, don't merge in front of it".

But the number of amendments and exceptions you have to add in order to make a perfectly dumb program navigate the real world reliably and safely is too large. The real world is just filled with exceptions to the rule - what they'd need is an AI that understands what it's doing rather than one that just reacts to disturbances.
antialias_physorg
5 / 5 (3) Mar 04, 2016
etc should have yielded to ME)

That's something that has always bugged me (and which surprises a lot of people): While the rules of the road state very clearly what you should do, there's always the proviso that "safety comes first"
(The surprise occurs when people get into an accident and then are judged 'partially responsible'...which happens in a lot more cases than one would think)

it is because of the lack of clear evidence

Which is another bonus with autonomous vehicles. They log everything they do. So the amount of BS told by the drivers will be reduced to a minimum (a fact the courts will be all too happy with, I'm betting)
Eikka
5 / 5 (1) Mar 04, 2016
I like the comment made by one AI researcher, I forget his name, but he was involved in the analog artificial life "beam bots" that aren't programmed but rather exhibit emergent behaviour.

He said life is trying to create order out of randomness, whereas AI researchers are trying to re-create randomness out of order.

It's a different approach to intelligence. One is coming top down from the general to the specific, whereas the other is trying to build up from the specific to the general. The latter is an uphill battle because you can't actually do it - just as you can't make a mathematical formula to compute a random number - because the whole idea is a contradiction in terms.
Eikka
not rated yet Mar 04, 2016
While the rules of the road state very clearly what you should so there's always the proviso that "safety comes first"


Depends. Sometimes it's "red means stop, green means go, yellow means go really fast". There's no hard and fast rule to driving. See for example traffic in India - it's a complete mess, yet it seems to work. Rather, it seems very rigid systems of rules lead to worse driving because people insist that rules are to be followed.

There are many different takes on, for example, how to turn left. Do you merge at the divider and turn in front of the other car, or do you make a sort of ballet turn behind the car... both work, and sometimes people forget and do the other thing, and most of the time the other driver just gets it and goes along, and there's no accident.

Which is another bonus with autonomous vehicles.


Well, there's also dashcams.
Captain Stumpy
5 / 5 (1) Mar 04, 2016
Depends
@Eikka
not really. the rules of the road are based upon safety and the application is the "courtesy" part of it
the reason there are places where chaos reigns (India, Iraq, etc) is because of the refusal to accept the road laws combined with cultural influences (largest car gets right of way... armored armed vehicles always have right of way, etc)

with people, there is always going to be a little delusion mixed in too... in the US this is seen around any large heavy vehicle. if a Semi/dump-truck/tanker/fire truck/BUS gets into the mix, people tend to wrongly assume that said vehicle is capable of stopping/starting/maneuvering like their own vehicle (or most typical vehicles NOT CDL licensed)... this is a fallacy. it's not that it isn't well known that they CAN'T react the same way... it's just that in the heat of the moment, people typically don't even consider that simple fact

LOVED the AI quote, BTW...
Captain Stumpy
5 / 5 (1) Mar 04, 2016
there's always the proviso that "safety comes first"
@AA_P
yep... that should be the initial concern for anyone driving defensively...
problem is... people tend to NOT drive defensively. more like "offensively" (in more ways than just aggression, too! LOL)

as noted above to Eikka
it is the delusion that takes over and the "heat of the moment" decision where they don't consider logic before they make a move
it is not to say that this is always a bad thing... it is just indicative of the aggressive nature of their driving and past training
the bulk of our driving is done without seriously thinking about it... so the training and experience comes in here, big time! and if you trained and always drive defensively, you tend to be safer; whereas if you think you're one of the Dukes of Hazzard... well... you know!

Skepticus
5 / 5 (1) Mar 05, 2016
All this boils down to this: humans drive with ego and emotion in addition to logic (rules). Autonomous vehicles drive only with the last. Programs for AVs should attempt to take human ego and road rage into account if possible.
jimbo92107
5 / 5 (1) Mar 06, 2016
The biggest problem with self-driving cars is interacting with pushy, sloppy, distracted humans. The AI has to be a better driver than almost all humans are, or else we automatically blame the AI. Fortunately, Google's engineers are doing exactly that - making an AI that is a much better driver than humans.
tear88
not rated yet Mar 06, 2016
Sometimes it's "red means stop, green means go, yellow means go really fast".


That's almost the only thing I remember from Starman. He learned from the human woman.
