Google car crash—who's to blame when a driverless car has an accident?

March 4, 2016 by Joseph Savirimuthu, The Conversation

Luckily no one was injured when one of Google's self-driving cars recently crashed itself into a bus as it pulled out at a junction. The car was only travelling at 2mph, after all. The company has admitted it bore "some responsibility" for the accident because the test driver (and presumably the car) thought the bus would slow down to allow the car to pull out.

Google has now redesigned its algorithms to account for this, but the incident raises the key question of just who is responsible in the eyes of the law for accidents caused by driverless cars. Is it the car's owner, its manufacturer or the software maker? Who would be taken to court if charges were brought? And whose insurance company would have to pay for the damage?

Most modern cars have some technology that operates without human intervention, from air bags and anti-lock brakes to cruise control, collision avoidance and even self-parking. But very few cars have full autonomy in the sense that they make their own decisions. A human driver is usually still in control – although this assumption is increasingly difficult to maintain as advanced driver assistance technologies, such as electronic stability controls, enable drivers to retain control of the vehicle when otherwise they might not.

Driver and company negligence

As things stand, the law still focuses specific car regulations on human drivers. The international Vienna Convention on Road Traffic gives responsibility for the car to the driver, saying "[e]very driver shall at all times be able to control his vehicle". Drivers also have to have the physical and mental ability to control the car and reasonable knowledge and skill to prevent the car harming others. Similarly, in UK law the person using the car is generally liable for its actions.

Don’t expect it to stop. Credit: Shutterstock

But following an accident, legal liability can still depend on whether a collision is due to the negligence of the human driver or a defect in the car. And sometimes, it could be due to both. For example, it may be reasonable to expect a driver to take due care and look out for potential hazards before engaging a self-parking function.

Driverless car technologies come with a warning that they are not insulated from software or design faults. But manufacturers can still be held liable for negligence if there is evidence that an accident was caused by a product defect. Legal precedents for corporate negligence have existed in the UK since 1932, when a woman successfully sued the makers of a bottle of ginger beer containing a dead snail after she drank from it and fell ill.

We have come a very long way since the 1930s. Legislation such as the Consumer Protection Act 1987 now provides a remedy for people who buy defective products. In the case of driverless vehicles, this can extend not just to the car manufacturer but to the company that programmes the autonomous software, too. Consumers don't need to prove the company was negligent, just that the product was defective and caused harm.

However, while proving this for components such as windscreen wipers or locks isn't too hard, it is more complicated to show software components are defective and, more importantly, that this has led to injury or harm. Establishing liability can also be difficult if there is evidence the driver has interfered with the software or overridden a driver assistance functionality. This is particularly problematic where advanced technologies enable driving to effectively be shared between the car and the driver. Product manufacturers also have specific defences, such as the limits of scientific knowledge preventing them from discovering the defect.

Duty of care

When it comes to the driver's responsibility, current law requires drivers to take the same amount of care no matter how technologically advanced the car is or their level of familiarity with that technology. Drivers are expected to demonstrate reasonable levels of competence and if they fail to monitor the car or create a foreseeable risk of damage or harm they are in breach of their duty of care. This implies that without a change in the law, self-driving cars won't allow us to take our eyes off the roads or take a nap at the wheel.

The current law means that if a self-driving car crashes then responsibility lies with the person that was negligent, whether that's the driver for not taking due care or the manufacturer for producing a faulty product. It makes sense for the driver to still be held responsible when you consider that autonomous software has to follow a set of rational rules and still isn't as good as humans at dealing with the unexpected. In the case of the Google crash, the car assumed that the bus driver was rational and would give way. A human would (or should) know that this won't always be the case.
