How will self-driving cars affect your insurance?


Mark Molthan admits he wasn't paying attention when his car crashed into a fence, leaving him with a bloody nose, according to a news report. The Texan had left control of his Tesla Model S to its autopilot system, which failed to turn at a curve and instead drove the car off the road. But Tesla, like other car manufacturers, stresses its self-driving technology is there just to assist drivers, who should remain ready to take over at any time.

One of the big questions about cars with self-driving technology is who's to blame when something goes wrong. The driver in this case has reportedly admitted he was at fault. But that hasn't stopped his insurance firm from requesting a joint inspection of the written-off car, which raises the prospect that the firm may sue Tesla to pay for the damage.

Insurance firms will always try to prove they shouldn't have to pay for an accident. And software bugs in self-driving cars could create a new reason for manufacturers to have to shoulder the cost of crashes. Yet if drivers remain legally responsible for a car even as technology encourages them to take their eyes off the road, will manufacturers be able to avoid blame, leaving insurance companies to recoup their costs through higher premiums?

The British government is already hoping to address this issue with a new piece of legislation to be introduced in autumn 2017. In anticipation of this, it is currently consulting the public and experts about how driverless cars should be insured in the future.

One model that could be introduced would build on the current system of compulsory insurance. But as well as every driver needing insurance, manufacturers of any car with a form of self-driving technology would also have to take out a policy to cover any liability for accidents caused by that technology. The costs of this would likely be passed on to drivers through higher purchase prices.

Liability would then be determined by the circumstances of each individual accident. If the accident is caused entirely by the vehicle, the manufacturer's insurers will be liable. If the accident is caused by both vehicle malfunction and driver error, then the cost is likely to fall on both insurers.

As with the current system of compulsory insurance, there would be a battle between insurers over exactly who pays for any damage or injury caused. This would therefore not make much of a difference to the driver, except for an increase in premiums if they are found liable.

Other premiums

However, a different system could stop liability battles between insurers from clogging up the courts and ultimately cost drivers less. Instead of having to buy insurance for cars with self-driving technology, drivers would simply pay an extra fee on top of the cost of the car, or on the petrol or electricity they use to power the vehicle. This money would go into a central fund that would pay for any damage caused. The fund would be held by the government or (in the UK) the Motor Insurers' Bureau, which compensates victims of accidents caused by uninsured drivers and is funded by a similar levy on insurance premiums.

This would mean drivers pay less in the long run, because they wouldn't be paying for insurance company costs and profits, just for the damage caused by accidents. A similar system is already used in New Zealand for conventional vehicles.

Neither system would have much of an effect on how much you have to pay for insurance in the meantime. In fact, premiums will most likely fall, as self-driving technology actually appears to make the overall risk of an accident lower – something that will surely be welcomed by all. In the future, however, we may not need driver insurance at all. If cars become fully autonomous, with no need for humans to do any driving, then the manufacturers will probably become responsible for every journey.



This article was originally published on The Conversation. Read the original article.



User comments

Aug 23, 2016
But Tesla, like other car manufacturers, stresses its self-driving technology is there just to assist drivers, who should remain ready to take over at any time.


If a car is just plowing straight ahead in a bend, there's scant time to do that in the first place. You'd basically have to sit there with your knuckles white, just waiting for the car to make the error of trying to kill you. Might as well not use it and drive yourself.

Aug 23, 2016
waiting for the car to make the error of trying to kill you.

"The car killing you" or "the car trying to kill you" means that the car is conscious, and is actively, willfully trying to commit murder.
I'm curious if your choice of language is unintentional and ironic, or if you're trying to play up to whatever proso-/mechano-phobia you can elicit?

Aug 23, 2016
By the time autonomous cars are released, Americans will be required to have insurance on our purchased insurance to insure our other insurances.

Aug 27, 2016
Uh, I can see the airbags had deployed. But was he wearing a seat-belt ??

Sep 07, 2016
"The car killing you" or "the car trying to kill you" means that the car is conscious, and is actively, willfully trying to commit murder.


Murder is only if the intent is malicious. The car has no intent, malicious or otherwise, so it cannot murder.

It can however still kill you, or try to kill you. Consciousness is not necessary for behaviour. An unthinking non-conscious machine tries to behave as programmed or built - for example we say a thermostat is trying to maintain a set temperature without assuming the thermostat is conscious.

When a machine is made so its behaviour can lead to people dying, such as an autonomous car that is prone to taking a fast curve straight because its AI determines that is where it should go, it is apt to say that the car is trying to kill its passengers.

But the intent is outside of the machine - in the person who designs the machine. Whether they are malicious or simply incompetent or negligent is a different question.

Sep 07, 2016
As for the different question:

http://money.cnn....dex.html
For Musk specifically, the source says his driving force is "don't let concerns slow progress."


Eric Meadows, a former autopilot engineer at Tesla, recalls testing the feature on a Los Angeles highway in mid-2015, a few months before its release, when he was pulled over on suspicion of driving drunk. The car, he says, had struggled to handle the sharp turns while in autopilot mode.

Meadows knew he was "pushing the limits" of autopilot, but he assumed customers would push those limits too. That's why the incident worried him.

"I came in with this mentality that Elon had: I want to go from on-ramp to off-ramp and the driver doesn't have to do anything," says Meadows, who was fired from Tesla two months later for performance reasons. "The last two months I was scared someone was going to die."


And so they did.
