Regulators get input—sort of—on self-driving car rollout (Update 3)

January 28, 2016 by Justin Pritchard
California wrestles with making self-driving cars public
This May 13, 2015 file photo shows Google's new self-driving car during a demonstration at the Google campus in Mountain View, Calif. Regulators puzzling through how to give Californians safe access to self-driving cars of the future will hear from Google and other companies that want the state to open the road to the technology. The California Department of Motor Vehicles will hold a hearing Thursday, Jan. 28, 2016, at California State University, Sacramento. (AP Photo/Tony Avelar, File)

California regulators deciding how to permit the future rollout of self-driving cars were told Thursday by consumer advocates that their cautious approach was right on, and by companies developing the technology that the current course will delay deployment of vehicles that promise huge safety benefits.

The state's Department of Motor Vehicles heard the comments at a workshop as it wrestles with how to keep the public safe as the imperfect technology matures—but not regulate so heavily that the agency stifles development of the vehicles.

The agency sought suggestions for possible changes to a draft of precedent-setting regulations it released last month. Those regulations will govern how Californians can get the cars once companies move beyond their current testing of prototypes.

Because California has been a hotbed for the development and regulation of the technology, what happens in the state has ripple effects nationally.

What the DMV had hoped would be a technical discussion Thursday about legal language instead drifted toward broad statements about the technology's merits.

Most vocal were advocates for the blind—a group that has not been central to the regulatory debate. Several argued the technology could change their lives, and the agency should not get in the way.

"Please don't leave my family out in the waiting room," said Jessie Lorenz, who is blind and relies on public transit to get her 4-year-old daughter to preschool. Lorenz would prefer to use a self-driving car for that—or even a "spontaneous road trip."

She said she has taken a ride in a self-driving car that Google Inc. has been developing, "and it was awesome."

DMV attorney Brian Soublet said the agency appreciates the potential benefits for disabled people, but its focus has to be on the safety of the entire motoring public.

Google wants California to clear the road for the technology—and has expressed disappointment in the DMV's draft regulations, which say self-driving cars must have a steering wheel in case onboard computers or sensors fail. A licensed driver would need to sit in the driver's seat, ready to seize control.

"We need to be careful about the assumption that having a person behind the wheel" will make driving safer, Chris Urmson, the leader of Google's self-driving car project, told the agency.

Google has concluded that human error is the biggest danger in driving, and the company wants to remove the steering wheel and pedals from cars of the future, giving people minimal ability to intervene.

Urmson said that if the draft regulations are not changed, Google's car would not be available in California. While Google has been testing on roads here for several years—with trained safety drivers behind the wheel, just in case—it might deploy cars without steering wheels in Texas, where regulators hailed the technology when Google began testing prototypes there last summer.

California's DMV is still months away from finalizing any regulations.

Under the draft framework, an independent certifier would need to verify a manufacturer's assurances that its cars are safe. Google and traditional automakers want manufacturer self-certification, the standard for other cars.

Once a manufacturer receives that verification, it would get a three-year permit. Consumers could lease the cars, but manufacturers would be required to keep tabs on how safely the vehicles are driving and report that performance to the state. Drivers would need manufacturer-provided training and a special certification on their licenses.

If a car breaks the law, the driver would be responsible.

John Simpson of the nonprofit Consumer Watchdog commended the DMV on Thursday "for putting safety first. I think you got it exactly right" in the draft, he said.

Earlier this month, federal officials announced an aggressive plan to get the technology into the public's hands sooner rather than later.

In written guidance, the National Highway Traffic Safety Administration projected that "fully automated vehicles are nearing the point at which widespread deployment is feasible."

It remains unclear just how the bullish federal approach will affect California's regulatory process.

Neither Google nor traditional automakers have said they think the cars are ready yet, but at least a dozen companies are developing the technology and nearly as many have permission to test in California. Google has suggested a model could be ready for limited use sooner than the public realizes.

Several times during Thursday's workshop, DMV officials urged commenters to offer specific changes to the draft regulations, sometimes in reaction to comments that the regulations fell short.

Speaker Curt Augustine of the Alliance of Automobile Manufacturers said his organization did not agree with the DMV's third-party certification requirement.

DMV attorney Soublet asked for proposed fixes, invoking a saying his father told him: It only takes one wrecking ball to demolish a house, but a whole crew to build one.

The agency has been working on regulations for testing and now deployment for nearly three years—and regulations on deployment were supposed to be final a year ago.


16 comments


dogbert
1 / 5 (5) Jan 28, 2016
There have been scattered collisions, nearly all involving Google cars. Those collisions have been minor, and Google says each has been caused by other drivers.


It is understandable that Google wants to present its autonomous cars as better drivers than humans, but the claim that all the accidents were caused by other drivers is simply not true. Google cars have many more rear-end collisions than human drivers do. In many areas, a rear-end collision is automatically charged to the driver who strikes the car ahead, but when you have many more collisions of any type than human drivers do, you cannot rationally say that all the accidents are caused by humans.

dogbert
1 / 5 (2) Jan 28, 2016
If a car breaks the law, the driver would be responsible.


Even discussing liability issues is difficult with autonomous cars. The above statement that the driver of the autonomous car would be responsible implies that the manufacturer would be responsible. In fact, the owner and/or operator of the autonomous vehicle would be responsible for liability.
winthrom
1 / 5 (3) Jan 28, 2016
What is not said here is that autonomous vehicles are best used in selected road venues. City driving is much more complex than highway driving.

A journey is currently composed of three parts: (1) Start (the passenger boards the parked vehicle, the vehicle enters the local road system, the vehicle maneuvers onto the travel system), (2) Middle (the vehicle traverses the travel system, such as highways), (3) End (the vehicle maneuvers through the local road system to the drop-off point, the passenger exits the parked vehicle). The Start and End parts are currently better handled by human drivers.

The Middle is a convenience opportunity for autonomous vehicles. The autonomous vehicle has another opportunity at Start and End of journey in that it can un-park and later park itself. In most other situations, autonomy is too inflexible to see every possible outcome, and will thwart human drivers in other vehicles. Unless autonomous vehicles get special privilege laws, full autonomy is unlikely.
antigoracle
1 / 5 (2) Jan 28, 2016
Input from one big player, the insurance industry, is missing and I await their take on this.
indio007
3 / 5 (1) Jan 28, 2016
If a car breaks the law, the driver would be responsible.


Even discussing liability issues is difficult with autonomous cars. The above statement that the driver of the autonomous car would be responsible implies that the manufacturer would be responsible. In fact, the owner and/or operator of the autonomous vehicle would be responsible for liability.

This is a fight over money, and it's going to be one for a long time.
State vs Auto Manufacturers vs the People vs Insurance companies vs Ambulance chasers

This is a scramble over who is going to maintain a grip on the legacy revenue streams that will disappear.
Eikka
1.8 / 5 (5) Jan 29, 2016
From the typical driver's point of view, an autonomous car isn't safer than driving yourself, because of the difference in concept between the median and the mean.

A company like Google can argue in court that the cars reduce accidents because they drive better and have a lower rate of technical malfunctions than the worst minority of human drivers in ill-maintained vehicles, who cause the majority of the accidents, and that therefore if all humans were replaced by robots there would be fewer accidents. For example, the robot will never be guilty of a DUI.

That however means that the cars will actually perform considerably worse than most people because the bar is set too low, and therefore for most people they would actually increase the number of accidents and incidents.

This sort of thing is typical of new technologies in their first generation: they're grossly oversold and overhyped to the public by companies with a profit motive rather than the public good in mind.
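
A minimal numerical sketch of the median-versus-mean point above, using made-up accident rates rather than real crash statistics: when a small minority of bad drivers drags the average up, a system that merely beats the average driver can still trail the typical (median) driver.

# Hypothetical crashes per million miles for 100 drivers (illustrative numbers only):
# 90 "typical" drivers at 2.0 and 10 "bad" drivers at 20.0.
drivers = [2.0] * 90 + [20.0] * 10

mean_rate = sum(drivers) / len(drivers)           # (90*2.0 + 10*20.0) / 100 = 3.8
median_rate = sorted(drivers)[len(drivers) // 2]  # 2.0 for this distribution

robot_rate = 3.0  # hypothetical robot: better than the mean, worse than the median driver

print(f"mean driver rate:   {mean_rate}")
print(f"median driver rate: {median_rate}")
print(f"robot rate:         {robot_rate}")
# Replacing every driver with this robot would cut total crashes (3.0 < 3.8),
# yet the 90 typical drivers at 2.0 would each be worse off than before.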
Eikka
2.3 / 5 (6) Jan 29, 2016
There's just this great illusion that computers are superior to man because they're faster, when in reality they are faster at the expense of being simpler, cruder, dumber.

It's incredibly hard and requires tons of hardware to get a computer to e.g. reliably identify a bird in a photo, without false positives or negatives, and that's just a categorization problem. Trying to get the machine to understand what a bird means is completely out of reach for today's AI.

So what you actually get with autonomous cars and the like is the absolute minimum intelligence they can get away with - because making them any smarter costs money and time, and may even be practically impossible to implement.

The companies basically cheat and cut every possible corner just to get the car going - like Google pretending their cars drive themselves and make choices when actually they're simply following a virtual line drawn by humans.

This is not better. It's just incredibly fragile.
Eikka
1.8 / 5 (5) Jan 29, 2016
If people understood that the real level of intelligence in a self-driving car is akin to a worm following a trail of chemicals around a petri dish, they wouldn't want to step into one.

The nematode worm Caenorhabditis elegans is about 1 mm in length and has 959 somatic cells, of which 302 are neurons - it is one of the few organisms we've mapped completely - and it takes a supercomputer to replicate its behaviour.

http://www.artifi...openworm
the somatic nervous system contains 6,393 chemical synapses, 890 gap junctions, and 1,410 neuromuscular junctions.


That's a creature whose only purpose in life is to wiggle around and hope it bumps into something edible. It's far simpler than any robot would need to be to survive the human society and the traffic system as an environment.
greenonions
5 / 5 (4) Jan 29, 2016
Eikka
If people understood that the real level of intelligence in a self-driving car is akin to a worm following a trail of chemicals around a petri dish, they wouldn't want to step into one.


If that were true - I would not want to step into one. But of course it is not true. http://www.techno...les.html

So I will be very happy to step into a safer than human autonomous car.
viko_mx
1 / 5 (5) Jan 29, 2016
As a person with free will, I prefer to have personal control over my life and my decisions. Autonomous cars are dangerous in many respects. Their main drawback is that they seize people's sovereignty and control over situations on the road.
greenonions
5 / 5 (2) Jan 29, 2016
viko
Autonomous cars are dangerous in many respects
But safer than humans. You live in the past if you want - but others of us like progress - that is why we read physorg. Just expect to get sued if you kill someone because you are texting and driving - as many people do today.
rrrander
2.7 / 5 (3) Jan 30, 2016
Lib progressives don't want you in cars; they want you on sardine-can buses and trains.
viko_mx
1 / 5 (3) Jan 30, 2016
"But safer than humans. "

Not exactly.

"You live in the past if you want - but others of us like progress "

I am living for the future like every other person. What you see as progress I see as regression and a loss of sovereignty.

Freedom within the moral and physical laws of the Creator, who maintains order in the universe, is the crown of life.
viko_mx
1 / 5 (3) Jan 30, 2016
Life would not be possible without love and truth, but only free persons can love.
Lord_jag
not rated yet Jan 30, 2016
I wish most electric cars weren't designed to be as disgusting aesthetically as possible. Why can't Google make a car that looks nice?
greenonions
5 / 5 (1) Jan 30, 2016
viko
What you see as progress I see as a regression and loss of sovereignty.
As is your right. It does not make you right. If the facts show that autonomous cars are safer than human-driven cars - then we should adopt them. No one is touching your sovereignty - we are simply trying to build a better and safer world. Google is in a fight with California regulators right now - and I support the regulators - who want to go slowly - and make sure the systems are implemented with caution. http://www.usatod...7447672/
I have a friend who is blind. She has to walk a mile - across front lawns (no sidewalks here in Oklahoma City) - to catch the bus to work. Try doing that blindfolded - after an ice storm. Does she have a right to progress, or do you have a right to deny her an amazing step towards greater independence? Autonomy is not just about you.
