Consumer Reports says Tesla should drop Autopilot name

In this Sept. 15, 2015, file photo, a Tesla Model S is on display on the first press day of the Frankfurt Auto Show IAA in Frankfurt, Germany. (AP Photo/Michael Probst, File)

Consumer Reports said Thursday that Tesla Motors is misleading car owners by calling its semi-autonomous driving system "Autopilot," potentially giving them too much trust in their car's ability to drive itself.

The influential magazine said Tesla should drop the Autopilot name and disconnect the automatic steering system until it's updated to make sure a driver's hands stay on the wheel at all times. The system currently warns drivers only after their hands have been off the wheel for a few minutes.

In an e-mail, a Tesla spokeswoman said the company has no plans to change the name, and that data it collects show drivers who use Autopilot are safer than those who don't.

With its statement, Consumer Reports joined a debate over autonomous driving technology that escalated after authorities revealed that Joshua Brown, 40, of Canton, Ohio, died in a May crash in Florida with Autopilot engaged in his 2015 Model S. The system didn't detect a tractor-trailer that had turned in front of the car in bright sunshine, and Brown also failed to react.

The National Highway Traffic Safety Administration is investigating the wreck and the functioning of the Autopilot system. After the Brown crash, critics accused Tesla of giving drivers access to a system that wasn't ready, while supporters contended the company was improving automotive safety.

Tesla's Autopilot system uses cameras, radar and computers to detect objects and automatically brake if the car is about to hit something. It also can steer the car to keep it centered in its lane. The company says that before Autopilot can be used, drivers must acknowledge that it's an "assist feature" that requires both hands on the wheel at all times. Drivers also must be prepared to take over at any time, Tesla has said.
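Tesla has not published Autopilot's internals, so any code can only be illustrative. The sketch below, with entirely hypothetical names and thresholds, shows the general shape of the driver-assist loop the paragraph describes: fuse camera and radar detections, brake for obstacles, and steer toward the lane center.

```python
# Purely illustrative sketch of a driver-assist decision loop.
# Nothing here is Tesla code; all names and thresholds are invented.
from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float          # range to the detected object
    closing_speed_mps: float   # positive means the gap is shrinking

def should_brake(camera: list, radar: list, min_gap_m: float = 30.0) -> bool:
    """Brake only when both sensor channels report a close, closing object."""
    def threat(detections: list) -> bool:
        return any(d.distance_m < min_gap_m and d.closing_speed_mps > 0
                   for d in detections)
    return threat(camera) and threat(radar)

def steer_correction(lane_offset_m: float, gain: float = 0.5) -> float:
    """Proportional steering back toward the lane center (simplified)."""
    return -gain * lane_offset_m
```

One consequence of a fusion rule like this, if a system used one, is that a misread by either sensor channel can suppress braking entirely.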

Yet Laura MacCleery, Consumer Reports' vice president of consumer policy, said naming the system Autopilot gives drivers a false sense of security. Autopilot, she wrote, can't actually drive the car, but it lets consumers keep their hands off the steering wheel for minutes at a time.

"We're deeply concerned that consumers are being sold a pile of promises about unproven technology," she said in a statement.

Earlier this week Tesla disclosed that a Model X SUV crashed early Saturday in Montana while the driver was using the autosteer feature on a two-lane road, a use the company does not recommend. Tesla, which gets information from its cars over the internet, said the car warned the driver at least once to place his hands on the wheel before it crashed.

MacCleery called on the Palo Alto, California, company to disable automatic steering until it updates the computer program to ensure a driver's hands are on the wheel.

Consumer Reports also said Tesla should issue clearer guidance on how Autopilot is used and what its limitations are. Tesla CEO Elon Musk has said he'll provide more thorough guidance in a blog posting, and the spokeswoman said that was coming.

Tesla released Autopilot last fall and says the system is still in a "public beta," or testing phase. Critics have complained that Tesla is using drivers as "guinea pigs"—a sentiment echoed by Consumer Reports.

Tesla said Autopilot underwent millions of miles of internal testing and is updated constantly. "We will continue to develop, validate, and release those enhancements as the technology grows," the spokeswoman said.

Not all magazines that test cars are critical of Tesla and Autopilot. Road & Track said on its website this week that Autopilot is a technological achievement that should make America proud. Autopilot is at least as safe as human drivers on the highway, in a car that doesn't use gasoline and performs like a sports car, the magazine said.

Consumer Reports has expressed concerns about Autopilot before. During a November podcast, Jake Fisher, the magazine's auto testing editor, said the system provided an added layer of confidence. But he was surprised that he could take his hands off the wheel for 2 ½ minutes at a time and browse the web on the car's dashboard screen while driving.

In February, Consumer Reports urged Tesla to change a feature within Autopilot known as Summon, which lets owners start cars and move them out of a garage or parking spot automatically using a key fob or a smartphone. The magazine found that users couldn't stop the cars right away if they pressed the wrong button on the key fob. It also found that the cars kept moving when the smartphone app was closed. Tesla responded with a software update that limited the Summon feature to smartphones and required the user to keep a finger on the phone screen when the car was being summoned.
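The fix Tesla shipped is a classic dead-man switch: the car moves only while the user actively holds the control. Below is a minimal sketch of that pattern; the function names are hypothetical and do not reflect Tesla's actual app.

```python
import time

def run_summon(finger_is_down, step_car, poll_s: float = 0.1) -> None:
    """Move the car only while the user's finger stays on the screen.

    finger_is_down: callable returning True while the screen is pressed.
    step_car: callable that advances the car a small, bounded distance.
    """
    while finger_is_down():
        step_car()
        time.sleep(poll_s)
    # The moment the finger lifts, the loop exits and the car stops.
    # A real system would also stop on a heartbeat timeout, so a closed
    # or crashed app can't leave the car moving.
```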



© 2016 The Associated Press. All rights reserved.


User comments

Jul 15, 2016
Drivers also must be prepared to take over at any time, Tesla has said.


Which they can't, according to studies. It takes too long for the switchover if the driver is in any way distracted, such as fiddling with their phone.

Autopilot is at least as safe as human drivers on the highway, in a car that doesn't use gasoline and performs like a sports car, the magazine said.


Those sorts of statements are highly irresponsible, considering how stupid the autopilot actually is.

Again, the failure in this case wasn't that the car failed to -see- the obstacle, because it had a multitude of sensors that all reported it, but a failure of the computer to -understand- the obstacle for what it was. The AI just has zero situational awareness, zero memory, and it's simply reacting to rudimentary pre-programmed cues, which led to the failure.

It's patently unsafe because it's practically impossible to make an all-covering set of rules that would take care of every detail.

Jul 15, 2016
Programming a car to drive safely is actually the same problem as computer vision in general. Given a picture of an umbrella, the computer "sees" just ones and zeroes, and it has to do some pattern-matching trick to perceive it.

So the traditional way to do it is simply to load the computer up with two billion pictures of all possible umbrellas, opened and closed, partially closed, partially broken, upside down and filled with water... red, green, blue umbrellas, multicolored umbrellas etc. until the computer can find at least one example that matches with high confidence to say "that's an umbrella".

Immediately you see what the problem is when the computer has to identify more than umbrellas. Likewise, it's all but impossible to program a car with such an exhaustive amount of data to identify its surroundings, whatever they might be.
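The brute-force approach described here is essentially nearest-neighbor matching against a huge labeled library. A toy Python version (using raw pixel distance, which no production system would) makes the coverage problem concrete: the classifier only knows what its library happens to contain.

```python
import numpy as np

def classify(image: np.ndarray, library: list, threshold: float = 10.0) -> str:
    """Label an image by its closest example in a labeled library.

    library: list of (example_image, label) pairs, same shape as image.
    Returns "unknown" unless the best match is confidently close.
    """
    best_label, best_dist = "unknown", float("inf")
    for example, label in library:
        dist = float(np.linalg.norm(image - example))  # raw pixel distance
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < threshold else "unknown"
```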

The AI needs to develop an actual understanding, but that's a hard problem computer scientists have been wrestling with since the '70s.

Jul 15, 2016
So the problem is this: you've got algorithms that are maybe 90% accurate at identifying particular pre-programmed things like "a billboard, a car, a road sign" from noisy sensor data, and you make them drive a car.

99.999% of the time it's going to drive just fine because most of the time nothing unusual happens or the driver takes over before anything gets to happen, and the simple rules are perfectly sufficient: stay on the road between lane markers, keep to the speed limit and don't crash into (what look like) obstacles.

Fine. Then a tractor-trailer turns in front of you and the computer thinks it's a hanging road sign, attempts to limbo under the trailer and decapitates the driver.

So the company's programmers add a new rule to detect low-hanging obstacles, and the system works fine for a while, but then another driver gets killed for a different reason - and there are millions of possible exceptions and anomalies.

It's just a never-ending game of whack-a-mole.
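The whack-a-mole dynamic is easy to make concrete. The rules and thresholds below are invented for illustration and are not drawn from any real system:

```python
def react(height_m: float, distance_m: float) -> str:
    # Original rule set: anything mounted high is assumed to be an
    # overhead sign or bridge, so the car drives under it.
    if height_m > 1.0:
        return "ignore"
    if distance_m < 30.0:
        return "brake"
    return "cruise"

def react_patched(height_m: float, distance_m: float, width_m: float) -> str:
    # Patch after a crash: a high AND wide object might be a trailer.
    if height_m > 1.0 and width_m > 2.0:
        return "brake"
    return react(height_m, distance_m)

# ...and the next unanticipated object will need yet another rule.
```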

Jul 17, 2016
Very well said. This is part of a culture of aggressively insisting that the future is here now, when it really isn't. I fully expect fully autonomous cars will be a reality... sometime in the mid-2030s. Trying to make a car fully autonomous... no steering wheel, no controls... with weak AI is not credible. They are moving toward a fully autonomous car the way a monkey climbing a tree is moving toward the moon. It will take strong AI to drive a car, and it won't be Tesla or any other carmaker that develops strong AI.

I'm also concerned that this "the future is here now" culture completely ignores human psychology. They try to reprogram the person to fit their technology, for no other reason than the gosh-wowness of it all.


Jul 17, 2016
Case in point: Their operational definition of driver attention is having hands on the wheel. This is not valid. The human brain is a mechanism and works as such. Attention is produced in a mechanical way. Look up "salience." Attention goes away when there are no salient inputs. Eyes open, hands on the wheel - attention goes away. You can't reprogram a human away from this reliably. One human at certain times, alright. But not a large group consistently.

Critics are focusing on drivers being distracted or getting too comfortable and putting their attention elsewhere. But even if you have a mechanism that forces the driver to keep hands on the wheel, or even eyes open looking forward, attention is still going to go away without the stimulus that comes from physically and continuously driving the car.

Tesla and others are completely ignoring the science of cognitive psychology.

Jul 17, 2016
This hybrid model of unready technology and non-attentive drivers is wrongheaded. They should simply wait until the tech is fully ready and there is no hybrid system.

Jul 18, 2016
Your writer says in the article, "With its statement, Consumer Reports joined a debate over autonomous driving technology ..."

But it is NOT autonomous. That is Consumer Reports' point about the name Autopilot. People are misled by the word. In aviation, where we mostly get the concept from, flight crews are trained to know that it doesn't mean autonomous, but that the equipment **assists** the crew by taking care of some aspects of flying.

Once the writer uses autonomous without 'semi' in front of it, CR's point is confirmed.
