Tesla driver killed in crash while using car's 'Autopilot'

June 30, 2016 by Joan Lowy and Tom Krisher
In this Monday, April 25, 2016, file photo, a man sits behind the steering wheel of a Tesla Model S electric car on display at the Beijing International Automotive Exhibition in Beijing. Federal officials say the driver of a Tesla S sports car using the vehicle's "autopilot" automated driving system has been killed in a collision with a truck, the first U.S. self-driving car fatality. The National Highway Traffic Safety Administration said preliminary reports indicate the crash occurred when a tractor-trailer made a left turn in front of the Tesla at a highway intersection. NHTSA said the Tesla driver died due to injuries sustained in the crash, which took place on May 7 in Williston, Fla. (AP Photo/Mark Schiefelbein, File)

The first U.S. fatality using self-driving technology took place in May when the driver of a Tesla S sports car operating the vehicle's "Autopilot" automated driving system died after a collision with a truck in Florida, federal officials said Thursday.

The government is investigating the design and performance of Tesla's system.

Preliminary reports indicate the crash occurred when a tractor-trailer rig made a left turn in front of the Tesla at an intersection of a divided highway where there was no traffic light, the National Highway Traffic Safety Administration said. The Tesla driver died due to injuries sustained in the crash, which took place May 7 in Williston, Florida, the agency said. The city is southwest of Gainesville.

Tesla said on its website that neither the driver nor the Autopilot noticed the white side of the trailer, which was perpendicular to the Model S, against the brightly lit sky, and neither applied the brakes.

"The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer," the company said. The windshield of the Model S collided with the bottom of the trailer.

By the time firefighters arrived, the wreckage of the Tesla—with its roof sheared off completely—was hundreds of feet from the crash site where it had come to rest in a nearby yard, assistant chief Danny Wallace of the Williston Fire Department told The Associated Press. The driver was pronounced dead, "Signal Seven" in the local firefighters' jargon, and they respectfully covered the wreckage and waited for crash investigators to arrive.

The company said this was the first known death in over 130 million miles of Autopilot operation. It said the NHTSA investigation is a preliminary inquiry to determine whether the system worked as expected.

Tesla says that before Autopilot can be used, drivers have to acknowledge that the system is an "assist feature" that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to "maintain control and responsibility for your vehicle" while using the system, and they have to be prepared to take over at any time, the statement said.

Autopilot makes frequent checks to make sure the driver's hands are on the wheel, gives visual and audible alerts if hands aren't detected, and gradually slows the car until a driver responds, the statement said.
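Tesla's statement does not give exact timings, but the escalation it describes (visual alert, then audible alert, then a gradual slow-down) amounts to a simple threshold ladder. A minimal Python sketch; all thresholds are invented for illustration, since Tesla has not published the real values:

```python
# Hypothetical escalation ladder -- Tesla has not published its real thresholds.
VISUAL_ALERT_AFTER = 10.0    # seconds without detected hands on the wheel
AUDIBLE_ALERT_AFTER = 20.0
SLOWDOWN_AFTER = 30.0

def escalation(hands_detected: bool, seconds_hands_off: float) -> str:
    """Map hands-off time to the action described in Tesla's statement."""
    if hands_detected:
        return "normal"                  # driver engaged: no action, timer resets
    if seconds_hands_off >= SLOWDOWN_AFTER:
        return "gradually_slow_car"      # last resort until the driver responds
    if seconds_hands_off >= AUDIBLE_ALERT_AFTER:
        return "audible_alert"
    if seconds_hands_off >= VISUAL_ALERT_AFTER:
        return "visual_alert"
    return "normal"
```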

Tesla conceded that the Autopilot feature is not perfect, but said in the statement that it's getting better all the time. "When used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety," the company said.

The Tesla driver was identified as Joshua D. Brown, 40, of Canton, Ohio. He was a former Navy SEAL who owned a technology company, according to an obituary posted online by the Murrysville Star in Pennsylvania.

Tesla's founder, Elon Musk, expressed "our condolences for the tragic loss" in a tweet late Thursday.

NHTSA's Office of Defects Investigation is handling the inquiry. The opening of the preliminary evaluation shouldn't be construed as a finding that the government believes the Model S is defective, NHTSA said in a statement.

The Tesla death comes as NHTSA is taking steps to ease the way onto the nation's roads for self-driving cars, an anticipated sea change in driving in which Tesla has been on the leading edge. Self-driving cars have been expected to be a boon to safety because they would eliminate human error, which is responsible for about 94 percent of crashes.

NHTSA Administrator Mark Rosekind is expected to release guidance to states next month defining the federal role in regulating the vehicles versus the state role, and suggesting what laws and regulations states might want to adopt. Federal officials and automakers say they want to avoid a patchwork of state and local laws that could hinder adoption of the technology.

Most automakers are investing heavily in the technology, which is expected to become more widely available over the next five years. Like the Model S, the first generation of self-driving cars is expected to be able to travel only on highways and major well-marked roadways with a driver ready to take over. But fully self-driving vehicles are forecast to become available in the next 10 to 20 years.

Musk has been bullish about Autopilot, even as Tesla warns owners the feature is not for all conditions and is not sophisticated enough for the driver to check out.

This spring, Musk said the feature reduced the probability of having an accident by 50 percent, without detailing his calculations. In January, he said that Autopilot is "probably better than a person right now."

One of Tesla's advantages over competitors is that its thousands of cars feed real-world performance information back to the company, which can then fine-tune the software that runs Autopilot.

Other companies have invested heavily in developing prototypes of fully self-driving cars, in which a human would be expected to have minimal involvement—or none at all. Alphabet Inc.'s X lab, home of the Google self-driving car project, has reported the most crashes, though it also has the most testing on public roads. In only one case did the company acknowledge that its car was responsible: a retrofitted Lexus SUV hit a public bus in Northern California on Valentine's Day.


84 comments

Eikka
4 / 5 (8) Jun 30, 2016
Tesla said on its website that neither the driver nor the Autopilot noticed the white side of the trailer, which was perpendicular to the Model S, against the brightly lit sky, and neither applied the brakes.


The Tesla autopilot has a radar, a camera, and a long-range ultrasonic sensor, and it relies more on the radar and sonar because the image recognition algorithms are so primitive. That means the color or background of the trailer was irrelevant - there's no way the car didn't "see" the obstacle - the AI must have failed to interpret and react to the situation.

The driver was probably nodding off or looking somewhere else.
Eikka
2.7 / 5 (7) Jun 30, 2016
Elon Musk tweeted

Radar tunes out what looks like an overhead road sign to avoid false braking events


Sounds like a major software bug. Why would the radar believe a windshield-height object to be a road sign? Or did they not program the radar to detect things like toll booth barriers? Would the car just barrel through a railway crossing?

This is exactly what I've been saying about these things. The software is just too -dumb- to understand what it is actually looking at even if it can see everything, so it has to rely on hard coded dumb rules about what is what. Then, in order to avoid false positives and erratic over-reactions, they err on the side of false negatives and people get killed.

luke_w_bradley
3.7 / 5 (9) Jun 30, 2016
I did a little work with computer vision and radio location stuff, and I came to the same conclusion. It's easy to prove there are always ambiguous cases; CV can't be both limited and perfect. Our brains resolve these ambiguous cases by using lots of world data, acquired through previous experience, so true computer vision = AI. It will be better than humans eventually, but there are hazards in making it too simple.
javjav
2.3 / 5 (12) Jun 30, 2016
In the interest of safety, ALL vehicles and trailer modules should be equipped with transponders (regardless of whether they have autopilot capabilities or not). Just a short-range radio device transmitting GPS position, vehicle dimensions and speed vector would cost only a few dollars per vehicle, and it should be obligatory. Then collision detection systems could be installed in all vehicles (even a smartphone could do it), regardless of whether they have autopilot or not (to at least trigger collision warnings).
When doing mass production, those devices would be much cheaper than many superfluous elements found in modern cars. Even if these devices are not perfect, they could anticipate most of the typical collisions and save a lot of lives. But instead, manufacturers are overloading on-board computers with stupid functions that only provide unnecessary distractions.
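For scale, the payload the commenter describes really is tiny. A sketch of such a broadcast record in Python; the field names and values are purely illustrative, not taken from any real V2V standard (DSRC "basic safety messages", the closest real analogue, were still in pilot programs in 2016):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TransponderMessage:
    # Field names are illustrative only, not from any real V2V standard.
    vehicle_id: str
    lat: float           # GPS position, degrees
    lon: float
    length_m: float      # vehicle/trailer dimensions
    width_m: float
    height_m: float      # the dimension that mattered in this crash
    speed_mps: float     # speed vector: magnitude ...
    heading_deg: float   # ... and direction

msg = TransponderMessage("trailer-0001", 29.387, -82.447, 16.2, 2.6, 4.1, 8.0, 270.0)
payload = json.dumps(asdict(msg))
print(len(payload), "bytes")  # on the order of 150 bytes: trivial to broadcast
```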
antialias_physorg
4.3 / 5 (6) Jul 01, 2016
It's easy to prove there are always ambiguous cases; CV can't be both limited and perfect.

Yes. My main research work was in image segmentation. Computer algorithms handle segmentation differently from human vision. This gives them great advantages in some areas but is (still) below human abilities in others.

For example a computer vision algorithm doesn't get tired and doesn't have the very limited area of focus that humans have.
E.g. there's no reason why a CV system can't work on a 360° image and pay the same amount of 'attention' to every square millimeter. This gives computers a great advantage in being 'aware' of several dangerous factors at once where a human would focus on one and probably miss all the others.

On the other hand humans are better at judging context and performing a 'sanity check' (as in the case of the truck a human would notice that the back wheels shouldn't be disjoint from the front wheels - even if the sky color is identical to that of the truck)
antialias_physorg
4.3 / 5 (6) Jul 01, 2016
ALL vehicles and trailer modules should be equipped with transponders

Eventually they will when autopilot functionality becomes more ubiquitous for added safety. However, trusting an active transponder is dicey.
For that matter trusting ANY active system in a vehicle is dicey - that's why they are designed to still be controllable even if all the assist modules (ABS, ESP, power steering, assisted brakes ...) fail.

An autonomous car must be able to work without relying on others having transponders. Not all motor vehicles are maintained to spec over their lifetime.
Also there are other moving and static hazards on the road that you cannot equip with transponders or where such signals will surely be out of order (from broken down vehicles to crossing deer)
Eikka
3 / 5 (4) Jul 01, 2016
E.g. there's no reason why a CV system can't work on a 360° image and pay the same amount of 'attention' to every square millimeter.


Well, there's the slight problem of fitting a TOP500 supercomputer in a Tesla, assuming they do manage to figure out algorithms that adequately emulate "attention".

An autonomous car must be able to work without relying on others having transponders.


It also must be able to work without relying on outside information (i.e. programmers) telling it what is what, like the Google Car, which relies on an intricate map that is scanned in and cleaned up by programmers to tell the car what to see.

All the self-driving vehicles are operating on some sort of trick or shortcut that will prove inadequate in the long run.
marcush
1 / 5 (5) Jul 01, 2016
As much as I love Tesla, I have to say that the benefits of Google's approach become more obvious after this. Low speed and no driver vs high speed and passengers.
antialias_physorg
3.5 / 5 (8) Jul 01, 2016
assuming they do manage to figure out algorithms that adequately emulate "attention".

What are you babbling about? Attention simply means taking every pixel into account when doing image analysis - and not just that tiny area your eyes are focused on, as humans do.

It also must be able to work without relying on outside information (ie. programmers) telling it what is what

Now you're going completely crazy. Not even HUMANS do that.

Go back to posting on something you know anything about...which is...hmmmm...actually: what do you know anything about? From your postings over the past years I haven't yet managed to figure that one out.
Eikka
3.3 / 5 (4) Jul 01, 2016
On the other hand humans are better at judging context and performing a 'sanity check'


What I think happened here is that the forward facing radar identified the side of the truck, but the radar can't tell the height of the object - only the distance to it.

So the computer switched to camera vision to ask "is it a road sign?" and the camera took a look at the top of the trailer, saw a big flat area against the sky, went "yep, it's a road sign", and the AI determined it could limbo underneath it.

Problem is, the programmers didn't realize that something could be a "road sign" and also hang too low to safely pass under.

(as in the case of the truck a human would notice that the back wheels shouldn't be disjoint from the front wheels - even if the sky color is identical to that of the truck)


The trouble here is that the computer doesn't even try to reason like that. It just operates on a simple checklist of rules.
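To be clear, the above is the commenter's guess, not Tesla's published logic. But the posited failure mode, a checklist where the "overhead sign" label short-circuits any clearance check, can be made concrete. Everything below (thresholds, structure, the fix) is invented for illustration:

```python
# A caricature of the hypothesized rule checklist -- NOT Tesla's algorithm.
# The posited bug: the "overhead sign" label short-circuits the clearance check.

VEHICLE_HEIGHT_M = 1.45  # roughly a Model S plus sensor margin (invented)

def plan(radar_distance_m, camera_says_flat_against_sky):
    if camera_says_flat_against_sky:
        # Checklist rule: flat object against bright sky => overhead road sign.
        # Missing sanity check: is its underside actually above roof height?
        return "ignore"          # false negative: car drives on
    if radar_distance_m < 40:
        return "brake"
    return "continue"

# The fix the hypothesis implies: "sign" only counts if clearance is proven.
def plan_fixed(radar_distance_m, camera_says_flat_against_sky, underside_height_m):
    if camera_says_flat_against_sky and underside_height_m > VEHICLE_HEIGHT_M:
        return "ignore"          # genuinely an overhead sign
    if radar_distance_m < 40:
        return "brake"
    return "continue"
```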
javjav
1.3 / 5 (6) Jul 01, 2016
@antialias I agree, but that is why I was proposing a passive detection/alarm system, not an active one. It would work for all kinds of vehicles, whether they have autopilot or not. If they have it, it will provide one more input to the computer; if not, it will simply trigger an audible alarm. It works great on planes, in both manual and autopilot mode.
Eikka
3.4 / 5 (5) Jul 01, 2016
What are you babbling about? Attention simply means taking every pixel into account when doing image analysis


I'm saying "attention" is about separating and tracking useful information in a scene rather than just blindly including every pixel.

Concentrating and trying to apply analysis to absolutely everything in a scene, all the noise and extraneous information, makes the computer schizophrenic, because it can and will make false inferences by chance.

It has to leave stuff out, and how it does that is crucial.

- and not just that tiny area your eyes are focused on, as humans do.


That's not actually what we do.

Now you're going completely crazy. Not even HUMANS do that.


Yes we do. We are able to learn autonomously without being told what everything is.

Go back to posting on something you know anything about.


You always turn into a condescending asshole whenever someone questions your assumptions.
torbjorn_b_g_larsson
4.6 / 5 (9) Jul 01, 2016
Poor fellow.

But the responses are predictable. The fact is that both the responsible driver and the assist automation failed. Never mind the statistical fact that this is the first accident with the assist involved, compared to many more with just the driver. Still the automation will be blamed by some.

Luckily, administrations like those in the US and here in Sweden will continue to push for these safety features.

@Eikka: "there's the slight problem of fitting a TOP500 supercomputer in a Tesla".

Which is why the algorithms emulate human information concentration, "attention". I believe Tesla has already developed one that runs in real time on the distributed network of the car.

The need for an AI is pushed to slow stuff like deciding routing, and out to the cloud. No need for more horsepower in a car.

Besides, all the self-driving humans are operating on some sort of trick or shortcut that will prove inadequate in the long run. That is why people make more accidents!
Eikka
3 / 5 (2) Jul 01, 2016
Which is why the algorithms emulate human information concentration, "attention". I believe Tesla has already developed one that runs in real time on the distributed network of the car.


That was in reply to Anti-Alias' claim that you could add a 360 degree vision and have the computer process every single pixel.

For example, the human vision - what we can see by turning our heads and eyes - at the resolution we can discern is on the order of 1 gigapixel. Assuming 32 bits per pixel, you get 4 gigabytes of raw data per snapshot, and at 60 times a second that's 240 GB/s

Now you have to process that data in real time, because you're paying "attention" to all of it.

Modern and upcoming GPUs have just about that amount of memory bandwidth, but the problem is getting the data in, and doing anything with it because there's nothing left over. You need a cluster of computers to pull it off, and that requires kilowatts of power.
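The arithmetic in that estimate does check out, whatever one makes of the gigapixel premise; a quick Python check:

```python
pixels = 1e9           # the commenter's ~1 gigapixel estimate for human vision
bits_per_pixel = 32
fps = 60

bytes_per_frame = pixels * bits_per_pixel / 8
print(bytes_per_frame / 1e9)        # 4.0 GB per snapshot, as stated
print(bytes_per_frame * fps / 1e9)  # 240.0 GB/s of raw data
```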
SamB
3.7 / 5 (6) Jul 01, 2016


This is exactly what I've been saying about these things. The software is just too -dumb- to understand what it is actually looking at even if it can see everything, so it has to rely on hard coded dumb rules about what is what. Then, in order to avoid false positives and erratic over-reactions, they err on the side of false negatives and people get killed.



Eikka, you should definitely give Mr. Musk a call. They obviously did not know such a brilliant engineer like yourself existed til now.
Because of your superior insight into these complex problems, I for one will personally recommend you to Mr. Musk's engineering team!
krundoloss
5 / 5 (1) Jul 01, 2016
I remember driving shortly after Hurricane Floyd in 1996, and I saw that the road ahead looked odd, so I slowed down, and as I got closer, I noticed that the road was washed out completely. I had to turn around. How would a self-driving car handle that? Something that should not be, but there it is anyway. It would seem that, to do this right, we would NEED AI of some type to make it work. Something that can "conceptualize" and not just run algorithms. Until the car is aware of the PHYSICS involved, and can detect and recognize objects and predict the physics related to them while traveling down the road, it is not a reliable self-driving mechanism.
antialias_physorg
3.7 / 5 (6) Jul 01, 2016
Doing live image analysis isn't that CPU intensive (actually it's not CPU intensive at all - you mostly do that stuff on the GPU these days). There are no 'supercomputers' needed.
Heck, you can see autonomous RC cars racing each other at incredible speeds with nothing more than an Arduino board for support of their computer vision.

But surely if you know of a way to tell a computer to *comprehend* what it is looking at you should start ironing that tuxedo and booking your flight to Sweden.

Sheesh. The nerve of some know-nothings is really incredible.
greenonions
4.3 / 5 (6) Jul 01, 2016
SamB
They obviously did not know such a brilliant engineer like yourself existed til now.
Your comment seems very on target to me. Tesla, Google, all the major car companies, etc. have teams of engineers working 24/7 on this stuff. They obviously understand the liability involved. There are probably communities of engineers who are sharing information and pushing our knowledge forward, one step at a time. Then commenters on the internet - sitting at their terminals at home - declare "they are all idiots - just let me do it - I will show them how it is done!" What an odd world.
kochevnik
1.8 / 5 (5) Jul 01, 2016
Yes we do. We are able to learn autonomously without being told what everything is.
Not really. Human knowledge only advanced once writing was invented. Predication largely requires social feedback across timespace. Without predication one has only adjectives and adverbs and performs at a preschool level
antialias_physorg
4.2 / 5 (5) Jul 01, 2016
Assuming 32 bits per pixel, you get 4 gigabytes of raw data per snapshot, and at 60 times a second that's 240 GB/s

That's not how it works. You check for motion and edges and that's what you process (also there is absolutely no need for full retina resolution because things on the road that are relevant aren't that small)

A modern graphics card (or even a quite old one) is easily able to handle this. You haven't been keeping up with the vision challenges (e.g. quality control) that are currently in place in factories, at what enormous speeds they work, and with how little computing power.

The thing that failed in this case is that the background (sky) was the same color as the foreground (truck) so no edge was detected. If you detect no edge you can calculate no motion vector field.

We are able to learn autonomously without being told what everything is.

No. If no one tells you what a truck is you don't know what a truck is.
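The "no edge, no motion vector" failure mode is easy to reproduce with standard tools. A minimal sketch using NumPy and OpenCV on a synthetic frame (gray levels and Canny thresholds are arbitrary choices, not anything from Tesla's stack): a trailer side rendered near the sky's brightness produces no edges at all, while a darker one produces a clear outline.

```python
import numpy as np
import cv2  # OpenCV: pip install opencv-python

def edge_pixels(truck_gray):
    """Count Canny edge pixels for a trailer side of a given gray level."""
    frame = np.full((240, 320), 200, dtype=np.uint8)  # uniformly bright sky
    frame[60:140, 100:260] = truck_gray               # broadside trailer
    edges = cv2.Canny(frame, 50, 150)
    return int(np.count_nonzero(edges))

print(edge_pixels(204))  # near sky tone: 0 -- no edges, so no motion field
print(edge_pixels(80))   # dark trailer: hundreds of edge pixels, clear outline
```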
antigoracle
2.3 / 5 (6) Jul 01, 2016
Tesla, Google, all the major car companies, etc. etc. have teams of engineers working 24/7 on this stuff.

So too is Microsoft, yet they gave us the infamous BSOD. In the case of Tesla and Google, however, that "crash" literally means death. I find Musk's tweet rather telling of the dangerous path they have taken, in that they let "false braking events" override "life saving events", especially with what appears to be no warning.
greenonions
4.3 / 5 (6) Jul 01, 2016
So too is Microsoft, yet they gave us the infamous BSOD.
Well let's see your operating system - and see how it stacks up against Windows. If their path is as 'dangerous' as you are so afraid of - they will have to face the consequences of causing accidents - and the subsequent lawsuits that will follow. My money would definitely be on their autonomous driving system - rather than yours or Eikka's. When autonomous cars are available - I will have no problem in driving in one. I have no problem in flying in a plane that is being run by computers. You hide under the blankets if you want.
antigoracle
2 / 5 (4) Jul 01, 2016
My money would definitely be on their autonomous driving system...

Of course you would, being the consummate retard. Now why don't you put your money where your ignorant mouth is and get into a Tesla and look for a trailer truck turning in front of you and let that car do you a favour.
adam_russell_9615
5 / 5 (2) Jul 01, 2016
I've seen the streetview image of that intersection and I think it's possible that the accident was unavoidable. When someone pulls an unsafe left turn in front of you it is extremely hard to avoid.
TheGhostofOtto1923
5 / 5 (9) Jul 01, 2016
The Tesla autopilot has a radar, a camera, and a long-range ultrasonic sensor, and it relies more on the radar and sonar because the image recognition algorithms are so primitive
Except that

""It's probably better than a person right now" at driving, Musk said during a conference call with reporters [6 mos ago]"
https://www.washi...drivers/
That means the color or background of the trailer were irrelevant - there's no way the car didn't "see" the obstacle - the AI must have failed to interpret and react to the situation
-And so we can expect the tech to improve as a result of careful analysis, while dumbass humans will continue to run into tractors for ever.
The driver was probably nodding off or looking somewhere else
The driver was apparently watching a Harry Potter movie.
TheGhostofOtto1923
5 / 5 (9) Jul 01, 2016
What I think happened here is that the forward facing radar identified the side of the truck, but the radar can't tell the height of the object - only the distance to it.

So the computer switched to camera vision to ask "is it a road sign?" and the camera took a look at the top of the trailer and saw a big flat area against the sky and went "yep, it's a roadsign" and the AI determined it can limbo underneath it.

Problem is, the programmers didn't realize that something could be a "road sign" and also hang too low to safely pass under
Really, aren't you guessing about all of this, Eikka?
krundoloss
5 / 5 (3) Jul 01, 2016
Tesla said on its website that neither the driver nor the Autopilot noticed the white side of the trailer, which was perpendicular to the Model S, against the brightly lit sky, and neither applied the brakes.


If the situation could fool a human, more than likely it would fool the self-driving car as well. I don't doubt self-driving as being viable; however, I do not look forward to the crapstorm that is going to happen every time a collision occurs when self-driving cars are involved. Some accidents are unavoidable, and it's not fair to blame the car for an unavoidable collision.
Jarrod1937
2 / 5 (8) Jul 01, 2016

For example, the human vision - what we can see by turning our heads and eyes - at the resolution we can discern is on the order of 1 gigapixel. Assuming 32 bits per pixel, you get 4 gigabytes of raw data per snapshot, and at 60 times a second that's 240 GB/s


First, I'm assuming you're talking about regular RGB standards and not some odd bit scheme. It's 8 bits per channel; 32 means there is an alpha channel, and there is no need for that, so it's really 24 bits. Secondly, unless color is important to your CV processing (and depending on your approach, it is not), you can get away with 8 bits only, offering 256 levels of gray gradation.
And let's try to reverse engineer your calculations:

4GB per frame = 4096MB = 4,194,304KB = 4,294,967,296 Bytes = 34,359,738,368 bits

Let's divide this by your assumed 32 bits per pixel, that is 1,073,741,824 pixels, which is equal to 1.074 x 10^9, or 1 gigapixel(s), or 1073.741 megapixels!
greenonions
5 / 5 (4) Jul 01, 2016
goracle
Of course you would, being the consummate retard
Well - that just shows your childish/vacuous/insulting inability to add substance to comments - just rude - pointless garbage. You keep promising that I am on ignore - your ability to keep a promise - is about as good as your ability to contribute substance.
antigoracle
2 / 5 (4) Jul 01, 2016
It would be interesting to know if these cars have a blackbox system that can aid in more accurately analyzing the events leading to the accident and how the AI responded.
Jarrod1937
2.3 / 5 (9) Jul 01, 2016


For example, the human vision - what we can see by turning our heads and eyes - at the resolution we can discern is on the order of 1 gigapixel. Assuming 32 bits per pixel, you get 4 gigabytes of raw data per snapshot, and at 60 times a second that's 240 GB/s

Continued...
The highest estimation of human visual acuity to pixel representation (a slightly inaccurate comparison) is 500 megapixels (the average is around 100). So, divide your figure in half there. Further, if we assume we're only interested in black and white, divide your number by 4.
This would now yield only 0.5 GB per frame, or 30 GB/s of data to parse and process. Even this is flawed, though: as mentioned by someone else, you can calculate motion vectors, do grid processing, etc. These types of processes can easily reduce the workload.
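Redoing the arithmetic with the revised assumptions from this comment (500 megapixels, 8-bit grayscale):

```python
pixels = 500e6         # high-end estimate of human visual acuity, per the comment
bits_per_pixel = 8     # grayscale only, instead of 32-bit RGBA
fps = 60

bytes_per_frame = pixels * bits_per_pixel / 8
print(bytes_per_frame / 1e9)        # 0.5 GB per frame, as stated
print(bytes_per_frame * fps / 1e9)  # 30.0 GB/s before motion-vector shortcuts
```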
Jarrod1937
2 / 5 (8) Jul 01, 2016
It would be interesting to know if these cars have a blackbox system that can aid in more accurately analyzing the events leading to the accident and how the AI responded.

They're constantly parsing and recording data, which is then uploaded for Tesla to analyze, so yes, there is a "blackbox" of sorts.
Jarrod1937
1.9 / 5 (9) Jul 01, 2016
Tesla, Google, all the major car companies, etc. etc. have teams of engineers working 24/7 on this stuff.

So too is Microsoft, yet they gave us the infamous BSOD.


The BSOD happens for a few different reasons:
- They provide an OS for other companies to write software for. At this point, though, they sandbox memory and other items, so the OS doesn't crash from misbehaving programs.
- They provide an OS for other companies to write drivers for. Because of the nature of drivers, they can't be sandboxed in the same way. The most common cause of a BSOD is drivers.
- Errors in hardware are happening all the time, but are corrected via Hamming codes and other ECC. However, most consumer PCs, due to cost, don't include ECC RAM, which helps protect against this.

In other words, it is the state of the technology, not the company.
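For readers unfamiliar with the ECC mentioned above: a Hamming(7,4) code protects 4 data bits with 3 parity bits and can locate and repair any single flipped bit. A compact sketch of the textbook construction (not any vendor's implementation):

```python
def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Standard layout, positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Locate and fix a single-bit error; return the repaired codeword."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity over positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3       # syndrome = 1-based error position
    if pos:
        c[pos - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
word[5] ^= 1                          # simulate a single bit flip in hardware
assert hamming74_correct(word) == hamming74_encode([1, 0, 1, 1])
```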
antigoracle
1.8 / 5 (5) Jul 01, 2016
Well - that just shows your childish/vacuous/insulting inability to add substance to comments

Being the consummate retard it's beyond your capacity to realize that the substance you are adding is the shite overflowing from your head.
As for childish/vacuous/insulting, you should have someone with a brain explain, what is trolling the forum, blindly down voting everyone you consider an adversary, out of pure ignorance and spite.
Jarrod1937
2.6 / 5 (10) Jul 01, 2016
Well - that just shows your childish/vacuous/insulting inability to add substance to comments

Being the consummate retard it's beyond your capacity to realize that the substance you are adding is the shite overflowing from your head.
As for childish/vacuous/insulting, you should have someone with a brain explain, what is trolling the forum, blindly down voting everyone you consider an adversary, out of pure ignorance and spite.


I would like to add that since I created an account, I am excited that I can now comment on your often misunderstandings of science. Nothing personal.
greenonions
5 / 5 (5) Jul 01, 2016
goracle
It would be interesting to know if these cars have a blackbox system that can aid in more accurately analyzing the events leading to the accident and how the AI responded.
Gosh - you really don't read much in the tech/science world do you? Tesla cars collect a great deal of data - and store it for use by the company. Several owners have been caught trying to lie about problems on the cars - and the data has shown the real story. Here is one example - http://electrek.c...t-fault/ The cars have many sensors - and even record information like - how many times the hood has been opened.
antigoracle
2 / 5 (4) Jul 01, 2016
I would like to add that since I created an account, I am excited that I can now comment on your often misunderstandings of science. Nothing personal.

Congratulations!
Nothing personal, but your first misunderstanding is that I misunderstand science, but I do, hopefully, expect you to improve.
Jarrod1937
2 / 5 (8) Jul 01, 2016
I would like to add that since I created an account, I am excited that I can now comment on your often misunderstandings of science. Nothing personal.

Congratulations!
Nothing personal, but your first misunderstanding is that I misunderstand science, but I do, hopefully, expect you to improve.

Fair enough, you may see my comments trailing yours soon ;-)
greenonions
5 / 5 (5) Jul 01, 2016
Jarrod

I would like to add that since I created an account, I am excited that I can now comment on your often misunderstandings of science. Nothing personal.
You were not clear who that was directed at. If at me - no offense taken. Always happy to be corrected if I misunderstand something. Could you give examples? I usually try to comment on things I have checked into pretty well - stay away from cosmology, quantum physics etc. Thanks.
Jarrod1937
2.8 / 5 (11) Jul 01, 2016
@greenonions, definitely targeted toward antigoracle, not you. However, the same applies toward me: always open if I misunderstand something as well.
tblakely1357
2 / 5 (4) Jul 01, 2016
"In the interest of safety, ALL vehicles and trailer modules should be equipped with transponders (independently if they have autopilot capabilities or not). Just a short range radio device transmitting GPS position, vehicle dimensions and speed vector could cost very few dollars per vehicle and it should be obligatory."

Could be some serious privacy issues with that set up. I'm sure various companies and the government itself would never be interested in your driving habits, right? Can you imagine the revenue stream from tickets issued based on your GPS data? Lol, have fun fighting those in court.
antigoracle
2.3 / 5 (3) Jul 01, 2016
It would be interesting to know if these cars have a blackbox system that can aid in more accurately analyzing the events leading to the accident and how the AI responded.

They're constantly parsing and recording data and then uploaded for Tesla to analyze, so yes there is a "blackbox" of sorts.

Yes, I was aware that they record and analyze data. I specifically stated "blackbox" meaning a system that would survive catastrophic accidents, but I should have been explicit.
TheGhostofOtto1923
5 / 5 (7) Jul 01, 2016
It would be interesting to know if these cars have a blackbox system that can aid in more accurately analyzing the events leading to the accident and how the AI responded
They can and they will, but more importantly their activities can be transmitted and analyzed in real time. And any critical improvements can be uploaded to them as well as to all relevant vehicles.

So for instance if a software flaw becomes apparent in one vehicle under certain conditions, all of them can immediately be corrected.

In addition these vehicles will be talking to each other and conveying info about unique conditions they are encountering. So for instance if one has trouble identifying the white side of a truck it can let others in the vicinity know about the problem, or even view the truck from the perspective of adjacent vehicles, traffic cams, and drones.

It could even ping the truck itself because all commercial vehicles will eventually be self-driving.
antigoracle
1.7 / 5 (6) Jul 01, 2016
They can and they will, but more importantly their activities can be transmitted and analyzed in real time. And any critical improvements can be uploaded to them as well as to all relevant vehicles.

So for instance if a software flaw becomes apparent in one vehicle under certain conditions, all of them can immediately be corrected.

That's all well and good, but still too late when they overlook the simple fundamental rules. As they did in this case i.e. recognize that there is an obstacle in your path and do not drive into it.
kochevnik
3 / 5 (4) Jul 01, 2016
It could even ping the truck itself because all commercial vehicles will eventually be self-driving.
But GPS would allow the Tesla to know the truck exists in the first place. That alone would be sufficient to avoid this accident.
mondoblu
2.3 / 5 (3) Jul 01, 2016
Autonomous cars are a risk, because it is virtually impossible to predict in advance all the possibilities that can occur in the real world.

Ok, even humans cannot manage all situations, but in case of unexpected situations I prefer to be responsible for my own acts instead of being in the hands of any buggy AI.
Da Schneib
5 / 5 (1) Jul 01, 2016
...I prefer to be responsible for my own acts instead of being in the hands of any buggy AI.
Pilots do it all the time. Ever heard of "fly by wire?"

The problem with the Tesla is they didn't put lidar on it.
luke_w_bradley
3.4 / 5 (5) Jul 01, 2016
In the interest of safety, ALL vehicles and trailer modules should be equipped with transponders....

This really is the smartest comment here. The REAL advantage to this is a smart traffic light system. So many millions of gallons of gas and millions of frustrated hours would be saved if traffic lights made smart decisions, and advised people on speed. You'd never have to stop in moderate traffic.

But it's not the final answer for safety. I've seen trees, cows, boulders in the road.
adam_russell_9615
3 / 5 (2) Jul 02, 2016
I really don't see the draw in an almost-self-driving car. If you have to constantly be on alert to take back control, then it seems like that would be more work and stress than just normal driving.
TheGhostofOtto1923
5 / 5 (7) Jul 02, 2016
That's all well and good, but still too late when they overlook the simple fundamental rules. As they did in this case i.e. recognize that there is an obstacle in your path and do not drive into it
Its evolution. Constant improvement from lessons learned.

Meanwhile we are not evolving better human drivers.
The REAL advantage to this is a smart traffic light system. So many millions of gallons of gas and millions of frustrated hours would be saved if traffic lights made smart decisions, and advised people on speed. You'd never have to stop in moderate traffic
Vehicles themselves can do this. Imagine 1000s of traffic cams in constant motion, with multisenses in addition to cameras, offering different perspectives simultaneously of ever-changing traffic conditions.

Traffic cams and drones only offer greater depth in the 3rd dimension.

Smart cars will be their own constantly evolving network.
greenonions
5 / 5 (3) Jul 02, 2016
I really don't see the draw in an almost-self-driving car. If you have to constantly be on alert to take back control, then it seems like that would be more work and stress than just normal driving.
Some friends of ours were killed last August - when a truck driver had a heart attack, and the 18-wheeler crossed the median - and demolished their van. A very basic level of driver assist would have kept the truck in its lane. Safety is probably the biggest factor in this kind of technology.
Da Schneib
5 / 5 (2) Jul 02, 2016
I really don't see the draw in an almost-self-driving car. If you have to constantly be on alert to take back control, then it seems like that would be more work and stress than just normal driving.
I recently got a Q50 (I think that's what it's called) as a loaner and it had warnings for cars next to me, getting too close to the car in front of me, and, really nice, would watch the following distance and keep it in cruise control, as well as warning me about intrusions into my lane between me and the car I was following. I was still driving it, but it definitely reduced the cognitive load. I found I was able to look out further for upcoming problems ahead of me; this is IMHO the gauge of a good vehicle.

I still think that Tesla blew it by not putting lidar on their cars.
antigoracle
2.3 / 5 (3) Jul 02, 2016
Pilots do it all the time. Ever heard of "fly by wire?"

The problem with the Tesla is they didn't put lidar on it.

LOL.
"fly by wire" in no way or form means an AI is in control of the aircraft.
Ever heard of a "brain"? Grow one and try to use it.
The problem with the "You" is they didn't put common sense in it.
antialias_physorg
3 / 5 (2) Jul 02, 2016
I really don't see the draw in an almost-self-driving car.

It could be a feature that sells in countries with highway systems that have long stretches of uninterrupted road (no off/on ramps) and light traffic. Also with stay-in-your-lane traffic (as opposed to the pass-on-the-left kind)

But other than that I agree. I don't see much use for this kind of semi-autonomous assist.

As for the less intrusive assist tools ("other car too close"-warnings, etc.)...those aren't really useful because by the time they detect something truly dangerous it's too late.

There are helpers that I still think are useful (e.g. something that initiates an emergency braking maneuver)...but for normal road operations I'd rather drive myself.

If we're talking autonomous then let it be fully autonomous.
Da Schneib
5 / 5 (2) Jul 02, 2016
As for the less intrusive assist tools ("other car too close"-warnings, etc.)...those aren't really useful because by the time they detect something truly dangerous it's too late.
Actually I had to dial them down to the medium setting for them not to overreact. Remember that in cruise control the computer has control of both the gas and brakes. Also of particular note was the warning light next to the rear view mirror on each side, which confirmed my habitual glance over my shoulder on every occasion. I didn't ever count on them, though; I wasn't going to be driving that car long enough to ever let that habit slip even once.
antialias_physorg
3 / 5 (2) Jul 02, 2016
Exactly my point: Either they warn you all the time about stuff that isn't critical...or you set the sensitivity so low that it doesn't matter anymore.
Certainly they shouldn't train you out of the good habits you've developed from training and experience...and I see that as one of the major issues: trusting hardware/software without fully understanding its limitations.
Da Schneib
5 / 5 (4) Jul 02, 2016
Exactly my point: Either they warn you all the time about stuff that isn't critical...or you set the sensitivity so low that it doesn't matter anymore.
Medium worked fine. If it's just a matter of calibrating the response, that's no biggie; I'm a power user of systems, personally. It's kind of what I do for a living. I'm a systems engineer.

Certainly they shouldn't train you out of the good habits you've developed from training and experience...and I see that as one of the major issues: trusting hardware/software without fully understanding its limitations.
My approach to machines is to integrate what they do with what I do, not leave it to them to take over. I'd have to watch a machine for a long time before I'd let one take over. So it seems like we agree on that. When this stuff has been around for a decade or so in wide use, I might consider letting a machine drive. Until then I'm in favor of keeping an eye on it.
Zenmaster
2 / 5 (4) Jul 02, 2016
Isn't the current system advertised as being SAE Level 2, limited, with a stated requirement of full driver attention at all times? And isn't the behavior of the car in this case entirely compliant with these conditions?
xponen
2.3 / 5 (3) Jul 02, 2016
The victim of the crash, Joshua Brown, mentioned a bug a few months ago in the YouTube comment section where he uploaded a video praising his Tesla S for evading a side collision with a small truck: the AI was unable to detect a static car.

Another issue mentioned is the fact that the AI doesn't behave properly when cresting a hill where the rest of the road is not visible.

The AI obviously could fail for a simple reason, just like the one @Eikka proposed, and the level of safety is determined by how fast the AI developers can fix it whenever it is discovered/reported, which is snail-paced slow...

That said, it is important to be aware of the danger of using a driving AI, but still, any sort of automation (such as an AI) reduces driving workload, which significantly improves your cognitive function, which translates to higher safety. But remember, you are still the driver.
xponen
2.8 / 5 (5) Jul 02, 2016
The number of downvotes for @Eikka's comments suggests the public isn't mentally prepared to use an AI. The rightness or wrongness of the AI suddenly becomes politicised and triggers emotion.
jimbo92107
1 / 5 (3) Jul 03, 2016
What I think happened here is that the forward facing radar identified the side of the truck, but the radar can't tell the height of the object - only the distance to it.

The trouble here is that the computer doesn't even try to reason like that. It just operates on a simple checklist of rules.


Needs a better physics engine. Objects ahead must be classified as obstacle/no obstacle first in sequence. Shape and distance are first priority in the path of the vehicle. Street sign or truck should not matter. Insufficient room must result in immediate braking. Could be a basic software design flaw.
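The priority ordering proposed here, obstacle detection and clearance before any classification, is straightforward to express. A sketch with invented numbers, not any vendor's actual logic:

```python
# Sketch of the commenter's proposed priority order -- not any vendor's code.
ROOF_HEIGHT_M = 1.5      # vehicle height plus a safety margin (invented)

def react(obstacle_in_path, underside_height_m, distance_m, speed_mps):
    if not obstacle_in_path:
        return "continue"
    if underside_height_m is not None and underside_height_m > ROOF_HEIGHT_M:
        return "continue"                    # genuine overhead clearance
    # Shape/label (sign vs. truck) is deliberately never consulted:
    # insufficient room means brake, regardless of classification.
    stopping_distance = speed_mps**2 / (2 * 7.0)   # ~7 m/s^2 hard braking
    if distance_m <= stopping_distance * 1.5:      # safety margin factor
        return "emergency_brake"
    return "brake_gently"
```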
thisisminesothere
1 / 5 (4) Jul 03, 2016
It's naive to think that calling something "Autopilot" would not translate into people paying less attention to the road. Driver assist tools are just that: tools to ASSIST. If a company makes the claim that the car can drive itself (i.e., calling it AUTOPILOT), yet you still have to pay attention, it is only human nature to eventually pay attention elsewhere while the car is doing the driving.

I understand that this is a stepping stone to full automation, but I don't see it going well given that the driver still needs to be "in control" when the car is SUPPOSED to be "in control". Humans get distracted enough already KNOWING they have to pay 100% attention to the road; give that control to something else, and you will see a lot more phones flipped sideways while people sling their Angry Birds.
PhysicsMatter
1.8 / 5 (5) Jul 03, 2016
Part-time autonomous systems have one fatal flaw: they impair driver alertness through the illusion that the car drives itself, and hence focus and alertness decrease in a way similar to drug or alcohol impairment. Even autopilots on airplanes must be monitored by at least one highly trained pilot, and there the required reactions are not split-second but tens of seconds or minutes, and the uncertainties of traffic are smaller.

If Tesla's Autopilot requires the same level of alertness as driving, what's the point? The fatigue is the same, and no utility like texting or calling or working can be exploited. So is it just fancy cruise control with undeserved hype?

There are still severe problems with autonomous driving: psychological, physiological, legal and, believe it or not, still technical and economic. Some of them are listed here:

https://sostratus...ture-av/

If this technology is doomed, it will be lawyers who will do it in.
PhysicsMatter
1.7 / 5 (6) Jul 03, 2016
"In January, he [Musk] said that Autopilot is "probably better than a person right now.""

What is more appalling is the media-endorsed attitude of the intellectual superiority of technological oligarchs like Musk, and their sick, delusional dreams of grandeur and super-humanity divorced from the reality shared by all of us.

How can we seriously trust our lives to people like Musk, who think that we are cyborgs, or that we are living in a fake digital reality, or that [tech gurus] "should be able to launch people [to Mars] in 2025 for commercial purposes"? Or who, as CEO of a corporation, is trying to maximize Earth-based revenue before he moves on to another planet? Maybe Musk thinks the victim of this tragedy was "digitally deleted," not brutally decapitated.

This was not only a collision between a Tesla car "not" driven by a person and a truck, but most importantly a collision between Musk's technological delusions and the harsh, hard reality the rest of us are living in.
rrrander
1 / 5 (6) Jul 03, 2016
Tesla and that egomaniac Musk may have created a death machine now, but synchronization and control of cars, at least in cities, is inevitable. Traffic will increase 60% in major cities by 2025. Imagine if (just one example) cars at a stoplight all 'go' at the same time, not one, then two, then three... This will be needed. Not all control should be ceded to computers, despite the wishes of greedy insurance companies, who have failed to reduce their rates even though injury and death rates in cars are 1/3 of what they were 30 years ago. They will push government to adopt these controls.
rrrander
1 / 5 (5) Jul 03, 2016
Just like to add one more thing. A while back, I was riding in the outside lane of a highway, the guardrail lane, and a large truck was next to me. Inexplicably, the truck decided to move to the outside lane and didn't see me. I honked, no effect. I was in a 2004 Mustang Mach-1, and the only way I was able to escape being crushed between the guardrail and the truck was full-power acceleration. I was traveling over 100 mph when I got out in front of the truck. Would Tesla's program have allowed such a move? The car is capable of it, but would it have done it? If not, it would have been sandwiched.
KBK
2.3 / 5 (3) Jul 03, 2016
...I prefer to be responsible for my own acts instead of being in the hands of any buggy AI.
Pilots do it all the time. Ever heard of "fly by wire?"

The problem with the Tesla is they didn't put lidar on it.


Lidar, IIRC, is Velodyne (same company), and also military/pro, so it has good profit margins.

This puts it in the catbird seat with regard to pricing, as it can stand the brunt of being out of some people's reach and prohibitively expensive to others. Military contracts and profit margins allow the technology holders to force the market to adopt at high prices.

Not saying this is true, but it is an area to investigate regarding why Tesla does not use lidar.

If true, when the given relevant patents expire or when lower cost alternatives in component build come around (or some combination thereof), you might see it adopted by Tesla.
KBK
2.3 / 5 (3) Jul 03, 2016
Tesla and that egomaniac Musk may have created a death machine now, but synchronization and control of cars, at least in cities, is inevitable. Traffic will increase 60% in major cities by 2025. Imagine if (just one example) cars at a stoplight all 'go' at the same time, not one, then two, then three... This will be needed. Not all control should be ceded to computers, despite the wishes of greedy insurance companies, who have failed to reduce their rates even though injury and death rates in cars are 1/3 of what they were 30 years ago. They will push government to adopt these controls.


The highway is not a freedom frontier. It is a licensed and controlled system from day one. Driving is a privilege in the west, not a right.

I like ass kicking cars as well, don't get me wrong. In the hand of any person, any car is a dangerous weapon.

I have no real problem ceding control to a well designed computer system. I'll actually arrive notably sooner (in cities).

kochevnik
2.6 / 5 (5) Jul 03, 2016
The highway is not a freedom frontier. It is a licensed and controlled system from day one. Driving is a privilege in the west, not a right.
Freedom of travel is not a privilege. It is a fundamental human right. Driving in Blacks Law Dictionary refers to transport, which is carriage for commercial gain. Travel is not commercial and state has no authority to limit travel but to the degree of your ignorance and their brainwashing

You conflate Vatican maritime commercial law with common law
TheGhostofOtto1923
4 / 5 (8) Jul 03, 2016
Another issue mentioned is the fact that the AI don't behave properly when passing over a hill where the rest of the road not visible
Humans have the same problems on roads they're not familiar with. But AI cars have the advantage when they begin talking to each other and sharing knowledge of road conditions. A car that is familiar with that road can educate other vehicles that aren't, either directly or through a constantly improving database.

And vehicles will be reporting maintenance issues in realtime.
Freedom of travel is not a privilege. It is a fundamental human right. Driving in Blacks Law Dictionary refers to transport, which is carriage for commercial gain
So walk already. Call Uber. Take the bus.

Thinking that the govt owes everybody everything is a residual illusion in ex-communist countries.
dogbert
2.6 / 5 (5) Jul 03, 2016
It will be interesting to see the assignment of liability in this accident. Will Tesla be held accountable for wrongful death of the operator?
retrosurf
2.3 / 5 (3) Jul 03, 2016
... Inexplicably, the truck decided to move to the outside lane and didn't see me. I honked, no effect. I was in a 2004 Mustang Mach-1, and the only way I was able to escape being crushed between the guardrail and the truck was full-power acceleration. I was traveling over 100 mph when I got out in front...


An autopilot could have helped in so many ways, first of all by keeping you out of that situation by using what I call the "avoid the giants" rule, and then the "lateral safety" rule, followed by the "don't linger in blind spots" rule. Finally, ego-less software would know that the braking performance of a 2004 Mustang Mach-1 is twice as good as its acceleration (60-0 in 120 feet, 0-60 in 246 feet), that braking has less latency, and that escape to the rear is usually best in those situations.

(performance.ford.com/enthusiasts/collector-vehicles/mustang/mach-1/2004.html)
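Under a constant-acceleration assumption, the quoted figures do support the "braking is twice as good" claim; a quick check:

```python
MPH_TO_MPS = 0.44704
FT_TO_M = 0.3048

v = 60 * MPH_TO_MPS            # 60 mph in m/s
brake_d = 120 * FT_TO_M        # quoted 60-0 mph distance
accel_d = 246 * FT_TO_M        # quoted 0-60 mph distance

# v^2 = 2*a*d  =>  a = v^2 / (2*d), assuming constant acceleration
print(v**2 / (2 * brake_d))    # ~9.8 m/s^2 braking (about 1 g)
print(v**2 / (2 * accel_d))    # ~4.8 m/s^2 accelerating: roughly half
```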
Zenmaster
2.3 / 5 (3) Jul 03, 2016
@dogbert : "It will be interesting to see the assignment of liability in this accident. Will Tesla be held accountable for wrongful death of the operator?"

AFAIK, Tesla would only be held liable if it could be proven that the driver was somehow prevented from controlling the car. For example, if the car overrode the braking and/or steering controls in such a manner as to contribute to the accident.

As far as liability for the car not evading the collision? Of course not, because driving is a 100% human responsibility and will be for some time to come. It's currently illegal in every state to permit even a level of Autopilot that would allow non-aware driving. AFAIK the driver simply didn't brake, but could have done so if paying attention (as explicitly required by the current system).
retrosurf
3.4 / 5 (5) Jul 03, 2016
This was not only collision between Tesla car "not" driven by a person and a truck but most importantly collision between Musk technological delusions and harsh and hard reality, the rest of us are living in.


I think Musk/Tesla has just collided with reality. The hubris of calling it Autopilot, which is an inflated marketing term for what it *wants* to be someday, will cost him actual money.

It's really NotAPilot.
kochevnik
1 / 5 (4) Jul 03, 2016
Thinking that the govt owes everybody everything is a residual illusion in ex-communist countries.
I find it much more prevalent in the West. You may be thinking of West Ukraine or Poland or other catholicized hives. Specifically, Westerners think operating a car is driving, when driving is a COMMERCIAL activity. Traveling is not driving. Operating a car is not driving, unless money changes hands. So deep is the newspeak

The only thing government owes the people is to vanish. Even sewers and roads are better managed with collaborative kickstarter projects than the old authoritarian model
TheGhostofOtto1923
4.6 / 5 (9) Jul 03, 2016
@dogbert : "It will be interesting to see the assignment of liability in this accident. Will Tesla be held accountable for wrongful death of the operator?"
Uh don't you think they already had that all figured out, and were required to present it to the relevant authorities, before that car and driver were allowed on the road?

Why not do a little research and get back to us-
Waaalt
1 / 5 (3) Jul 03, 2016
The very term "Autopilot" will be costing Tesla here. False advertising meets wrongful death. It's only any "autopilot" until the car will kill you if you don't suddenly wake up to make the right choice.

The industry as a whole is well documented to be aware of this issue. They legally need a human to be sitting behind the wheel and be liable, yet expect them to do nothing the vast majority of the time. They are selling an autopilot yet legally can't allow one.

Furthermore, a human is simply not capable of a machine-like intervention when expected to do nothing most of the time. To expect this was a fiction created by the industry for legal convenience.

Everyone already knows it should be the other way around: the human should drive most of the time, with the machine able to intervene to prevent mistakes. A machine never gets bored from always doing nothing.

In other words, the current 'terms of service' will in no way make it through the first few death-related lawsuits.
dogbert
2.3 / 5 (6) Jul 03, 2016
TheGhostofOtto1923
Uh don't you think they already had that all figured out, and were required to present it to the relevant authorities, before that car and driver were allowed on the road?

Why not do a little research and get back to us-


No, I don't. I will watch to see what happens post accident with liability issues.
TheGhostofOtto1923
4.5 / 5 (8) Jul 04, 2016
Dog, they won't even let you and your minivan on the highway without registration and proof of insurance. What makes you think liability issues weren't thoroughly vetted before these cars were allowed on the road?

Their high profile in the public eye makes the liability issue especially sensitive.

Instead of guessing, I'm only suggesting you do a little work and find out what the situation actually is.
TheGhostofOtto1923
4.6 / 5 (9) Jul 04, 2016
In other words, the current 'terms of service' will in no way make it through the first few death-related lawsuits
AGAIN, what makes you geniuses think that this hadn't occurred to Musk and Google et al, as well as all the greedy politicians looking for new sources of revenue, long before they decided to fund the venture?

You really think you're more forward-thinking than the 1000s of lawyers, scientists, engineers, investment analysts, CEOS etc who have been working on this for decades??

Of course you do.
TheGhostofOtto1923
4 / 5 (8) Jul 04, 2016
I find it much more prevalent in the West. You may be thinking of West Ukraine or Poland or other catholicized hives
Uh oh. You guys aren't gonna invade them next are you?
kochevnik
3.4 / 5 (5) Jul 04, 2016
I find it much more prevalent in the West. You may be thinking of West Ukraine or Poland or other catholicized hives
Uh oh. You guys aren't gonna invade them next are you?

What has Russia invaded? USA has invaded a dozen countries in as many years
TheGhostofOtto1923
5 / 5 (6) Jul 05, 2016
I find it much more prevalent in the West. You may be thinking of West Ukraine or Poland or other catholicized hives
Uh oh. You guys aren't gonna invade them next are you?

What has Russia invaded? USA has invaded a dozen countries in as many years

https://www.washi...-Crimea/

-I guess they still don't let you read the papers. So sad.
adam_russell_9615
1 / 5 (1) Jul 06, 2016
http://www.nytime...car.html

NY Times wrote:
But Google decided to play down the vigilant-human approach after an experiment in 2013, when the company let some of its employees sit behind the wheel of the self-driving cars on their daily commutes. Engineers using onboard video cameras to remotely monitor the results were alarmed by what they observed — a range of distracted-driving behavior that included falling asleep. "We saw stuff that made us a little nervous," Christopher Urmson, a former Carnegie Mellon University roboticist who directs the car project at Google, said at the time. The experiment convinced the engineers that it might not be possible to have a human driver quickly snap back to "situational awareness," the reflexive response required for a person to handle a split-second crisis.
