Should your driverless car kill you to save a child's life?

Aug 01, 2014 by Jason Millar, The Conversation
Big decision for a small car. Credit: laughingsquid, CC BY-NC-ND

Robots have already taken over the world. It may not seem so because it hasn't happened the way science fiction author Isaac Asimov imagined it in his book I, Robot. City streets are not yet crowded with humanoid robots, but robots have been doing a lot of the mundane work behind closed doors that humans would rather avoid.

Their visibility is about to change swiftly, though. Driverless cars are projected to appear on roads and to make moving from one point to another less cumbersome. Even though they won't be controlled by human drivers, the software that will run them raises many ethical challenges.

For instance, should your robot car kill you to save the life of another in an unavoidable crash?

License to kill?

Consider this thought experiment: you are travelling along a single-lane mountain road in an autonomous car that is fast approaching a narrow tunnel. Just before entering the tunnel a child attempts to run across the road but trips in the centre of the lane, effectively blocking the entrance to the tunnel. The car has but two options: hit and kill the child, or swerve into the wall on either side of the tunnel, thus killing you.

Both outcomes will certainly result in harm, and from an ethical perspective there is no "correct" answer to this dilemma. The tunnel problem serves as a good thought experiment precisely because it is difficult to answer.

The tunnel problem also points to imminent design challenges that must be addressed, in that it raises the following question: how should we program autonomous cars to react in difficult ethical situations? However, a more interesting question is: who should decide how the car reacts in difficult ethical situations?

This second question asks us to turn our attention to the users, designers, and law makers surrounding autonomous cars, and ask who has the legitimate moral authority to make such decisions. We need to consider these questions together if our goal is to produce legitimate answers.

At first glance this second question – the who question – seems odd. Surely it is the designers' job to program the car to react this way or that? I am not so sure.

From a driver's perspective, the tunnel problem is much more than a complex design issue. It is effectively an end-of-life decision. The tunnel problem poses deeply moral questions that implicate the driver directly.

Allowing designers to pick the outcome of tunnel-like problems treats those dilemmas as if they must have a "right" answer that can be selected and applied in all similar situations. In reality they do not. Is it best for the car to always hit the child? Is it best for the car always to sacrifice the driver? If we strive for a one-size-fits-all solution, it can only be offered arbitrarily.

The better solution is to look for other examples of complex moral decision-making to get some traction on the who question.

Ask the ethicist

Healthcare professionals deal with end-of-life decisions frequently. According to medical ethics, it is generally left up to the individual for whom the question has direct moral implications to decide which outcome is preferable. When faced with a diagnosis of cancer, for example, it is up to the patient to decide whether or not to undergo chemotherapy. Doctors and nurses are trained to respect patients' autonomy, and to accommodate it within reason.

An appeal to personal autonomy is intuitive. Why would one agree to let someone else decide a deeply personal moral question, such as an end-of-life decision in a driving situation, that one feels capable of deciding on one's own?

From an ethical perspective, if we allow designers to choose how a car should react to a tunnel problem, we risk subjecting drivers to paternalism by design: cars will not respect drivers' autonomous preferences in those deeply personal moral situations.

Seen from this angle it becomes clear that there are certain deeply personal moral questions that will arise with autonomous cars that ought to be answered by drivers. A recent poll suggests that if designers assume moral authority they run the risk of making technology that is less ethical and, if not that, certainly less trustworthy.

As in healthcare, designers and engineers need to recognise the limits of their moral authority and find ways of accommodating user autonomy in difficult situations. Users must be allowed to make some tough decisions for themselves.

None of this simplifies the design of autonomous cars. But making technology work well requires that we move beyond technical considerations in design to make it both trustworthy and ethically sound. We should work toward enabling users to exercise their autonomy where appropriate when using technology. When robot cars must kill, there are good reasons why designers should not be the ones picking victims.


User comments: 91

kochevnik
1.8 / 5 (21) Aug 01, 2014
"The car has but two options: hit and kill the child, or swerve into the wall on either side of the tunnel, thus killing you."

The decision rests upon the ethnicity of the child. In USA the life of a child from the Gaza Strip is worthless, so the car can proceed unabated. However if the obstacle is a human chain consisting of an illegal Latina and her thirteen children, their lives are more sacred than any citizen. Google can do a quick search to settle these American imponderables
antialias_physorg
4.6 / 5 (11) Aug 01, 2014
how should we program autonomous cars to react in difficult ethical situations?

Forget about that. Unless we teach robots the way we teach children (which isn't programming) there's no way we'll get an ethical framework together to cover all cases.
People have tried with religions and laws and philosophy and whatnot for thousands of years to get something that works universally. We haven't managed to put something like that together even in the fuzzy human language.

To get something into the precise language of programming (math) is a couple of orders of magnitude harder, still.

The algorithm should be programmed to minimize the projected forces on all biological entities that are possibly involved in the crash. That's the best you can do (and even that sounds out of our league to me, given the time constraint from the moment the car realizes that an impact is unavoidable).
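The force-minimisation rule in the comment above could be sketched roughly as follows, assuming the planner can enumerate candidate maneuvers and attach crude peak-force estimates to each person involved. The maneuver names and numbers here are invented for illustration only, not values from any real crash-dynamics model:

```python
def least_force_maneuver(maneuvers):
    """Pick the maneuver whose worst-case force on any person is smallest.

    `maneuvers` maps a maneuver name to a list of predicted peak
    forces (in kilonewtons) on each person involved.
    """
    return min(maneuvers, key=lambda m: max(maneuvers[m]))

# Hypothetical options for the tunnel problem: [force on child, force on driver]
options = {
    "brake_straight": [40.0, 5.0],   # high force on the child, low on the occupant
    "swerve_to_wall": [0.0, 25.0],   # spares the child, high force on the occupant
    "brake_and_angle": [18.0, 12.0], # force shared more evenly
}

print(least_force_maneuver(options))  # -> brake_and_angle
```

A minimax criterion like this spreads harm rather than concentrating it on one person; summing the forces instead would encode a different, more utilitarian choice.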
Scottingham
4.3 / 5 (12) Aug 01, 2014
Totally with antialias on this. I think though that the car should prioritize the safety of the occupants over everything else considered. Unfortunate decisions may occur but occupants of the car must be able to 'trust' that it will consider them the highest priority.

A kid running out into the middle of a busy road, while unfortunate, is Darwin's way of cleaning up the trash. The car shouldn't careen into a telephone pole or oncoming traffic to avoid the child, though if possible evasive maneuvers/hard braking should still happen (of course).
rockwolf1000
4.2 / 5 (10) Aug 01, 2014
Obviously, nature never intended for children who not only run out in front of traffic, but fall down on flat ground, to reproduce.

What moral dilemma?
jspurny
4.7 / 5 (7) Aug 01, 2014
Like every "ethical dilemma", this one is just made up.

This "problem" vanishes when you start thinking in the real world instead of in some philosophical construct, and there is a very simple solution: "When visibility is low, and/or your view of the areas immediately surrounding the road is obstructed, and/or you see (or can expect) people/children/animals in the vicinity of the road - then SLOW THE F*CK DOWN! ARE YOU TRYING TO KILL SOMEONE?"

There should be no problem implementing an artificial "fear" into the car's controls, which would make it slow down in problematic situations. The car could "fear" narrow streets, "fear" people/children/animals too close to the road, "fear" freezing temperatures, "fear" heavy rain, etc, etc.

If the car designers would not see the real situation, which calls for slowdown, but instead some made up crazy dilemma about who to kill - then I really don't want their murderous crazy car.
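The "artificial fear" idea above could be sketched as a set of hazard-triggered speed caps, where the most cautious active cap wins. The hazard names and speed values below are illustrative assumptions, not figures from any real autonomous-driving stack:

```python
# Illustrative per-hazard speed caps in km/h (made-up values).
SPEED_CAPS_KMH = {
    "narrow_street": 30,
    "pedestrians_near_road": 20,
    "freezing_temperatures": 50,
    "heavy_rain": 60,
}

def allowed_speed(default_limit_kmh, active_hazards):
    """Return the most cautious cap among all hazards currently detected."""
    caps = [SPEED_CAPS_KMH[h] for h in active_hazards if h in SPEED_CAPS_KMH]
    return min([default_limit_kmh] + caps)

print(allowed_speed(80, ["heavy_rain", "pedestrians_near_road"]))  # -> 20
print(allowed_speed(80, []))                                        # -> 80
```

Taking the minimum over all active caps means hazards compose conservatively: adding a new "fear" can only slow the car down, never speed it up.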
nkalanaga
4.7 / 5 (6) Aug 01, 2014
jspurny: I've never heard it expressed that way, but yours is a very good solution, and wouldn't be hard to implement.

Railroads have a rule for such situations, called "restricted speed": Basically, it says "Proceed prepared to stop short of any obstacle on the tracks". The poorer the visibility, or the heavier the train, the slower it must go in such situations.
tadchem
5 / 5 (4) Aug 01, 2014
It has always amused me the way 'ethicists' generate so much unnecessary anxiety for themselves with hypothetical dilemmas.
OneGuy
4.5 / 5 (2) Aug 01, 2014
That's exactly what I wanted to say. The car has radar, and it should monitor the road situation and predict the outcome. If an object's trajectory is going to cross your car's trajectory at some point, your car should slow down in proportion to the distance to the projected impact point, while continuing to monitor the speed and direction of the objects around it. So neither scenario would even arise with self-driving cars. A computer is much faster at monitoring and reacting than a human.
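The distance-proportional slowdown suggested in this comment might look something like the toy function below; the 100 m "full-speed distance" and 5 m stopping margin are purely illustrative assumptions:

```python
def target_speed(distance_to_conflict_m, max_speed_kmh=100, stop_margin_m=5):
    """Scale speed with distance to the projected conflict point,
    reaching zero at `stop_margin_m` before that point.

    Assumes (arbitrarily) that full speed is safe from 100 m out.
    """
    usable = max(distance_to_conflict_m - stop_margin_m, 0)
    return min(max_speed_kmh, max_speed_kmh * usable / 100)

print(target_speed(55))  # -> 50.0 : halfway in, half speed
print(target_speed(5))   # -> 0.0  : at the stopping margin, fully stopped
```

A real controller would of course use braking-distance physics rather than a linear ramp, but the monotone shape (closer conflict, lower speed) is the point of the comment.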
rockwolf1000
5 / 5 (4) Aug 01, 2014

There should be no problem implementing an artificial "fear" into the car's controls, which would make it slow down in problematic situations. The car could "fear" narrow streets, "fear" people/children/animals too close to the road, "fear" freezing temperatures, "fear" heavy rain, etc, etc.

If the car designers would not see the real situation, which calls for slowdown, but instead some made up crazy dilemma about who to kill - then I really don't want their murderous crazy car.


Ya. We could also teach it to fear meteors, carjackers, sinkholes, floods, tornadoes, earthquakes, tsunamis, rogue footballs and wild animals. Couple this with the realization that stupid pedestrians could jump out in front at any moment, and we would have a car that just sits there and refuses to move anywhere because it's too scared.
cjn
5 / 5 (1) Aug 01, 2014
jspurny: I'm not sure "fear" is the right word - maybe more that the car should exercise caution during moments of extreme variability or consequence. Phrasing aside, I do agree that it would be a reasonable solution to the dilemma presented. Further, programmed "caution" might increase predictability for the occupants and for other people/objects in the environment, thereby decreasing the unknowns which hamper rapid decision-making.
nkalanaga
5 / 5 (1) Aug 01, 2014
The context would also be important. If the car was on a controlled-access highway with multiple lanes and open space on either side, it wouldn't need to "worry" about objects suddenly appearing from the sides. In that case its main concern would be other vehicles and road surface conditions. A high speed would be acceptable under normal conditions.

On the other hand, in a crowded urban area, it should expect pedestrians, animals, and inanimate objects to appear unexpectedly. Throw in potholes, loose manhole covers, possible loose objects on the road, etc, and slow speeds would be in order, as well as possibly increased monitoring of brake and restraint systems, to make sure they can respond quickly.

On the open road the usual advice for an animal, such as a deer, jumping in front of you is to hit it. Better a collision than to lose control of the car.
djf1965
5 / 5 (1) Aug 01, 2014
Let's look at it from another angle. A man runs out in front of the car with a gun. Does the car stop or run him over!
italba
not rated yet Aug 01, 2014
When an injury to a human cannot be avoided, there could be a minimum-damage strategy. If the car hits the child, he will surely be killed; if it crashes into the side, the driver, protected by seat belts and airbags, might survive. Obviously, if the choice is between hitting the child or crashing a fully loaded car into an oncoming truck, the child should be hit.
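The minimum-damage strategy described in this comment amounts to scoring each possible action by estimated harm to everyone affected and picking the lowest total. A minimal sketch, with entirely made-up probability estimates:

```python
def min_expected_harm(outcomes):
    """Pick the action with the lowest total estimated probability of death.

    `outcomes` maps an action name to {person: estimated P(death)}.
    """
    return min(outcomes, key=lambda a: sum(outcomes[a].values()))

# Placeholder estimates for the tunnel problem (illustrative only).
crash_options = {
    "hit_child": {"child": 0.95, "driver": 0.01},
    "swerve_into_wall": {"child": 0.0, "driver": 0.30},
}

print(min_expected_harm(crash_options))  # -> swerve_into_wall
```

Note that this encodes exactly the one-size-fits-all choice the article warns about: whoever sets the probability estimates and the aggregation rule is, in effect, picking the victim.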
djf1965
5 / 5 (2) Aug 01, 2014
Let's look at it from another angle. A man runs out in front of the car with a gun. Does the car stop or run him over!


Now what happens if he's wearing a police uniform - does that make a difference?
rockwolf1000
not rated yet Aug 02, 2014
When an injury to a human cannot be avoided, there could be a minimum-damage strategy. If the car hits the child, he will surely be killed; if it crashes into the side, the driver, protected by seat belts and airbags, might survive. Obviously, if the choice is between hitting the child or crashing a fully loaded car into an oncoming truck, the child should be hit.


Or it could be maximum stupidity strategy. Whoever is the biggest dumbass loses.
alfie_null
5 / 5 (3) Aug 02, 2014
I don't particularly like the paternalistic attitude of the author: the assumption that people developing these driverless vehicles need to be reminded of these issues, as if they were nothing but a bunch of engineers in constant need of moral guidance.

I'll allow that he need not be required to use, say, a google-ware driving app in the car he uses, if he doesn't like its ethics. Use an alternative driving app. No choice, you say? Then I invite him to contribute to a solution. Write code, not punditry.
Sigh
not rated yet Aug 02, 2014
I think though that the car should prioritize the safety of the occupants over everything else considered. Unfortunate decisions may occur but occupants of the car must be able to 'trust' that it will consider them the highest priority.

Why? The occupants choose to move a large mass at high speed, endangering others. Why shouldn't they take responsibility?

A kid running out into the middle of a busy road, while unfortunate, is Darwin's way of cleaning up the trash.

They're not trash, and if you want to rely on evolution to instil road sense, wait for a few hundred generations and expect billions killed on the roads.

The car shouldn't careen into a telephone pole or oncoming traffic to avoid the child, though if possible evasive maneuvers/hard braking should still happen (of course).

Considering your previous sentiment, there is no "of course" about it. Are you sure you want to slow down selection by irresponsible car owners?
Doug_Huffman
1 / 5 (1) Aug 02, 2014
I got dibs on the neologism AUTO-CAR to replace the turgid autonomous car and robot car.

The confusion of moral and ethical, leading to their conflation, is amusing.

Yes, slowing as a solution to hazard is effective. In the limit, don't get out of bed. Short of the limit, don't drive.
TheGhostofOtto1923
1 / 5 (1) Aug 02, 2014
They should have cited the trolley problem.

"The trolley problem is a thought experiment in ethics, first introduced by Philippa Foot in 1967,[1] but also extensively analysed by Judith Jarvis Thomson,[2][3] Peter Unger,[4] and Frances Kamm as recently as 1996.[5] Outside of the domain of traditional philosophical discussion, the trolley problem has been a significant feature in the fields of cognitive science and, more recently, of neuroethics. It has also been a topic on various TV shows dealing with human psychology."

-But vehicle AI would choose to rely on passenger protection systems (airbags, reinforcement). Colliding with the wall would provide valuable data on how these could be improved. Decisions would always be based on the value that can be derived from them.
USA the life of a child from the Gaza Strip
Or Russians who give separatists missiles to shoot down planes full of underage Dutch spies.
TheGhostofOtto1923
1 / 5 (1) Aug 02, 2014
Forget about that. Unless we teach robots the way we teach children (which isn't programming) there's no way we'll get an ethical framework together to cover all cases
?? I'm sorry are you saying that the way we teach children is the way we now cover all cases?
To get something into the precise language of programming (math) is a couple of orders of magnitude harder, still.
And this was the same mindset that people had about the possibility of cars driving themselves only a few years ago.

Machines will offer what humans have never been able to offer: consistency and dependability. Knowing what machines will do in every instance will enable the development of better ways of keeping people out of harm's way.

This is why we prefer them to be in charge rather than us, even if we don't necessarily like it.
http://youtu.be/7oHd3BN2OX0
Whydening Gyre
not rated yet Aug 02, 2014
Let's look at it from another angle. A man runs out in front of the car with a gun. Does the car stop or run him over!


Now what happens if he's wearing a police uniform - does that make a difference?

I guess that would depend on if it's your car he's pointing it at...
Grallen
not rated yet Aug 02, 2014
The car should make no decision that leads to killing. If someone forces it into a double bind, the car should decelerate as fast as safely possible and contact emergency response. Maximum safe deceleration if a crash is necessary (or whatever will cause the vehicle to lose the least mobility while still being as safe).

The only exceptions being that the humans inside the vehicle can override the behavior. But then the vehicle is not making the decision. It's just doing what the person said to.

It might be best to log any such override in every way possible, for homicide investigations.

A cop wouldn't ever jump in front of the car; he would ask dispatch to send the car an order to stop. Police operators would come up on a display in the car (while simultaneously having all the sensor data sent to them) and decide whether the car actually needs to comply with the roadside officer's request to stop. They would explain the situation to the occupants as the car is stopping.
SnowballSolarSystem _SSS_
not rated yet Aug 02, 2014
That's part of the deal, to get the insurance discount you get a value-to-society placed on your head which Skynet uses to determine relative worth in cases of unavoidable collisions.
antigoracle
1 / 5 (1) Aug 03, 2014
We need to develop more intelligent roads, that will detect this child as a potential hazard and relay this to the car, which then proceeds in a manner that avoids having to kill anyone.
italba
not rated yet Aug 03, 2014
@rockwolf1000:
Or it could be maximum stupidity strategy. Whoever is the biggest dumbass loses.
With your strategy you'd better not move at all.
antialias_physorg
not rated yet Aug 03, 2014
Let's look at it from another angle. A man runs out in front of the car with a gun. Does the car stop or run him over!

Now what happens if he's wearing a police uniform - does that make a difference?

Could be a 'bad cop' or an assassin disguised as a cop knowing full well the car will take you off the cliff and he walks away scot free..

Point being: unless the car can read minds and act with godlike omniscience as to the likely outcomes of all possible actions there's no way for it to make a 'right' decision in every case.

In the end it will be a decision based on statistics, not individual situations. Do autonomous cars reduce the number of people killed in accidents overall? Yes? Then they're a go. If not, they won't be used.
kochevnik
1 / 5 (3) Aug 03, 2014
Why not use social networking and likes to decide who dies? It's good enough for the State Department
chrisn566
not rated yet Aug 03, 2014
Easily answered. Children are our future. If it comes to it, my face meets brick.
Moebius
not rated yet Aug 03, 2014
Do you really think a driverless car will be let loose on our streets and allowed to choose where to drive when our roads have millions of places where even a human has a hard time knowing where to drive? The AI of a driverless car needs to be almost as good as a human to navigate roads as they currently are. I think every foot of road used by a driverless vehicle will have to be specially set up for it.
italba
5 / 5 (2) Aug 03, 2014
@Moebius: Driverless cars don't drink alcohol, don't take drugs, don't answer calls or text on the phone while driving, and don't pretend to be in a drag race. And, if you have ever driven a car, you should know that the average human driver's intelligence is not so high.
TheGhostofOtto1923
1 / 5 (1) Aug 03, 2014
Do you really think a driverless car will be let loose on our streets and allowed to choose where to drive when our roads have millions of places where even a human has a hard time knowing where to drive?
Yes.
The AI of a driverless car needs to be almost as good as a human to navigate roads as they currently are. I think every foot of road used by a driverless vehicle will have to be specially set up for it.
They are already better than we are. Their senses are sharper and they can monitor and track dozens of objects simultaneously. And they always know exactly where they are.

Even you can't do all that.
rockwolf1000
5 / 5 (1) Aug 04, 2014
@rockwolf1000:
Or it could be maximum stupidity strategy. Whoever is the biggest dumbass loses.
With your strategy you'll better not move at all.


Ok. As long as you stay cooped up in your mom's basement.
Ghostt
not rated yet Aug 04, 2014
It would be easier and more practical to program the machine to avoid such cases in the first place than to teach it ethics.
italba
not rated yet Aug 04, 2014
@rockwolf1000: You won! Oh, sorry...
Whoever is the biggest dumbass loses.
You lose!
antialias_physorg
not rated yet Aug 04, 2014
It would be easier and more practical to program the machine to avoid such cases in the first place than to teach it ethics.

Since the premise in the article is "in case of an UNavoidable crash" that doesn't really help.
You can always construct a case that will outwit a program. It's impossible to prepare for every eventuality (Even giving it some sort of 'intelligence and ethical framework': Humans have that and even for us it's impossible to make a 'right' decision in every eventuality. Especially when time constraints are in the sub-2-seconds range)
Pexeso
not rated yet Aug 04, 2014
Should your driverless car kill you to save a child's life?
Why not implement it as a customizable switch in the robot's firmware ("the age of person I should save", set in your account in case of an accident)? Moral problem solved.
antialias_physorg
not rated yet Aug 04, 2014
You know of an algorithm that can determine the age of a person? Or even one that can distinguish a human from an animal with high certainty?

Currently the law is: if there's an animal on the road: hit it - don't swerve.

TheGhostofOtto1923
1 / 5 (1) Aug 04, 2014
You know of an algorithm that can determine the age of a person? Or even one that can distinguish a human from an animal with high certainty?

Currently the law is: if there's an animal on the road: hit it - don't swerve.

This vid shows how AI can identify objects with high certainty.
http://youtu.be/4NkqY4AucYQ

-It may have some trouble distinguishing humans from bonobos but humans from dogs is not a problem.
rockwolf1000
not rated yet Aug 04, 2014
@rockwolf1000: You won! Oh, sorry...
Whoever is the biggest dumbass loses.
You lose!


Well with a brilliant response like that what else could be said?
Did you think that up all by yourself? Or did your mommy help you?
Do you think you could ever contribute something other than cheap insults?
What are you, a seven-year-old or something?

Grow up loser.
rockwolf1000
not rated yet Aug 04, 2014
@Moebius: Driverless cars don't drink alcohol, don't take drugs, don't answer calls or text on the phone while driving, and don't pretend to be in a drag race. And, if you have ever driven a car, you should know that the average human driver's intelligence is not so high.


Especially yours!

Thanks once again for pointing out the obvious Admiral Apparent! Your services are invaluable!
rockwolf1000
not rated yet Aug 04, 2014
When an injury to a human cannot be avoided, there could be a minimum-damage strategy. If the car hits the child, he will surely be killed; if it crashes into the side, the driver, protected by seat belts and airbags, might survive. Obviously, if the choice is between hitting the child or crashing a fully loaded car into an oncoming truck, the child should be hit.


If you want to potentially sacrifice your life in exchange for some idiot who does something totally stupid that's your decision.

But in your case I guess that really is a fair trade now isn't it?

If I were you I wouldn't advocate for the minimum damage strategy either. The loss of your life would be the definition of minimum damage.
kochevnik
3 / 5 (4) Aug 04, 2014
If the car has a sense of survival, it can also develop ulterior motives. If the owner treats it badly, it might be advantageous to obtain a new owner and dispense with the current title holder
italba
1 / 5 (1) Aug 04, 2014
@rockwolf1000: I would like to see you, instead of a child, in front of a car ruled by your Nazi-like macho behavior. I am sure you'd suddenly change from a rockwolf to a fluffycub and start crying for your mom. The more stupid trolls like you try to appear cynical and strong, the more flabby and rotten they are.
rockwolf1000
not rated yet Aug 05, 2014
@rockwolf1000: I would like to see you, instead of a child, in front of a car ruled by your Nazi-like macho behavior. I am sure you'd suddenly change from a rockwolf to a fluffycub and start crying for your mom. The more stupid trolls like you try to appear cynical and strong, the more flabby and rotten they are.


Were you born stupid or did something happen to your brain at some point?

Why not have your mental health specialist leave a comment on this thread with some helpful advice on how we should handle your outbursts?
rockwolf1000
not rated yet Aug 05, 2014
@Moebius: Driverless cars don't drink alcohol, don't take drugs, don't answer calls or text on the phone while driving, and don't pretend to be in a drag race. And, if you have ever driven a car, you should know that the average human driver's intelligence is not so high.


@italba

Driverless cars also don't eat Eggs Benedict or onion soup.
antialias_physorg
5 / 5 (1) Aug 05, 2014
Simple solution (warning... sarcasm ahead): Implant everyone with an RFID chip giving them a unique ranking of 'worth'. Kill the one with less worth.

While not entirely fair because one can argue incessantly what the 'worth' of an individual is it would certainly make for an objective - as pertains to the AI in the car - choice of action and freedom from any kinds of subsequent lawsuits.
[end of sarcasm]
italba
not rated yet Aug 05, 2014
Simple solution (warning... sarcasm ahead): Implant everyone with an RFID chip giving them a unique ranking of 'worth'. Kill the one with less worth.

While not entirely fair because one can argue incessantly what the 'worth' of an individual is it would certainly make for an objective - as pertains to the AI in the car - choice of action and freedom from any kinds of subsequent lawsuits.
[end of sarcasm]

(warning... sarcasm ahead): An RFID is not required. A "big brother" search engine could monitor everyone's life and adjust the worthiness rank in real time. The only problem would be with somebody "out of rank" like rockwolf/fluffycub; nothing could prevent the cars from automatically chasing him into his own home. [end of sarcasm]
italba
not rated yet Aug 05, 2014
@rockwolf10000/Fluffycub:
...Why not have your mental health specialist leave a comment on this thread with some helpful advice on how we should handle your outbursts?
We could ask your veterinarian the same about you.
italba
not rated yet Aug 05, 2014
@rockwolf10000/Fluffycub:
Driverless cars also don't eat Eggs Benedict or onion soup.
And THAT'S an intelligent and useful comment from an adult and well-educated scientist!
rockwolf1000
not rated yet Aug 05, 2014
@rockwolf10000/Fluffycub:
Driverless cars also don't eat Eggs Benedict or onion soup.
And THAT'S an intelligent and useful comment from an adult and well-educated scientist!


So you are insinuating that I'm offering un-adult-like comments while you make childish attempts at insults by making fun of my profile name??

Do you know what a hypocrite is?? http://en.wikiped...ypocrisy

For just a brief moment I'll stoop WAY WAY down to your level.

I did a Google search on your handle italba. It comes back as an acronym for an incestuous family nudist colony in Turkey. Care to explain?

Driverless cars also don't eat nachos and cheese and don't play backgammon FYI!
rockwolf1000
not rated yet Aug 05, 2014
@rockwolf10000/Fluffycub:
...Why not have your mental health specialist leave a comment on this thread with some helpful advice on how we should handle your outbursts?
We could ask your veterinarian the same about you.


Yes. Your doctor said you would say things like that. He's going to up your meds and your electro-shock therapy. Enjoy!

He's also considering chemical castration for you. That will be cool!

Did you know, since driverless cars don't eat or drink they rarely need to use the restroom?
italba
not rated yet Aug 06, 2014
@rockwolf1000/Fluffycub: Sorry, yesterday I called you rockwolf10000: I utterly overestimated you!

... insults by making fun of my profile name
No insult, and no making fun; just a plain observation. The more macho and adult somebody pretends to be, the more childish, and with opposite sexual tastes, they usually are, and you are confirming that with every new post.

I did a Google search on your handle italba. It comes back as an acronym for an incestuous family nudest colony in Turkey. Care to explain?
Sorry, I've never been to Turkey, and, with the government they have now, it's very unlikely that an "incestuous" colony could open there. Anyway, I did the same search on Google: I found a little village in Italy, a motocross track nearby, some companies in Italy, Brazil, Uruguay and Florida, and the Italy-Albania trade organization, but nothing like what you wrote about. Probably Google has learned your tastes and tries to give you the results it knows you are looking for.
italba
not rated yet Aug 06, 2014
@rockwolf1000/Fluffycub:
...He's going to up your meds and your electro-shock therapy. Enjoy!
You probably know from personal experience what you are talking about!

He's also considering chemical castration for you. That will be cool!
Be careful: for beasts like you the castration will not be chemical. That will be very uncool for you... if the veterinarian finds something to castrate, obviously. By the way, another sexual reference in your post! I was completely right when I classified you as a "fluffycub"!

Did you know, since driverless cars don't eat or drink they rarely need to use the restroom?

For just a brief moment I'll stoop WAY WAY down to your level.
And you don't know how to climb up again.
rockwolf1000
not rated yet Aug 06, 2014
@rockwolf1000/Fluffycub: Sorry, yesterday I called you rockwolf10000: I utterly overestimated you!


Yes, mistakes seem to be a common occurrence for you.

Much like your stupid minimum damage strategy. Face it, you're an idiot!

Your doctor suggests not speaking to you until after your lobotomy. I think I'll waste no more time on an absolute moron such as yourself. I looked at some of your previous comments in other threads, and you have absolutely nothing to offer.

And you don't know how to climb up again.
Listen, dummy: stooping is just bending at the hips and knees. Climbing is not required to resume a normal stance. Wow. I suggest more English lessons for you.
italba
not rated yet Aug 06, 2014
@rockwolf0.00001/Fluffycub:
Much like your stupid minimum damage strategy.
Only slightly better than your Neanderthal "the strongest kills everybody else" one.
Your doctor suggests not speaking to you til after your lobotomy.
I agree; you'd better talk with somebody at your own mental level.
Suggest more English lessons for you.
Sorry, English is not my main language. Not even yours, I think (Woof! Woof!).
rockwolf1000
not rated yet Aug 06, 2014
Only slightly better than your neanderthalian "the strongest kill everybody else" one.


Show me where I said that. Your reading comprehension is awful.

I agree; you'd better talk with somebody at your own mental level.


That's why I come to Phys.org. Not sure why you're here, though. Zero chance you could understand these articles.

Sorry, English is not my main language.


That much is obvious.

Not even yours, I think (Woof! Woof!).


We have already established that you are incapable of thinking. So once again you're completely wrong. That's becoming a theme with you.

Italba: The perpetually wrong and ignorant.
italba
1 / 5 (1) Aug 06, 2014
@rockwolf100000BC/Fluffycub:
Show where I said that.
Here
Obviously, nature never intended for children who not only run out in front of traffic, but fall down on flat ground, to reproduce.
And here
...stupid pedestrians could jump out in front at any moment and we would have a car that just sits there...
And here
If you want to potentially sacrifice your life in exchange for some idiot who does something totally stupid that's your decision.
That means, if you can think at all, that you claim for yourself the right to drive your big car even in a potentially dangerous situation and to squash anybody who unfortunately falls in front of you. And you would also judge them "stupid", "idiots", "unsuitable to reproduce", just like the Nazis did with Jews or Gypsies!
Zero chance you could understand these articles
And now the magic Fluffycub will see, with his paranormal powers, whether somebody can understand Phys.org's articles!
italba
1 / 5 (1) Aug 06, 2014
@rockwolf100000BC/Fluffycub:
We have already established that you are incapable of thinking.
"We"? Pluralis maiestatis or bipolar syndrome? Anyway, you seem totally incapable of thinking even about what you wrote; see the previous post. Or do you maybe have early-onset Alzheimer's, poor little Fluffycub?
So once again you're completely wrong.
Wrong about what? You chose a canid nickname, you are a fluffycub, you keep growling at everybody, and you claim that barking is not your main language?
That's becoming a theme with you.
I, as a human being, make a lot of mistakes, I know that. And you? Do you think you are infallible, that you never did anything wrong and never will? Do you perhaps claim to be a superhuman beyond good and evil, poor little Fluffycub?
Captain Stumpy
not rated yet Aug 07, 2014
you advocate yourself the right to run with your big car even in a potentially dangerous situation and to squash everybody ...Nazis did with Jews or Gypsies!
@italba
this does not say what you think it says... to ME it appears that Rockwolf is saying that people who ignore safety are likely to get hurt when they act stupid
it also appears that there is some hyperbole in there, but that may be difficult for you to catch being that English is not your first language
"We"? Pluralis maiestatis or bipolar syndrome?
it's called a figure of speech... also something common to English and difficult for other languages to pick up on when learning

Your syntax shows English is not your primary language, and the interpretation above shows it as well.
IMHO, RockWolf is saying shit happens, people should be more aware of safety, and the PRIMARY right of way on a ROAD is the CAR's, NOT people's... which is the CURRENT LAW here and almost everywhere else
Captain Stumpy
not rated yet Aug 07, 2014
And you would also judge them "stupid", "idiots", "unsuitable to reproduce", just like Nazis did with Jews or Gypsies!
@italba
just because a person advocates for the current definition of road and right of way per the current law does not mean that they are Nazi-like. You owe an apology for that remark: it was in poor taste and without merit.

The current law is that a ROAD is for vehicular use, and vehicles are far less agile, especially when underway, than a human is. That is why there are designated areas for vehicles and for people, and when the two MUST interact, there is usually a safety warning and designated area for pedestrians (Crosswalk).

Advocating for a car to protect the occupants is the SAME as advocating for the current law, which also states that the operator is responsible for the occupants. They MAY also be responsible for pedestrians, but ONLY if in designated areas.

I would advocate the same, based on the above IMHO.
italba
not rated yet Aug 07, 2014
@Captain Stumpy: Sorry, I can't agree with you. The primary law of all human activities should be to save and protect other human beings' lives and health, EVERY other human being's, not only those of people you think are acting correctly. When I ride my bike I always check whether somebody is coming into my lane from a crossroad, even if they would be coming the wrong way. I can't bet my life on the correctness of other drivers, don't you think? In the same way that I protect my own life, I (and automatic driving cars likewise) must try to save other people's lives, even if they are doing something wrong! If a car has a fault and catches fire, the occupants will get out, and you must not hit them! The law in my country (and in yours, I hope) states that you must keep your speed low enough that you can brake and stop your car before the farthest point you can see, even if that speed is lower than the posted limit. So you must go slowly at night, in fog or heavy rain, and when you approach a bend or a crossroad.
Captain Stumpy
not rated yet Aug 07, 2014
The primary law of all human activities should be
@italba
that is exactly what the modern road laws do, so you disagree with them?
I can't bet my life on the correctness of other drivers
Nor do I, but that is not the point. The point is (as you state above, but longer and more convoluted): drive as safety permits. that is the LAW. The law also states that pedestrians do NOT have right of way in the road (it is for CARS)
it also states that there are places FOR PEDESTRIAN traffic, and the laws require slower speeds and caution
when you say
I can't agree with you
then you are not agreeing with the current law
but what you really aren't agreeing with is the fact that stupid pedestrians that IGNORE the law can get hurt for their actions
We all know that. it is a fact of life... just like trying to stop a chainsaw with your hands is stupid, walking out in front of a car is stupid.

THAT is MY point. there are SAFETY laws for a reason
and I support them
italba
not rated yet Aug 07, 2014
@Captain Stumpy: This article is about ethics, not law. When we talk about human life, who is right and who is wrong is an absolutely futile argument. You will be ethically responsible for the life of every human being you hurt, even if a judge lets you go!

p.s. I knew the impersonal use of the "we" pronoun, thank you, but in Fluffycub's case the bipolar syndrome, or the "imaginary friend" who always agrees with him, is an all too probable hypothesis.
Captain Stumpy
not rated yet Aug 07, 2014
In the same way I protect my life I (and the automatic driving cars the same) must try to save other man's life
@italba
the communication error comes in here: you are assuming that because I (or another) advocate for the vehicle on the vehicle's right of way, then we are morally wrong. It is NOT about morals at this point, IMHO. It is about safety (always) and perception

The "ethical situation" above is basically one of safety. IF there is a person walking around in a tunnel, there is likely an accident, and therefore slower traffic.. OR, there is work being done, so slower traffic
There is not usually a kid playing in a tunnel... and if they are, WHO is responsible for their safety?

The road/tunnel/etc is a vehicle's right of way. The safety laws are written ... Obeying the law is what I would support in the above situation.

Captain Stumpy
not rated yet Aug 07, 2014
You will be ethically responsible for the life of every human being you hurt, even if a judge lets you go!
@italba
I know. been there. got the t-shirt and the scars to prove it
and perhaps the article was written about ethics, but it is also written about PEOPLE, so the law must also be considered.
A PERSON will likely react as I said (and Rockwolf said)... because of how they are trained and what they know per the law... there is NO TIME to consider ethics in this kind of situation
the bipolar syndrome or the "imaginary friend" who always agrees with him is an all too probable hypothesis
Nope. sorry. you both were being pissy with each other and so you lashed out.
Still doesn't justify that Nazi comment
italba
not rated yet Aug 07, 2014
@Captain Stumpy: You have the right, or rather the duty, not to obey the law blindly when your own or another person's life may be in danger! In a walled road or in a tunnel there SHOULD be no pedestrians at all, but you must act as if there COULD be one after every bend! If I don't drive that way I AM morally responsible, even if the law can't sentence me.
Captain Stumpy
not rated yet Aug 07, 2014
the duty not to obey the law blindly when your own or another person's life may be in danger!
@italba
I am not saying otherwise, BUT a person reacting in a situation like above (which is suddenly upon them) will also have mother nature to fight... Survival instinct. They will react to protect themselves first and foremost
you must act as if there COULD be one after every bend! If I don't drive that way I AM morally responsible, even if the law can't sentence me
I am NOT SAYING OTHERWISE
I am NOT advocating the mowing down of pedestrians! (and Rockwolf was using hyperbole, IMHO) I am just telling you that PEOPLE will react FIRST with their own safety (survival) and THEN consider others IF THERE IS TIME.
I've cleaned up far too many accidents proving this very thing. It is ingrained into people.

NO ONE is going to sacrifice themselves for the sake of another with only a single second to make a decision. Not even YOU.

I agree with SAFETY FIRST though
and I drive that way
italba
not rated yet Aug 07, 2014
@Captain Stumpy: Are you really saying that there is no time to consider ethics but there is time to consult the law? Ethics is not an abstract philosophical argument; it should be the main guide of everyone's life!

p.s. Say what you want, but "no right to reproduce" IS a Nazi argument.
italba
not rated yet Aug 07, 2014
@Captain Stumpy: We must synchronize our posts, you always answer while I am writing! And there is that stupid "not within 3 minutes" limit...
Captain Stumpy
not rated yet Aug 07, 2014
Are you really saying that there is no time to consider ethics but there is time to consult the law?
@italba
i said
PEOPLE will react FIRST with their own safety (survival) and THEN consider others IF THERE IS TIME
&
because it is how they are trained and likely will react. and because of what they know per the law... there is NO TIME to consider ethics in this kind of situation
the way you LEARNED to drive, along with the way you are reflects HOW you will drive. AND there is also the survival instinct!
People LEARN the laws first... and SOME learn/think about/discuss ethical situations (like above) BUT then there is SURVIVAL instinct.
NO ONE has time to think much at all... only REACT
like I said above
"no right to reproduce" IS a Nazi's argument
he never used those words AND the comment he DID use was HYPERBOLE - an obvious exaggeration NOT meant to be taken literally (probably some sarcasm there too)
it is MISCOMMUNICATION due to language barriers
Captain Stumpy
not rated yet Aug 07, 2014
@Captain Stumpy: We must synchronize our posts, you always answer while I am writing! And there is that stupid "not within 3 minutes" limit...

that is why there is a quote function... to keep the comments clear and concise... and to let someone know WHAT you are referring to
Ethics is not an abstract philosophical argument, it should be the main guide of everyone's life!
EVERYONE has to take a test on the LAWS of driving (so they MUST learn them) before getting a license...

how many countries make them test on ethics? make them discuss things like above before you get a license?

italba
not rated yet Aug 07, 2014
@Captain Stumpy: The survival instinct drives you, for instance, to brake when you foresee a danger, even if you are in the middle of a bend on a wet road. It's your education that prevents you from acting that way! In the same way, your education should prevent you from hurting a pedestrian, even if he is in the middle of a highway, and even if you risk a crash because of it. If there is no time to think about ethics, there is no time to foresee the consequences of your crash either. You can't say "I will surely be killed in that crash" (except when you would fly into a deep ravine or something like that), so you should be trained (I'm not either, but this would be the right way) to rather risk a crash than hit a pedestrian.

p.s. "nature never intended for children who not only run out in front of traffic, but fall down on flat ground, to reproduce" is a very strange hyperbole, if that's what it really is.
Captain Stumpy
not rated yet Aug 07, 2014
so you should be trained (I'm not either, but this would be the right way) to rather risk a crash than hit a pedestrian.
@italba
should be are the key words.
Most people are NOT well trained. Most people do not go through ethical-situation training either. That is usually reserved for more professional training (and even that is limited, depending on the training)

it's a very strange hyperbole, if it really is.
IMHO - It is
and yes, you can say it is strange, or sarcastic... even twisted

miscommunication between languages
there are MANY things like this that just don't translate well to other languages (you should see some of the Monty Python translations that are out there in other languages, like Japanese)

will check back later
PEACE

nowhere
5 / 5 (1) Aug 07, 2014
@Italba
At the same time, your education should prevent you from hurting a pedestrian, would even be he in the middle of a highway, would even you risk a crash for this.

As I understand it, the point of this article concerns what happens once all the principles of safety have been applied and we are still in a position where only one life may be saved. Once we add a self-driving car to the scenario it becomes quite clear that, due to its significantly higher safety capabilities, the fault must lie with the pedestrian or an external party. In this case there is no reason the car should sacrifice its occupant.
antialias_physorg
5 / 5 (1) Aug 07, 2014
One could argue from a simulation principle:

The car can simulate what will happen to it and its occupants. The car cannot simulate what will happen to the other party concerned (as it cannot foresee what the other party will do).
In the case of a safety choice: safety of the occupants can be assured while safety of the other party cannot. As it is a life-for-a-life scenario, the sure choice is better than potentially risking both lives (unless you want to bring comparative judgements of human worth into the picture, which is very iffy ethical ground to tread on).
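This "sure choice" reasoning can be sketched as a pessimistic scoring rule. The Python below is a purely hypothetical illustration (the class, the maneuver descriptions, and all harm/certainty numbers are invented, not any real vehicle's logic): whatever fraction of an outcome the car cannot simulate is assumed to go as badly as possible, so a predictable option can win even when its nominal harm is higher.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    predicted_harm: float  # expected harm, 0.0 (none) to 1.0 (fatal)
    certainty: float       # how reliably the car can simulate this outcome

def choose_maneuver(outcomes):
    """Pick the outcome with the best pessimistic score: the unsimulated
    fraction of each outcome is assumed to be maximally harmful (1.0)."""
    def worst_case(o):
        return o.predicted_harm * o.certainty + 1.0 * (1.0 - o.certainty)
    return min(outcomes, key=worst_case)

# The occupants' fate can be simulated almost exactly; the other party's cannot.
sure = Outcome("controlled stop, occupants protected", predicted_harm=0.5, certainty=0.95)
risky = Outcome("swerve toward the unpredictable party", predicted_harm=0.1, certainty=0.3)

print(choose_maneuver([sure, risky]).description)  # -> controlled stop, occupants protected
```

Under this rule the risky swerve scores 0.1 × 0.3 + 0.7 = 0.73 and loses to the controlled stop at 0.5 × 0.95 + 0.05 = 0.525, mirroring the argument above without assigning comparative worth to the people involved.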
italba
not rated yet Aug 07, 2014
Let's take another example. A plane breaks its engine flying over a crowded beach. The pilot could try to land on the beach, saving his life but threatening the lives of the people on the beach, or he could ditch in the sea. What would a good pilot do? I'm quite sure he would ditch in the sea. Otherwise, if he only cared about his own life, we'd call him a coward. If I travel in a vehicle, I have to accept the risks of travel and try to do everything not to threaten anybody else's life.
italba
not rated yet Aug 07, 2014
@Captain Stumpy: I do agree with you, it's a matter of training. But now the question is: Should a self-driving car be programmed to act as a well-trained professional pilot or as the "average Joe" human driver?
Captain Stumpy
not rated yet Aug 07, 2014
plane break its engine flying over a crowded beach
@italba
I assume you mean a low-level flight? With height you can add reaction time and glide time/ratios. A trained pilot will try to minimize the death toll (and they are trained to ditch into the water as safely as possible)
This is also COMPLETELY different
and an AIRCRAFT has a route wherever there is air and lands at designated points... a MORE ACCURATE scenario would be: taking off with people on the tarmac/runway
AND the aircraft pilot will do as trained - which might jeopardize the pedestrians, but the pedestrians are in an area that they are NOT supposed to be in, that is regulated by law, and is a safety hazard (which makes it like the above ethical scenario)

If you are going to pick different scenarios, you must make them similar and relevant

-to be continued
Captain Stumpy
not rated yet Aug 07, 2014
But now the question is: Should a self-driving car be programmed to act as a well trained professional pilot or as the "average Joe" human driver?
@italba
it should act as prescribed by the law to minimize danger, hazards and loss of life and maximize safety, which was my point (and I believe Rockwolfs as well)

The vehicle is in its designated area
antialias_physorg puts it very well with
In the case of a safety-choice: Safety of the occupants can be assured while safety of the other party cannot. As it is a life-for-a-life scenario the sure choice is better than potentially risking both lives
This is likely how the initial programming will read as well, UNTIL better algorithms show a safer reaction for all parties

Again, it is always BEST to take the route most likely to succeed at maximizing SAFETY and minimizing loss of life that CAN be controlled (can be controlled being the key words there)

Thanks antialias_physorg and nowhere for your clarity and cogent remarks
Captain Stumpy
not rated yet Aug 07, 2014
LASTLY! with regard to this
if he will only care about his own life we'll call him coward
@italba
you cannot justify this remark UNLESS there is enough time to react to the situation, because in times of short reaction time and high stress, MOST ALL PEOPLE will react with instinct (or heavily indoctrinated training... and I DO mean HEAVILY indoctrinated... as in a reaction to a known situation that is without thought, like a soldier, cop, firefighter, medic, etc)

Another HIGHLY relevant point from nowhere is
Once we add a self driving car to the scenario it becomes quite clear that, due to its significantly higher safety capabilities, the fault must lie with the pedestrian or an external party. In this case there is no reason the car should sacrifice its occupant
machines are predictable because they are based on limited or finite programming situations
HUMANS are unpredictable, and thus the outlier

program reactions to the KNOWN and controllable first
(as I posted above)
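The "program reactions to the KNOWN and controllable first" point can be illustrated with a toy dispatch table; every situation name and response string below is invented for the sketch. Recognized situations get deterministic responses, and anything the classifier has never seen falls through to the safest fully controllable action rather than an improvised swerve.

```python
# Deterministic responses for recognized situations; a single conservative
# fallback for everything else. Purely illustrative names and actions.
KNOWN_RESPONSES = {
    "obstacle_in_lane": "brake hard, hold lane",
    "pedestrian_in_crosswalk": "yield and stop",
    "emergency_vehicle_behind": "slow down and pull over",
}

def react(situation: str) -> str:
    # Unclassified (unpredictable) events get the safest controllable action.
    return KNOWN_RESPONSES.get(situation, "brake hard, hold lane")

print(react("pedestrian_in_crosswalk"))  # -> yield and stop
print(react("child_trips_in_tunnel"))    # -> brake hard, hold lane (fallback)
```

The design choice is exactly the point made above: the human in the scenario is the outlier, so the unknown case is handled by the most predictable action the car can guarantee, not by a case-by-case ethical judgment.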
italba
not rated yet Aug 07, 2014
@Captain Stumpy:
I assume you mean in a low level flight?
Yes, a low level flight in a small plane, for instance a coast guard or a tourist flight. Commercial flights (I hope) should never be in such a situation.
A trained pilot will try to minimize the death toll ... [the self driving car] should act as prescribed by the law to minimize danger, hazards and loss of life and maximize safety, which was my point (and I believe Rockwolfs as well)
That's exactly what I wrote in my first post: "When an injury to a human cannot be avoided, there could be a minimum damage strategy." But Rockwolf doesn't seem to agree.
you cannot justify this remark UNLESS there is enough time to react...
That's not the point. A human driver is not a computer, and he should not be considered guilty for an accident; but the article is about a self-driving car. What kind of ethical rules should we wire into it? I think the previous answer could be the right one.
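The "minimum damage strategy" mentioned in the thread could be prototyped as a simple total-harm minimization that ignores fault entirely; the maneuver names and harm numbers below are invented for illustration only.

```python
# Score each feasible maneuver by the total expected injury to EVERY party
# (occupants and pedestrians alike), then pick the lowest. Invented numbers.
def total_expected_harm(maneuver):
    return sum(maneuver["harms"].values())

def minimum_damage(maneuvers):
    return min(maneuvers, key=total_expected_harm)

options = [
    {"name": "continue, hit pedestrian", "harms": {"pedestrian": 0.9, "occupant": 0.1}},
    {"name": "swerve into wall",         "harms": {"pedestrian": 0.0, "occupant": 0.8}},
    {"name": "hard brake in lane",       "harms": {"pedestrian": 0.5, "occupant": 0.2}},
]

print(minimum_damage(options)["name"])  # -> hard brake in lane
```

Note that if braking were unavailable, this rule would choose the wall (total 0.8 versus 1.0), which is exactly the occupant-sacrifice outcome the other side of the thread objects to: the disagreement is about the weighting, not the mechanics.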
italba
5 / 5 (1) Aug 07, 2014
@Captain Stumpy:
Another HIGHLY relevant point from nowhere is...the fault must lie with the pedestrian or an external party.
True but not relevant. A good driver, and an automatic car too, should try to minimize loss of life and injuries, no matter who is right and who is wrong. Insurance companies and judges will debate that.
machines are predictable because they are based on limited or finite programming situations HUMANS are unpredictable
I am sure that someday in the coming centuries human driving on open streets will be prohibited as unsafe and useless, but until that day human-driven and self-driving cars must coexist. We could have automatic-drive-only streets, but that's another story.
Captain Stumpy
not rated yet Aug 07, 2014
A good driver, and an automatic car too, should try to minimize loss of life and injuries, no matter who is right and who is wrong. Insurance companies and judges will debate that
@italba
this I can completely agree with
What kind of ethic rules should we wire in it?
to take the route most likely to succeed at maximizing SAFETY and minimizing loss of life that CAN be controlled, and antialias_physorg's post seems to fit the bill for what we've been talking about
but Rockwolf doesn't seem to agree
like I said... it seems to me to be miscommunication and then you two both got angry. not a big deal, really.

now that we are past that let's just try to make peace and move on. We all get defensive and grouchy from time to time (I do it a lot when people post pseudoscience)

Captain Stumpy
not rated yet Aug 07, 2014
A good driver, and an automatic car too, should try to minimize loss of life and injuries, no matter who is right and who is wrong. Insurance companies and judges will debate that.

this (and our above comments back and forth) is a perfect example of miscommunication and the language barrier, if you don't mind me pointing it out

We've been saying the same thing but arguing over which version was correct!
it boils down to this
A good driver, and an automatic car too, should try to minimize loss of life and injuries, no matter who is right and who is wrong
this is the same thing as this
it should act as prescribed by the law to minimize danger, hazards and loss of life and maximize safety
as well as the posts by antialias_physorg and nowhere
same thing-different perspectives/language
:-) PEACE
ryggesogn2
1 / 5 (2) Aug 07, 2014
" a more interesting question is: who should decide how the car reacts in difficult ethical situations?"

A more interesting question is who will be held responsible and sued?

This will be another example of how corporations will demand an exemption from any liability from the state.
The state of NM granted such exemptions to promote their spaceport.
Socialism in action, again.
ryggesogn2
1 / 5 (2) Aug 07, 2014
""Automaker liability is likely to increase. Crashes are much more likely to be viewed as the fault of the car and the manufacturer," Anderson said. "If you're an automaker and you know you're going to be sued [more frequently], you're going to have reservations.… The legal liability test doesn't take into account the long-run benefits."

In other words, even though a technology is an overall boon to the greater good, its rare instances of failure—and subsequent lawsuits—won't take that into account. That could slow the movement of driverless cars to the mass market if automakers are wary of legal battles."
http://www.nation...20140325
bluehigh
not rated yet Aug 07, 2014
I guess it sticks in my mind that it's cheaper and less bothersome to hit the pedestrian. Well, here in Australia anyway. We are compelled to pay for Third Party *Personal* Injury Insurance (CTP). No CTP, no drive. However, it's fine and lawful to drive without any *property* damage insurance. Almost an incentive to avoid damaging your own or another car. So, all other conditions being lawful (no drink, no drugs), it's much cheaper and less trouble to collide with the pedestrian. In fact it happens quite often. So much so that we have CTP, which makes it cheaper and less trouble to collide ...
italba
not rated yet Aug 08, 2014
@ryggesogn2:
A more interesting question is who will be held responsible and sued?
Automakers will surely require that a human driver always be present in the driver's seat, ready to take over from the automatic drive system. Planes and ships have had autopilots for many years; it's the same thing.
Automaker liability is likely to increase...
Our cars are already filled with electronic, electrical or mechanical parts that can fail. That never prevented automakers from selling their cars.
italba
not rated yet Aug 08, 2014
@bluehigh:
... it's fine and lawful to drive without any *property* damage insurance...
I'm sorry to say it, but it's a very stupid law. You can easily do multimillion-dollar property damage with your car; how can you let somebody drive without damage insurance? If you want to keep an "incentive not to damage your own and other cars", just exempt insurance companies from paying for small damages up to a fixed sum.
italba
not rated yet Aug 08, 2014
@Captain Stumpy: We agree on just about everything but one word: "law". A law is just a tool; it cannot cover everything, and in some circumstances it simply doesn't work. For instance, no law could punish a pilot for not crashing his plane instead of attempting a dangerous emergency landing, but that pilot's behaviour would be ethically questionable anyway.
nowhere
not rated yet Aug 08, 2014
This will be another example of how corporations will demand an exemption from any liability from the state.
The state of NM granted such exemptions to promote their spaceport.
Socialism in action, again.

Thank goodness for socialism, which forces corporations to maintain an appropriate level of safety in exchange for exemption from liability. This way the industry can thrive while consumers are protected.