
What is the decision? The car will follow the rules of the road exactly. If the car behind it continues to barrel forward, that driver did not allow adequate stopping distance. In this scenario, that driver is at fault.

Of course that does nothing for the safety of the child, but just as we don't expect humans to be able to account for every single scenario, neither can we require a computer to do so.

For every one of these supposed ethical dilemmas, replace the computer with the most pedantic driver imaginable: this person has literally memorized the driver's manual. Ask that person how they would respond in that situation. That is how the computer would respond. There are no choices to be made: the choices have already been made by the legislators who enacted the driving laws.

I imagine the first set of self-driving cars will require a huge bumper sticker (like the ones they put on learner cars: "STUDENT DRIVER") that indicates this car will not exceed the speed limit and will not deviate from the rules. Everyone else can adapt accordingly.




There's an implicit decision there - choosing between you (the passenger in the autonomous vehicle) and the people on the street. You can't have your cake and eat it too. Fault isn't what's at stake here.

We actually DO require the computer to account for every single scenario - since we DO require humans to do so. Nobody says "Oh, it's okay that they <<insert some negative driving consequence>> - nobody expected that they'd be able to handle it!"

Driving laws written by legislators aren't how people actually interact with roadways, except as a basic framework.

I do agree with the self-identification of autonomous vehicles, especially if they replace LIDAR. However, there are already highway-capable autonomous systems driving around right now (Teslas are an example). There's just no test case yet.


>There's an implicit decision there - choosing between you (the passenger in the autonomous vehicle) and the people on the street.

Does the law ever require a person to break the law? That is the decision you've proposed the computer make. This is a serious question, not a pithy comment. Any action other than immediately stopping breaks the law: swerving into another lane involves changing lanes without signaling or entering oncoming traffic, and swerving onto the sidewalk breaks the law as well.

I contend that we do very often say "this situation is unfortunate, but you acted on your instincts as they have been developed within the confines of traffic safety law."

I also contend that fault is at stake, and only fault can be considered at stake. We do not expect people to correct other humans' mistakes by breaking the law, correct? So why would we expect a computer to correct for all humans' mistakes? There will certainly be tragedies involving autonomous vehicles, but to count those as a mark against the technology is to hold it to a standard we have never applied to any other technology.

The computer merely acts in accordance with traffic laws. That there are two possible outcomes (passenger injury and pedestrian injury) does not imply that a choice has been made. In fact, those are not the only two outcomes; we've merely whittled it down to those for the sake of discussion.



