Reaction time isn't the problem. Cars can't make moral decisions.

It's not uncommon for a pedestrian to jump into the road (sometimes intentionally, to commit suicide). This happens to me personally on a daily basis in San Francisco. Sometimes the driver faces a moral choice: either run the car off the road, killing people on the sidewalk, or hit the pedestrian.

What do you expect a self-driving car to do in that situation?




"Cars can't make moral decisions."

There are moral problems that I'd be uncomfortable putting to a computer.

This isn't one of them. Am I missing something? Is there something wrong with simply modeling likely impacts and trying to cause the least harm to the fewest people? That's what I would do, only with a worse reaction time and less information.
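
(To illustrate what I mean: here's a minimal sketch, in Python, of that kind of expected-harm minimization over candidate maneuvers. The maneuvers, probabilities, and harm estimates are entirely hypothetical placeholders, not anything a real planner actually models.)

```python
# Minimal sketch: pick the maneuver with the lowest expected harm.
# All maneuvers, probabilities, and harm figures below are hypothetical.

from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    collision_probability: float  # estimated P(collision) for this maneuver
    people_at_risk: int           # estimated people harmed if it collides


def expected_harm(m: Maneuver) -> float:
    """Expected number of people harmed: P(collision) * people at risk."""
    return m.collision_probability * m.people_at_risk


def least_harm(options: list[Maneuver]) -> Maneuver:
    """Choose the candidate maneuver minimizing expected harm."""
    return min(options, key=expected_harm)


if __name__ == "__main__":
    options = [
        Maneuver("brake hard in lane", collision_probability=0.6, people_at_risk=1),
        Maneuver("swerve onto sidewalk", collision_probability=0.9, people_at_risk=3),
    ]
    choice = least_harm(options)
    print(f"chosen: {choice.name} (expected harm {expected_harm(choice):.2f})")
```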


Wait... what? People try to kill themselves under your car daily? Are you confusing reality with GTA?



