
I think fail-safe-human laws are only useful for licensing research.

For day-to-day driving, cars should either be fully autonomous or have their automated systems limited to intervening in dangerous situations (like current brake-priming systems). If the driver is able to take primary control, they should be required to be in control.




Tricky. There may be situations where the car stops and says "I can't do this", at which point primary control is handed over to the driver. Think about the system being damaged or obstructed, or a protest rally starting in the street up ahead. The driver should be able to take over, maybe after a stop.

Given this "stop, your turn" option, IMHO it would be horrible to also deny the driver from taking over during normal driving conditions, seeing as though computers will certainly make mistakes. Allowing that option to exist, however, should not equate to the human 'driver' being liable for not taking control when the computer failed.

Basically, you should give passengers an emergency stop option and not make them liable for not using it.


My thinking about this is based on humans being very bad at handling situations where they don't really need to pay attention. If the system is only good enough that an attentive driver/passenger can use it safely, then it needs to be made better before it is licensed for everyday use.

The stop to change control thing makes sense to me.



