
If we required really strict adherence to that rule as the bar for research, there would be no scientific progress at all. I'm not convinced that would be desirable.



Well, the clever solution here is not to demand a halt to all AI research, but rather to speed it up, reducing the chance that a single bad actor gets too far ahead... i.e., to reach "post-singularity" ASAP, and safely.

Definitely bold... might be just crazy enough to work! Would love to see the arguments laid out in a white paper.

Reminds me of the question of how far ahead the NSA is in cryptology compared to the open research community.



