I’m on the opposite end of the spectrum. I’m almost certain that this is going to end extremely badly for the majority of humanity, and for programmers in particular.

I think there’s a less than 5% chance that this goes well, and only if a series of things go extremely well. And frankly, we’re tracking along the extremely bad path so far.




We barely survived one nuclear arms race, and this could hand every nation state a new class of powerful weapons every five years or so, through the inevitable scaling of energy and weapons capabilities. I agree we're on one of the worst timelines for AI/AGI/ASI, with the world actively being run into the ground by short-sighted 'dementia-ocracies' and every security risk about to increase dramatically.



