
Realistically, though, a superintelligent AI would require far more fundamental breakthroughs. People who say otherwise should take a look at the open problems in control theory.

The effect of scaling things up should be interesting, though. Perhaps there is a phase-transition-like threshold where something remarkable suddenly happens. It also means that, sadly, universities would no longer be able to provide adequate resources for this kind of research, since experiments at that scale demand industrial compute budgets.




What makes you say that? It may seem that it would take a lot of work, but really, how can we know? Difficult problems often seem obvious in retrospect.





