
> Generalized understanding of the world.

LLMs definitely have this, and it really is bizarre to me that people think otherwise.

> Cannot continually learn and incorporate that learning into their model.

This is definitely a valid criticism of our current LLMs, and once we (further) develop ways to do this, I think my main criticism of LLMs as AGI will go away.

> Cannot reason on any deep level.

Few people are able to do this, either.

> Cannot engage in unprompted reflection in the background.

True, but I don't know if that belongs as a requirement to be AGI.

