LLMs definitely have this, and it really is bizarre to me that people think otherwise.
> Cannot continually learn and incorporate that learning into their model.
This is definitely a valid criticism of our current LLMs, and once we (further) develop ways to do this, I think my main criticism of LLMs as AGI will go away.
> Cannot reason on any deep level.
Few people are able to do this, either.
> Cannot engage in unprompted reflection in the background.
True, but I don't know if that belongs as a requirement for AGI.