Chomsky said that LLMs are statistical regurgitators, which means they can never actually reason or explain, which is what language understanding requires. In other words, they are the wrong model of computation by definition.
It's an interesting position and I'm sympathetic to it; he could turn out to be partly right in the end.
But why assume a regurgitator can't have internal representations? Sometimes the best way to regurgitate is to learn an internal representation, and learning one doesn't mean the model suddenly stopped being statistical.
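To make that point concrete, here's a minimal sketch (assuming PyTorch is available; the toy grammar, model, and probe are all illustrative choices, not anything from the discussion above): a small GRU is trained purely on next-token prediction over balanced-parenthesis strings, and then a linear probe checks whether its hidden state happens to encode nesting depth, a representation nobody supervised directly, only the statistical objective induced it.

```python
# Hypothetical toy: does pure next-token prediction induce an internal representation?
import random
import torch
import torch.nn as nn

VOCAB = {"(": 0, ")": 1, ".": 2}  # "." marks end of string

def sample_string(max_depth=8):
    """Generate a random balanced-parenthesis string terminated by '.'."""
    s, depth = [], 0
    while True:
        if depth == 0 and s and random.random() < 0.2:
            break
        if depth < max_depth and (depth == 0 or random.random() < 0.5):
            s.append("("); depth += 1
        else:
            s.append(")"); depth -= 1
    return "".join(s) + "."

class NextTokenGRU(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.emb = nn.Embedding(3, 8)
        self.rnn = nn.GRU(8, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 3)
    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return self.out(h), h  # logits over next token, plus hidden states

model = NextTokenGRU()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Train on nothing but "predict the next character".
for step in range(2000):
    s = sample_string()
    ids = torch.tensor([[VOCAB[c] for c in s]])
    logits, _ = model(ids[:, :-1])
    loss = loss_fn(logits.squeeze(0), ids[0, 1:])
    opt.zero_grad()
    loss.backward()
    opt.step()

# Probe: can a linear readout of the hidden state recover nesting depth?
with torch.no_grad():
    feats, depths = [], []
    for _ in range(200):
        s = sample_string()
        ids = torch.tensor([[VOCAB[c] for c in s]])
        _, h = model(ids)
        d = 0
        for t, c in enumerate(s):
            d += 1 if c == "(" else -1 if c == ")" else 0
            feats.append(h[0, t])
            depths.append(float(d))
    X = torch.stack(feats)
    Xb = torch.cat([X, torch.ones(len(X), 1)], dim=1)  # add bias column
    y = torch.tensor(depths)
    w = torch.linalg.lstsq(Xb, y.unsqueeze(1)).solution  # least-squares probe
    pred = (Xb @ w).squeeze(1)
    corr = torch.corrcoef(torch.stack([pred, y]))[0, 1]
    print(f"linear probe correlation with nesting depth: {corr:.2f}")
```

The objective never mentions depth; if the probe correlation comes out high, that's the hidden state organizing itself around a structural quantity simply because doing so is the cheapest way to predict the next token. It's still a statistical model the whole time.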