It's not an exaggeration, it's a non-sequitur: you first have to show that LLMs reason in the same way humans do.