
"There are no LLMS that reason" is a claim about language, namely that the word 'reason' can only ever be applied to humans.



Not at all. We are building conceptual reasoning machines, but it is an entirely different technology from GPT/LLM deep learning etc. [1]

[1] https://graphmetrix.com/trinpod-server


Conceptual reasoning machines rely on concrete, explicit, and intelligible concepts and rules. People like this because it 'looks' like reasoning on the inside.

However, our brains, like language models, rely on implicit, distributed representations of concepts and rules.

So the intelligible representations of conceptual reasoning machines may be too strong a requirement for 'reasoning', unless you want to exclude humans too.
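
To make the contrast concrete, here is a toy sketch (all names invented for illustration, not tied to any product in this thread): an explicit rule engine whose derivation is fully inspectable, next to a distributed representation where the 'rule' only shows up as graded vector similarity.

    import numpy as np

    # Explicit symbolic reasoning: every step is inspectable.
    rules = [
        # (premise, conclusion) pairs -- toy modus ponens
        (("socrates", "is_a", "human"), ("socrates", "is", "mortal")),
    ]
    facts = {("socrates", "is_a", "human")}
    for premise, conclusion in rules:
        if premise in facts:
            facts.add(conclusion)
    print(facts)  # the derivation trace is readable off the program

    # Implicit distributed reasoning: the "rule" is smeared across a vector space.
    rng = np.random.default_rng(0)
    human = rng.normal(size=300)
    socrates = 0.9 * human + 0.1 * rng.normal(size=300)  # a noisy "human-like" vector
    cos = socrates @ human / (np.linalg.norm(socrates) * np.linalg.norm(human))
    print(cos > 0.5)  # graded similarity, no discrete rule you can point to

The first system gives you an audit trail; the second gives you behavior that often matches the rule without ever representing it explicitly. That is the sense in which demanding intelligible internals would exclude brains too.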


It’s also possible that you do not have information on our technology, which models conceptual awareness of matter and change through space-time, and which differs from any previous attempt.


Is it possible that you don't quite understand LLMs?


That's called talking your book. Please spell out how a document indexing and retrieval system is more of a "conceptual reasoning machine" than o3.


If LLMs can't reason, then this cannot either - whatever this is supposed to be. Not a good argument. Also, since you're apparently working on that product: 'It is difficult to get a man to understand something when his salary depends on his not understanding it.'



