Hacker News

You might be conflating the epistemological point with Turing's test and the like. I could not agree more that indistinguishability is a key metric. These days it is still possible (at least for me) to distinguish LLM output from that of a thinking human, but in the future that could change. Whether LLMs "think" is not an interesting question, because these are algorithms, people. Algorithms do not think.




