I wonder if people are wrongly just going to start calling random, completely-unrelated-to-LLM software bugs "hallucinations". Very similar to how the meaning of "AI" has shifted in recent years to include basically any software with some kind of algorithm or heuristic built into it.
