
> There isn't really such a thing as a "hallucination" and honestly I think people should be using the word less. Whether an LLM tells you the sky is blue or the sky is purple, it's not doing anything different. It's just spitting out a sequence of characters that it was trained to produce in the hope that it's what a user wants. There is no definable failure state you can call a "hallucination," it's operating as correctly as any other output.

This is a very "closed world" view of the phenomenon, one that treats the LLM as a software component in isolation.

But "hallucination" is a user experience problem, and it describes the experience very well. If you are using a code assistant and it suggests using APIs that don't exist then the word "hallucination" is entirely appropriate.

A vaguely similar analogy is the addition of the `let` and `const` keywords in JS ES6. While the behavior of `var` was "correct" as per spec, the user experience was horrible: bug-prone and confusing.
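A minimal sketch of the classic pitfall, the closure-in-a-loop bug caused by `var`'s function scoping:

    // With `var`, the loop variable is function-scoped and hoisted,
    // so every callback closes over the same single binding.
    for (var i = 0; i < 3; i++) {
      setTimeout(() => console.log(i)); // logs 3, 3, 3
    }

    // With `let` (ES6), each iteration gets a fresh block-scoped binding.
    for (let i = 0; i < 3; i++) {
      setTimeout(() => console.log(i)); // logs 0, 1, 2
    }

In both cases the engine is executing the spec faithfully; the first version is still a bug from the programmer's point of view, which is exactly the sense in which "hallucination" names a real failure mode.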


