"Hallucination" doesn't even mean anything when the machine has zero concept of truth. It's built to generate text that mimics human-produced text. There's a basic comprehension gap about what LLMs actually are, and the people selling them have no incentive to clarify things.
They picked "Hallucination" because it makes it sound like there's an internal process, that the computer is making a mistake or something. Just admitting that it's all be GIGO the entire time would be pretty fatal to their valuation.