unearth.wiki

Ungrounded Generation

/ʌnˈgraʊn.dɪd ˌdʒɛn.əˈreɪ.ʃən/ Precise technical term for "Hallucination."
Definition

The production of syntactically coherent and semantically plausible content that lacks factual basis in the shared physical world. Unlike "hallucination" (which implies a perceptual error), this term correctly identifies the failure as a lack of grounding: the system is generating statistically probable sequences that happen to be factually false.

Plausibility Over Truth

LLMs are optimized for plausibility (Does this sound like something a human would say?) rather than truth (Does this correspond to reality?). Therefore, Ungrounded Generation is not a bug; it is a feature of the architecture. The model is successfully doing what it was trained to do: predict the next likely token.
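The point can be made concrete with a toy sketch of next-token selection. The token names and logit values below are invented for illustration, not taken from any real model: the decoder scores candidates by plausibility alone, so a fluent but false continuation can outrank the true one.

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical logits for the token following "The capital of Australia is".
# The scores measure plausibility, not truth: "Sydney" is a highly plausible
# continuation even though the factually correct answer is "Canberra".
logits = {"Canberra": 2.0, "Sydney": 2.5, "Melbourne": 1.0, "banana": -4.0}

probs = softmax(logits)

# Greedy decoding picks the most probable token; nothing in this step
# consults reality, so the ungrounded answer wins.
choice = max(probs, key=probs.get)
```

Under these made-up numbers, greedy decoding returns "Sydney": a successful prediction by the model's own objective, and an ungrounded generation by ours.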

The "Bullshit" Theory

Philosopher Harry Frankfurt defined "bullshit" not as lying (which requires knowing the truth and concealing it) but as indifference to the truth. In this strict philosophical sense, Ungrounded Generation is the automated production of bullshit. The model isn't lying; it has no relationship to the truth at all. It is simply filling the silence with something that sounds good.

Field Notes & Ephemera

Diagnostic Shift: Calling it "Ungrounded Generation" reminds the user that the missing piece is Grounding, which only the human can provide. It shifts the focus from "fixing the bot" to "steering the system."
Stratigraphy (Related Concepts)
Hallucination Crisis
The Reality Principle
Vicarious Grounding
Dream Logic
The Sycophancy Problem