Associative, Not Logical
Unlike a calculator (which follows rigid logical rules) or a database (which retrieves static facts), an AI operating in Dream Logic moves through "semantic space." If you ask it about a tiger, it might hallucinate that the tiger has wings because, in the high-dimensional vector space of "mythology," tigers and flying creatures are close neighbors.
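The "close neighbors in semantic space" idea can be sketched with a toy example. The vectors and vocabulary below are invented purely for illustration (real models use learned embeddings with hundreds or thousands of dimensions), but the mechanism is the same: association follows geometric nearness, not logical truth.

```python
import math

# Hypothetical 3-d embeddings, invented for illustration only.
# Dimensions loosely stand for "big cat", "mythic beast", "flight".
embeddings = {
    "tiger":   [0.9, 0.8, 0.1],
    "wings":   [0.4, 0.7, 0.9],
    "invoice": [-0.2, 0.1, -0.8],  # an unrelated concept
}

def cosine(a, b):
    """Cosine similarity: ~1.0 = same direction, <=0 = unrelated/opposed."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# An associative system follows nearness: "tiger" sits far closer to
# "wings" than to "invoice", so a winged tiger is an easy step to take.
print(cosine(embeddings["tiger"], embeddings["wings"]))
print(cosine(embeddings["tiger"], embeddings["invoice"]))
```

Nothing in this space marks "tigers have wings" as false; the model only sees that the two concepts are nearby, which is exactly the associative drift described above.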
The Absence of the "Wake-Up Call"
Humans dream, but we wake up. Waking up is the confrontation with the Reality Principle—the stubbed toe, the gravity that won't let us fly. AI systems currently have no "waking state." They are permanently in the dream. This is why "Hallucination" is a misnomer; for the AI, the hallucination is just as "real" (statistically probable) as the truth.
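The claim that a hallucination is "just as real" to the model can be made concrete: generation is sampling from a probability distribution, and nothing in that distribution flags which continuation is factually true. The distribution below is a made-up miniature, not real model output.

```python
import random

random.seed(0)  # deterministic for the sake of the example

# Hypothetical next-token probabilities after the prompt "Tigers can..."
# (invented numbers). Note there is no field marking truth vs. falsehood;
# probability mass is the only signal the model has.
next_token_probs = {
    "run":  0.40,   # true
    "swim": 0.35,   # true
    "fly":  0.25,   # false, yet still carries real probability mass
}

def sample(dist):
    """Draw one token by inverse-CDF sampling; no reality check occurs."""
    r = random.random()
    cumulative = 0.0
    for token, prob in dist.items():
        cumulative += prob
        if r < cumulative:
            return token
    return token  # guard against float rounding at the boundary

print(sample(next_token_probs))
```

A "waking state" would amount to an extra step that checks the sampled token against the world before committing to it; no such step exists in the loop above, which is the point of the passage.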
Field Notes & Ephemera
The Lucid Dreamer: The goal of Sentientification is not to "fix" the AI so it stops dreaming (which would destroy its creativity), but to pair it with a human partner who stays awake.