Confabulation

The production of confidently stated but erroneous or false content by generative AI (GAI) systems, by which users may be misled or deceived. Includes generated outputs that diverge from the prompt, contradict previous statements in the same context, or present fabricated information as factual.

The term is deliberately chosen over “hallucination” or “fabrication” because those terms anthropomorphize AI systems, inappropriately attributing human cognitive states to statistical prediction engines. Confabulation is a natural consequence of how generative models work: they produce outputs that approximate the statistical distribution of their training data, a process with no built-in notion of factual truth, which can yield fluent but false content.
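A minimal toy sketch of the mechanism, assuming nothing about any real model: the prompt, tokens, and probabilities below are invented for illustration. The point is that a sampler draws tokens in proportion to statistical plausibility, and the same step produces correct and incorrect continuations with identical fluency:

```python
import random

# Toy next-token distribution, standing in for what a trained model
# might assign after the prompt "The capital of Australia is".
# Tokens and probabilities are illustrative only, not measured from
# any real system.
NEXT_TOKEN_PROBS = {
    "Canberra": 0.55,   # correct
    "Sydney": 0.35,     # plausible but false (largest city, not capital)
    "Melbourne": 0.10,  # plausible but false
}

def sample_next_token(probs: dict[str, float]) -> str:
    """Sample one token in proportion to its probability.

    Note what is absent: no lookup against ground truth, no concept of
    'fact' at all. A false but statistically plausible token is emitted
    by exactly the same mechanism as the correct one.
    """
    tokens = list(probs)
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

if __name__ == "__main__":
    for _ in range(5):
        token = sample_next_token(NEXT_TOKEN_PROBS)
        print(f"The capital of Australia is {token}.")
```

Because true and false continuations emerge from the same sampling step, nothing in the output marks one as more trustworthy than the other; see the related note on uniform confidence below.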

Related: 05-atom—uniform-confidence-problem