Artificial Intelligence

Hallucination Rate

The frequency at which an AI model generates incorrect or fabricated information, typically measured as the percentage of responses that contain at least one hallucination.

Why It Matters

Hallucination rate is a key metric for evaluating LLM trustworthiness. Reducing it from 20% to 2% can make the difference between a useful system and a dangerous one.

Example

Testing an LLM on 1,000 factual questions and finding that 35 responses contained fabricated information — a 3.5% hallucination rate.
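The calculation above can be sketched in a few lines. This is a minimal illustration, not a standard library function: it assumes each response has already been graded (e.g., by human reviewers) as hallucinated or not, and the `hallucination_rate` helper name is hypothetical.

```python
def hallucination_rate(labels):
    """Percentage of graded responses flagged as hallucinated.

    labels: list of booleans, True if the response contained
    fabricated information (grading is assumed to happen upstream).
    """
    if not labels:
        raise ValueError("need at least one graded response")
    return 100 * sum(labels) / len(labels)

# The example above: 35 hallucinated responses out of 1,000 questions.
graded = [True] * 35 + [False] * 965
print(f"{hallucination_rate(graded):.1f}%")  # 3.5%
```

In practice the hard part is the grading step, not the arithmetic: deciding whether a response "contains fabricated information" usually requires human judges or a verified reference answer set.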

Think of it like...

Like an error rate in manufacturing — a 5% defect rate might be acceptable for toys but catastrophic for medical devices.

Related Terms