— Concepts
Hallucination
When an AI confidently generates information that isn't true.
What is Hallucination?
Hallucination is when an LLM produces output that sounds confident but is not factually accurate: invented citations, wrong numbers, fabricated quotes. Common mitigations include grounding responses in retrieved documents (RAG), requiring citations, lowering the sampling temperature, and instructing the model to express uncertainty when it is unsure. It remains a top open problem in 2026.
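A minimal sketch of these mitigations, assuming the OpenAI Python client (v1+) and a placeholder model name; the retrieval step is stubbed out as a `retrieved_snippets` argument, and the function and prompt wording are illustrative rather than a fixed recipe.

```python
# Sketch: ground the answer in retrieved text, require citations,
# lower the temperature, and allow "I don't know".
# Assumptions: the `openai` package (v1+) is installed, OPENAI_API_KEY is set,
# and the model name below is a placeholder.
from openai import OpenAI

client = OpenAI()

def grounded_answer(question: str, retrieved_snippets: list[str]) -> str:
    # Number each snippet so the model can cite it as [1], [2], ...
    context = "\n".join(f"[{i+1}] {s}" for i, s in enumerate(retrieved_snippets))
    system = (
        "Answer ONLY from the numbered context below. "
        "Cite the snippet number for every claim, e.g. [2]. "
        "If the context does not contain the answer, say \"I don't know\"."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        temperature=0.2,       # low temperature reduces fabricated detail
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```

A low temperature and explicit permission to say "I don't know" do not eliminate hallucination, but they reduce the model's tendency to fill gaps with invented detail.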
— Related
Terms connected to Hallucination
Techniques
RAG (Retrieval-Augmented Generation)
Retrieving relevant passages from your own documents and supplying them to the model so it can answer questions grounded in them (a minimal retrieval sketch follows this list).
Techniques
Prompt Engineering
The craft of designing inputs to LLMs to get reliable, high-quality outputs.
Tools
Perplexity
An AI-powered search engine that cites its sources.
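As a companion to the RAG entry above, here is a dependency-free retrieval sketch: it ranks documents by keyword overlap with the question and returns the top matches that a grounded prompt would be built from. The overlap scoring and the sample documents are illustrative only; production systems usually rank by embedding similarity.

```python
# Naive keyword-overlap retriever: a stand-in for the retrieval step in RAG.
# Real pipelines usually rank by embedding similarity, not word overlap.
def retrieve_snippets(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    q_words = set(question.lower().split())

    def overlap(doc: str) -> int:
        # Count how many question words appear in the document.
        return len(q_words & set(doc.lower().split()))

    # Keep the documents sharing the most words with the question.
    ranked = sorted(documents, key=overlap, reverse=True)
    return ranked[:top_k]

docs = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available Monday to Friday, 9am to 5pm CET.",
    "Shipping to EU countries takes 3 to 5 business days.",
]
print(retrieve_snippets("How long do I have to return an item?", docs))
```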
— Apply this
From definitions to deployed projects.
Knowing what a term means is step one. ONROL's AI Generalist track gets you shipping projects that use it.