Artificial Intelligence

Retrieval-Augmented Generation

A technique that enhances LLM outputs by first retrieving relevant information from external knowledge sources and then using that information as context for generation. RAG combines the power of search with the fluency of language models.
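The retrieve-then-generate flow can be sketched in a few lines. This is a minimal illustration, not a production implementation: retrieval here is plain keyword overlap (real systems typically use vector embeddings and a vector database), and the final prompt would be sent to an LLM rather than printed. All function and variable names are hypothetical.

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase and split text into a set of word tokens."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query; return the top k.
    Stands in for embedding-based similarity search in a real RAG system."""
    q_tokens = tokenize(query)
    ranked = sorted(documents,
                    key=lambda d: len(q_tokens & tokenize(d)),
                    reverse=True)
    return ranked[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Assemble retrieved passages as grounding context for the generator."""
    joined = "\n".join(f"- {c}" for c in context)
    return (f"Answer using only this context:\n{joined}\n\n"
            f"Question: {query}")

# Toy knowledge base (hypothetical content).
docs = [
    "Refunds are processed within 5 business days.",
    "Our office is open Monday to Friday.",
    "To request a refund, email support with your order number.",
]

query = "How do I get a refund?"
prompt = build_prompt(query, retrieve(query, docs))
print(prompt)  # In a real pipeline, this prompt goes to the LLM.
```

The key design point is the separation of concerns: the retriever supplies fresh, domain-specific facts, while the language model is responsible only for fluent synthesis over the supplied context.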

Why It Matters

RAG mitigates the knowledge cutoff problem, reduces hallucinations by grounding answers in retrieved sources, and lets organizations use LLMs with their proprietary data without fine-tuning. It is one of the most widely adopted enterprise AI patterns.

Example

A customer support chatbot that searches your company's knowledge base for relevant articles before generating an answer, ensuring responses are grounded in actual documentation.

Think of it like...

Like a student who is allowed to use their textbook during an exam — they combine their understanding with referenced material to give more accurate, well-supported answers.

Related Terms