Long Context

The ability of AI models to process very large amounts of input text — typically 100K tokens or more — enabling analysis of entire books, codebases, or document collections.

Why It Matters

Long context eliminates the need to chunk and summarize inputs, enabling direct analysis of complete documents and reducing information loss.

Example

Claude processing a 200K-token legal contract in its entirety, cross-referencing clauses on page 3 with definitions on page 150. With a shorter context window, the contract would have to be split into chunks, and such long-range cross-references could be lost.
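A minimal sketch of the fit-or-chunk decision this enables, assuming the rough ~4 characters-per-token heuristic (accurate counts require the model's own tokenizer); `estimate_tokens`, `fits_in_context`, and the 200K default are illustrative names, not a real API:

```python
def estimate_tokens(text: str) -> int:
    """Approximate token count via the common ~4 chars/token rule of thumb."""
    return max(1, len(text) // 4)

def fits_in_context(text: str, context_window: int = 200_000) -> bool:
    """True if the whole document can be sent in a single request,
    with no chunking or summarization step."""
    return estimate_tokens(text) <= context_window

# Stand-in for a long legal contract (~350K characters).
doc = "clause " * 50_000
print(estimate_tokens(doc))   # rough estimate, well under 200K tokens
print(fits_in_context(doc))   # whole contract fits in one request
```

With a long-context model the check passes and the document goes in whole; with a smaller window the same check would signal that lossy chunking is unavoidable.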

Think of it like...

Like having a desk large enough to spread out an entire newspaper versus only seeing one article at a time — more context enables better understanding.
