Machine Learning

Sentence Transformers

A framework for computing dense vector representations (embeddings) for sentences and paragraphs. Built on top of transformer models and optimized for semantic similarity tasks.

Why It Matters

Sentence Transformers are one of the most practical ways to create text embeddings for search, clustering, and RAG. They bridge the gap between raw transformer outputs (one vector per token) and what applications need: a single fixed-size vector per sentence that can be compared with cosine similarity.

Example

Using the all-MiniLM-L6-v2 model to embed 1 million FAQ entries, enabling semantic search that finds the right answer even when users phrase questions differently.
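The retrieval step of such a semantic search can be sketched with plain NumPy. The toy 3-dimensional vectors below stand in for real model output (all-MiniLM-L6-v2 produces 384-dimensional vectors); the FAQ texts and values are illustrative:

```python
import numpy as np

# Toy stand-ins for precomputed FAQ embeddings from model.encode().
faq_embeddings = np.array([
    [0.9, 0.1, 0.0],   # "How do I reset my password?"
    [0.0, 0.8, 0.6],   # "What are your shipping rates?"
])

# A user query phrased differently from the stored FAQ entry.
query = np.array([0.8, 0.2, 0.1])  # "I forgot my login"

def normalize(v):
    # Scale vectors to unit length so a dot product equals cosine similarity.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Rank all FAQ entries by cosine similarity to the query.
scores = normalize(faq_embeddings) @ normalize(query)
best = int(np.argmax(scores))  # index of the closest FAQ entry
```

At the scale of 1 million entries, the same dot-product ranking is typically served by an approximate nearest-neighbor index rather than a full matrix product.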

Think of it like...

Like a universal translator for meaning — it converts any sentence into a standardized numerical fingerprint that captures its meaning.

Related Terms