Artificial Intelligence

BERT

Bidirectional Encoder Representations from Transformers — a language model developed by Google that considers the context on both the left and right of every word simultaneously. BERT excels at understanding language rather than generating it.
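The "both directions" idea can be made concrete by comparing attention masks. A minimal sketch (a toy illustration, not BERT's actual implementation): a left-to-right model lets each token attend only to earlier positions, while BERT's bidirectional encoder lets every token attend to every position.

```python
import numpy as np

# Toy illustration: compare the attention mask of a left-to-right
# (causal) model with BERT's bidirectional mask.
tokens = ["parking", "on", "a", "hill", "with", "no", "curb"]
n = len(tokens)

# Causal mask: token i may attend only to positions 0..i.
causal = np.tril(np.ones((n, n), dtype=int))

# Bidirectional mask (BERT-style): every token attends to every
# position, so "no" is visible when encoding "curb" and vice versa.
bidirectional = np.ones((n, n), dtype=int)

# Position 0 ("parking") under each scheme:
print("causal view of position 0:       ", causal[0])         # sees only itself
print("bidirectional view of position 0:", bidirectional[0])  # sees all 7 tokens
```

A causal model encoding the first word has seen nothing else yet; BERT's encoder sees the whole sentence at once, which is why it is suited to understanding rather than generation.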

Why It Matters

BERT revolutionized search engines and NLP tasks like question answering and sentiment analysis. Google uses BERT to better understand search queries.

Example

Google Search uses BERT to understand that in 'parking on a hill with no curb,' the word 'no' is critical and changes the entire meaning of the query.

Think of it like...

Like reading a mystery novel where you already know the ending — understanding the full context helps you interpret every clue more accurately.

Related Terms