Artificial Intelligence

Sequence-to-Sequence

A model architecture that transforms one sequence into another, where the input and output can have different lengths. It uses an encoder to process the input into an intermediate representation and a decoder to generate the output from that representation.

Why It Matters

Seq2seq models power machine translation, text summarization, question answering, and code generation — any task where you need to convert one form of text to another.

Example

A translation model taking the English sequence 'How are you?' and generating the French sequence 'Comment allez-vous?' — different words, different length.
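The translation example above can be sketched as a toy encoder-decoder in Python. This is not a real neural model: the lookup table below is a hypothetical stand-in for a trained decoder, included purely to show the data flow, namely a variable-length input compressed into one context, then output tokens generated one at a time until an end-of-sequence marker, so the output length need not match the input length.

```python
# Toy sketch of the encoder-decoder (seq2seq) pattern.
# The PHRASE_TABLE is a made-up stand-in for a trained network;
# no learning happens here.

EOS = "<eos>"  # end-of-sequence marker the decoder emits to stop

def encode(tokens):
    """Compress a variable-length input into one fixed 'context'.

    A real encoder would produce a vector or hidden states; here we
    just join the tokens so the table lookup below can match them.
    """
    return " ".join(tokens)

# Hypothetical "trained decoder": maps a context to its output tokens.
PHRASE_TABLE = {
    "How are you ?": ["Comment", "allez-vous", "?", EOS],
}

def decode(context):
    """Generate output tokens one at a time until EOS is produced."""
    output = []
    for token in PHRASE_TABLE.get(context, [EOS]):
        if token == EOS:
            break
        output.append(token)
    return output

source = ["How", "are", "you", "?"]
target = decode(encode(source))
print(target)                     # ['Comment', 'allez-vous', '?']
print(len(source), len(target))   # 4 3 -> input and output lengths differ
```

The key structural point carries over to real systems: the decoder runs its own generation loop and decides when to stop, which is why seq2seq output length is decoupled from input length.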

Think of it like...

Like a simultaneous interpreter at the UN who listens to an entire thought in one language and then reproduces it in another — the input and output do not need to match word-for-word.

Related Terms