Artificial Intelligence

FLOPS

Floating Point Operations Per Second, a measure of computing speed that quantifies how many floating-point arithmetic calculations a processor can perform each second. It is the standard metric for AI hardware performance.

Why It Matters

FLOPS is the standard yardstick for comparing AI hardware capability and estimating training costs. Hardware throughput is quoted in FLOPS (operations per second), while the total compute needed to train a model is quoted in FLOPs (total operation count); dividing one by the other gives a rough training-time estimate, which is how the industry plans compute investments.

Example

An NVIDIA H100 GPU delivers up to about 4 petaFLOPS (4 quadrillion operations per second) for AI workloads at low (8-bit) precision with sparsity; throughput at higher precisions is substantially lower.
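The training-cost estimates mentioned above can be sketched with a back-of-envelope calculation. The snippet below is illustrative, not authoritative: it assumes the common rule of thumb that training takes roughly 6 FLOPs per parameter per token, and the model size, token count, utilization, and GPU count are all hypothetical numbers chosen for the example.

```python
SECONDS_PER_DAY = 86_400


def training_days(params, tokens, flops_per_sec, utilization=0.4, n_gpus=1):
    """Estimate wall-clock days to train a model.

    Uses the common ~6 FLOPs per parameter per token approximation
    (forward + backward pass). `utilization` reflects that sustained
    throughput is well below a chip's peak FLOPS rating.
    """
    total_flops = 6 * params * tokens          # total training compute (FLOPs)
    effective_rate = flops_per_sec * utilization * n_gpus  # sustained FLOPS
    return total_flops / effective_rate / SECONDS_PER_DAY


# Hypothetical run: a 7B-parameter model trained on 2T tokens across
# 256 GPUs, each sustaining 1 petaFLOPS (far below the ~4 PFLOPS peak).
days = training_days(7e9, 2e12, 1e15, utilization=0.4, n_gpus=256)
print(f"~{days:.1f} days")
```

The key design point is separating peak FLOPS (the marketing number) from sustained, utilization-adjusted throughput, which is what actually determines training time.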

Think of it like...

Like horsepower for computers — it measures raw computational muscle, telling you how much mathematical work a chip can do in a given time.

Related Terms