Machine Learning

Activation Function

A mathematical function applied to each neuron's weighted sum of inputs that introduces non-linearity into the network. Without activation functions, a neural network, no matter how many layers it has, would collapse into a single linear transformation.
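The collapse into a single linear map can be verified numerically. A minimal sketch with NumPy (the layer shapes and random weights here are illustrative assumptions, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation in between: just matrix multiplications.
W1 = rng.standard_normal((4, 3))  # first linear layer: 3 inputs -> 4 units
W2 = rng.standard_normal((2, 4))  # second linear layer: 4 units -> 2 outputs
x = rng.standard_normal(3)        # an arbitrary input vector

two_layers = W2 @ (W1 @ x)        # pass x through both layers in sequence

# The same result comes from one combined linear layer W = W2 @ W1,
# so stacking linear layers adds no expressive power.
one_layer = (W2 @ W1) @ x

assert np.allclose(two_layers, one_layer)
```

Inserting a non-linear activation between `W1` and `W2` is exactly what breaks this collapse and lets depth add expressive power.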

Why It Matters

Activation functions determine how neurons fire and enable neural networks to learn complex, non-linear patterns that exist in real-world data.

Example

ReLU (Rectified Linear Unit) passes positive inputs through unchanged and outputs zero otherwise, i.e. f(x) = max(0, x); it is simple and cheap to compute, yet highly effective for training deep networks.
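ReLU is short enough to sketch in a few lines. This is an illustrative NumPy version, not a specific library's implementation:

```python
import numpy as np

def relu(x):
    """ReLU activation: keep positive values, clip negatives to zero."""
    return np.maximum(0, x)

# Negative entries are zeroed; positive entries pass through unchanged.
pre_activations = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
outputs = relu(pre_activations)
```

In a real network this is applied elementwise to each layer's pre-activation vector before the result is fed to the next layer.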

Think of it like...

Like a light dimmer switch that decides how much signal to pass through — it controls whether and how strongly a neuron activates.

Related Terms