Backpropagation
The primary algorithm used to train neural networks. Working backward from the output layer, it applies the chain rule to calculate how much each weight in the network contributed to the error; an optimizer such as gradient descent then uses these gradients to adjust the weights and reduce future errors.
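The backward pass described above can be sketched in plain Python. This is a minimal illustration, not a production implementation: a tiny two-layer network (one input, one hidden neuron, one output) with made-up weights and a made-up training example, showing the forward pass, the chain-rule backward pass, and one weight update.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, target = 0.5, 1.0   # hypothetical training example
w1, w2 = 0.4, -0.3     # hypothetical initial weights
lr = 0.1               # learning rate

# Forward pass: compute the prediction and the error
h = sigmoid(w1 * x)              # hidden activation
y = sigmoid(w2 * h)              # output
loss = 0.5 * (y - target) ** 2   # squared error

# Backward pass: chain rule, starting at the output layer
dL_dy = y - target
dy_dz2 = y * (1 - y)             # derivative of sigmoid at the output
dL_dw2 = dL_dy * dy_dz2 * h      # how much w2 contributed to the error
dL_dh = dL_dy * dy_dz2 * w2      # error propagated back to the hidden layer
dh_dz1 = h * (1 - h)
dL_dw1 = dL_dh * dh_dz1 * x      # how much w1 contributed to the error

# Gradient-descent update: nudge each weight against its gradient
w2 -= lr * dL_dw2
w1 -= lr * dL_dw1
```

After the update, running the forward pass again with the new weights yields a slightly smaller loss on this example.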
Why It Matters
Backpropagation is what makes training deep neural networks computationally practical. Without an efficient way to compute every weight's gradient in a single backward pass, deep learning as we know it would not exist.
Example
After a neural network incorrectly classifies a photo of a cat as a dog, backpropagation traces back through the network to adjust the weights that led to the wrong answer.
Think of it like...
Like a teacher grading a group project and tracing back to figure out which team member's mistake led to the wrong answer, then giving each person specific feedback.
Related Terms
Gradient Descent
An optimization algorithm that minimizes a model's error (loss) by iteratively adjusting parameters in the direction that reduces the loss most quickly: the negative of the gradient. In neural networks, backpropagation supplies the gradients that gradient descent follows, and gradient descent and its variants are the primary method for training machine learning models.
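A minimal sketch of the idea, using a simple made-up function rather than a real model: gradient descent on f(w) = (w - 3)^2, whose gradient is 2(w - 3) and whose minimum sits at w = 3.

```python
def grad(w):
    # Gradient of f(w) = (w - 3)**2
    return 2 * (w - 3)

w = 0.0    # hypothetical starting point
lr = 0.1   # learning rate

for _ in range(100):
    w -= lr * grad(w)   # step opposite the gradient

# w has converged very close to the minimum at 3
```

Each step moves `w` a fraction of the way toward the minimum; the learning rate controls how large that fraction is.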
Loss Function
A mathematical function that measures how far a model's predictions are from the actual correct values. The goal of training is to minimize this loss function, making predictions as accurate as possible.
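One common concrete choice, shown here as an illustrative sketch, is mean squared error: the average of the squared differences between predictions and targets.

```python
def mse(predictions, targets):
    # Mean squared error: average squared gap between prediction and truth
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

perfect = mse([1.0, 2.0], [1.0, 2.0])   # identical predictions -> zero loss
poor = mse([0.0, 0.0], [1.0, 2.0])      # larger gaps -> larger loss
```

Perfect predictions give a loss of zero; the worse the predictions, the larger the loss, which is exactly the quantity training drives down.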
Neural Network
A computing system inspired by the biological neural networks in the human brain. It consists of interconnected nodes (neurons) organized in layers that process information and learn to recognize patterns.
Weight
A numerical parameter in a neural network that is learned during training. Weights determine the strength of connections between neurons and collectively encode the model's knowledge.
Learning Rate
A hyperparameter that controls how much the model's weights are adjusted in response to errors during each training step. It determines the size of the steps taken during gradient descent optimization.
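The update rule it scales can be written in one line. The gradient value below is made up purely to show how the learning rate changes the size of the step.

```python
def update(weight, gradient, learning_rate):
    # Standard gradient-descent step: move against the gradient,
    # scaled by the learning rate
    return weight - learning_rate * gradient

w = 1.0
g = 4.0                        # hypothetical gradient for this weight
small = update(w, g, 0.01)     # cautious step: small adjustment
large = update(w, g, 0.5)      # aggressive step: w overshoots well past zero
```

Too small a learning rate makes training slow; too large a rate can overshoot the minimum and make training unstable.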