Decision Tree
A supervised learning algorithm that makes predictions by learning a series of if-then-else decision rules from the data. It creates a tree-like structure where each internal node tests a feature and each leaf provides a prediction.
Why It Matters
Decision trees are one of the most interpretable ML models — you can visually trace exactly why a prediction was made, which is critical for regulated industries.
Example
A decision tree for loan approval: If income > $50K AND credit score > 700 AND debt-to-income < 0.3, then approve. Each branch represents a different decision path.
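The loan-approval path above can be sketched as nested if-then-else checks. This is a minimal illustration, not a trained model; the function name `approve_loan` is hypothetical, and the thresholds are the ones from the example:

```python
def approve_loan(income, credit_score, debt_to_income):
    """Trace the decision path from the loan-approval example (hypothetical rules)."""
    if income > 50_000:                  # internal node: income test
        if credit_score > 700:           # internal node: credit-score test
            if debt_to_income < 0.3:     # internal node: debt-to-income test
                return "approve"         # leaf: all tests passed
    return "deny"                        # leaf: some test failed

print(approve_loan(income=60_000, credit_score=720, debt_to_income=0.2))  # approve
print(approve_loan(income=40_000, credit_score=720, debt_to_income=0.2))  # deny
```

Each `if` is an internal node testing one feature; each `return` is a leaf, which is why the prediction can be read off as a plain decision path.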
Think of it like...
Like a flowchart you might use to troubleshoot a problem — 'Is the device plugged in? Yes → Is the power light on? No → Check the fuse' — each question narrows down the answer.
Related Terms
Random Forest
An ensemble learning method that builds multiple decision trees during training and outputs the majority vote (classification) or average prediction (regression) of all the trees. The 'forest' of diverse trees is more robust than any single tree.
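The majority-vote step can be sketched in a few lines. The three lambda "trees" below are hypothetical stand-ins for real trees grown on different bootstrap samples, and `forest_predict` is an assumed name; the point is only how the votes are combined:

```python
from collections import Counter

def forest_predict(trees, x):
    """Each tree votes; the forest returns the most common class label."""
    votes = [tree(x) for tree in trees]
    return Counter(votes).most_common(1)[0][0]

# Toy single-rule 'trees' (hypothetical; real trees would be learned from data):
trees = [
    lambda a: "approve" if a["income"] > 50_000 else "deny",
    lambda a: "approve" if a["credit"] > 700 else "deny",
    lambda a: "approve" if a["dti"] < 0.3 else "deny",
]

applicant = {"income": 60_000, "credit": 650, "dti": 0.2}
print(forest_predict(trees, applicant))  # 2 of 3 trees vote "approve"
```

For regression, the same structure would average the trees' numeric outputs instead of counting votes.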
Gradient Boosting
An ensemble technique that builds models sequentially, where each new model focuses on correcting the errors made by previous models. It combines many weak learners into a single strong learner.
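The "each model corrects the previous one" idea can be shown with a toy regression booster, assuming squared error, where each stage fits a one-split decision stump to the current residuals. The names `boost` and `fit_stump` are hypothetical, and this is a sketch of the sequential idea, not a production gradient-boosting implementation:

```python
def boost(xs, ys, n_stages=3, lr=1.0):
    """Toy gradient boosting for 1-D regression with squared error."""
    def fit_stump(xs, resid):
        # Weak learner: one threshold split predicting the mean residual per side.
        best = None
        for t in sorted(set(xs)):
            left = [r for x, r in zip(xs, resid) if x <= t]
            right = [r for x, r in zip(xs, resid) if x > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = sum((r - (lm if x <= t else rm)) ** 2 for x, r in zip(xs, resid))
            if best is None or err < best[0]:
                best = (err, t, lm, rm)
        _, t, lm, rm = best
        return lambda x: lm if x <= t else rm

    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(n_stages):
        resid = [y - p for y, p in zip(ys, pred)]   # errors of the current ensemble
        stump = fit_stump(xs, resid)                # new weak learner targets them
        stumps.append(stump)
        pred = [p + lr * stump(x) for x, p in zip(xs, pred)]
    return lambda x: sum(lr * s(x) for s in stumps)

model = boost([1, 2, 3, 4], [1.0, 1.0, 3.0, 3.0])
print(model(1), model(4))  # 1.0 3.0
```

Each stump alone is a weak learner; summed across stages, they form the strong learner the definition describes.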
Ensemble Learning
A strategy that combines multiple models to produce better predictions than any single model alone. Ensemble methods leverage the diversity of different models to reduce errors.
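For regression, the simplest combination rule is averaging. In the hypothetical example below, three toy models each carry a different bias, and the average prediction lands closer to the truth than most of the individual ones, which is the error-reduction effect the definition refers to:

```python
def ensemble_average(models, x):
    """Average the predictions of several diverse regressors."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

# Toy stand-in models with different individual errors (+1.0, -1.0, +0.3):
models = [lambda x: x + 1.0, lambda x: x - 1.0, lambda x: x + 0.3]
print(ensemble_average(models, 10.0))  # 10.1 -- errors partly cancel
```

The cancellation only works when the models' errors are diverse; identical models average to the same mistake.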
Pruning
In the decision tree context, a technique that removes branches which contribute little predictive power, reducing overfitting and simplifying the tree. The same term also describes model compression for neural networks, where unnecessary or redundant weights, neurons, or layers are removed from a trained model. Like pruning a plant, it cuts away parts that are not contributing to overall health.