A decision stump is a decision tree with a single internal node (and consequently two leaves). It is a one-level decision tree that makes a prediction from a single feature: “if feature f ≤ θ then predict class A, else class B”. Decision stumps are weak learners because their expressiveness is very limited. However, they are widely used in ensemble methods such as AdaBoost: each stump focuses on one aspect of the data, and many stumps together (each perhaps using a different feature or threshold) can form a strong classifier. They also serve as simple baselines for binary classification tasks. In regression, a decision stump predicts one constant value for one branch and another constant for the other branch.
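As a sketch of the idea, a classification stump can be fit by brute force: try every (feature, threshold) pair, let each branch predict its majority class, and keep the split with the fewest misclassifications. The helper names below (`fit_stump`, `predict_stump`) and the toy data are illustrative, not from any particular library.

```python
import numpy as np

def fit_stump(X, y):
    """Brute-force search for the single (feature, threshold) split
    that minimizes misclassification on binary labels y in {0, 1}."""
    best = None  # (errors, feature, threshold, left_label, right_label)
    n_samples, n_features = X.shape
    for f in range(n_features):
        for theta in np.unique(X[:, f]):
            left = y[X[:, f] <= theta]
            right = y[X[:, f] > theta]
            # Each branch predicts its majority class.
            left_label = int(round(left.mean())) if left.size else 0
            right_label = int(round(right.mean())) if right.size else 0
            errors = np.sum(left != left_label) + np.sum(right != right_label)
            if best is None or errors < best[0]:
                best = (errors, f, theta, left_label, right_label)
    return best[1:]  # drop the error count

def predict_stump(stump, X):
    f, theta, left_label, right_label = stump
    return np.where(X[:, f] <= theta, left_label, right_label)

# Toy data: the class is separable by a threshold on either feature.
X = np.array([[1.0, 0.2], [2.0, 0.4], [3.0, 0.8], [4.0, 0.9]])
y = np.array([0, 0, 1, 1])
stump = fit_stump(X, y)
print(predict_stump(stump, X))  # → [0 0 1 1]
```

An AdaBoost implementation would extend this search with per-sample weights, picking the split that minimizes the weighted error at each boosting round.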