AdaBoost (adaptive boosting) is an ensemble learning algorithm that can be used for classification or regression. Although AdaBoost is more resistant to overfitting than many machine learning algorithms, it is often sensitive to noisy data and outliers.
AdaBoost is called adaptive because it uses multiple iterations to generate a single composite strong learner. AdaBoost creates the strong learner (a classifier that is well-correlated with the true classification) by iteratively adding weak learners (classifiers that are only slightly better than random guessing). During each round of training, a new weak learner is added to the ensemble, and a weighting vector over the training examples is adjusted to focus on examples that were misclassified in previous rounds. The result is a classifier with higher accuracy than any of the weak learners alone.
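The reweighting loop can be sketched in a few lines of MATLAB. The following toy implementation of the AdaBoost.M1 update uses decision stumps as weak learners on binary labels in {-1, +1}; the generated data, the stump search, and all variable names are illustrative assumptions, not toolbox code.

```matlab
% Toy AdaBoost.M1 sketch with decision stumps (illustrative only).
rng(0);
N = 200;                          % number of training examples
X = randn(N, 2);                  % two numeric features
y = sign(X(:,1) + X(:,2));        % toy binary labels in {-1,+1}
y(y == 0) = 1;

T = 25;                           % number of boosting rounds
w = ones(N, 1) / N;               % uniform initial example weights
alpha = zeros(T, 1);              % vote of each weak learner
stumps = zeros(T, 3);             % [feature, threshold, polarity]

for t = 1:T
    % Fit the best decision stump under the current example weights.
    bestErr = inf;
    for f = 1:size(X, 2)
        for thr = unique(X(:, f))'
            for pol = [-1, 1]
                pred = pol * sign(X(:, f) - thr);
                pred(pred == 0) = pol;
                err = sum(w .* (pred ~= y));   % weighted training error
                if err < bestErr
                    bestErr = err;
                    stumps(t, :) = [f, thr, pol];
                end
            end
        end
    end
    % Learner weight: low-error learners get a larger vote.
    bestErr = max(bestErr, eps);
    alpha(t) = 0.5 * log((1 - bestErr) / bestErr);
    % Adaptive step: misclassified examples gain weight for the next round.
    f = stumps(t, 1); thr = stumps(t, 2); pol = stumps(t, 3);
    pred = pol * sign(X(:, f) - thr);
    pred(pred == 0) = pol;
    w = w .* exp(-alpha(t) * y .* pred);
    w = w / sum(w);               % renormalize to a distribution
end

% Final strong learner: sign of the weighted vote of all stumps.
H = zeros(N, 1);
for t = 1:T
    pred = stumps(t,3) * sign(X(:, stumps(t,1)) - stumps(t,2));
    pred(pred == 0) = stumps(t,3);
    H = H + alpha(t) * pred;
end
fprintf('Training accuracy: %.2f\n', mean(sign(H) == y));
```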
Adaptive boosting includes the following algorithms (a fitting example follows the list):
- AdaBoost.M1 and AdaBoost.M2 – original algorithms for binary and multiclass classification
- LogitBoost – binary classification (for poorly separable classes)
- Gentle AdaBoost or GentleBoost – binary classification (for use with multilevel categorical predictors)
- RobustBoost – binary classification (robust against label noise)
- LSBoost – least-squares boosting (for regression ensembles)
- LPBoost – multiclass classification using linear programming boosting
- RUSBoost – multiclass classification for skewed or imbalanced data
- TotalBoost – multiclass classification more robust than LPBoost
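In MATLAB, these variants are selected through the Method name-value argument of fitcensemble (classification) and fitrensemble (regression). A brief sketch, assuming Statistics and Machine Learning Toolbox is installed; the data sets and parameter values are illustrative choices, not recommendations:

```matlab
% Classification: boost 100 decision trees with AdaBoost.M1.
load ionosphere                        % toolbox sample data: X (features), Y (labels)
Mdl = fitcensemble(X, Y, 'Method', 'AdaBoostM1', ...
    'NumLearningCycles', 100, 'Learners', 'tree');
cvMdl = crossval(Mdl, 'KFold', 5);     % 5-fold cross-validated copy
fprintf('Cross-validated loss: %.3f\n', kfoldLoss(cvMdl));

% Regression: least-squares boosting via fitrensemble.
load carsmall                          % toolbox sample regression data
Rdl = fitrensemble([Horsepower, Weight], MPG, ...
    'Method', 'LSBoost', 'NumLearningCycles', 100);
fprintf('Resubstitution MSE: %.2f\n', resubLoss(Rdl));
```

The other methods in the list map onto the same interface, for example 'Method','RUSBoost' when the classes are imbalanced.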
For more information on adaptive boosting, see Statistics and Machine Learning Toolbox™.
See also: Machine Learning, Support Vector Machine