# Bagging

**Bagging** (from *bootstrap aggregating*) is a method for combining the predictions of several regression or classification models; it was developed by Leo Breiman. In the simplest case the results of the individual models are averaged, i.e. each model's prediction enters the combined prediction with the same weight.

Ideally, one draws $m$ samples of size $n$ from the population and fits a predictive model $\hat{f}_j$ to each of them ($j = 1, \dots, m$). For a value $x$ this yields $m$ predicted values $\hat{y}_j = \hat{f}_j(x)$. If the prediction is a class label, the most frequently predicted class can be taken as the combined prediction. In the case of regression, the combined prediction results as

$$\hat{y} = \frac{1}{m} \sum_{j=1}^{m} \hat{f}_j(x),$$

or generally with weights $w_j$

$$\hat{y} = \sum_{j=1}^{m} w_j \hat{f}_j(x), \qquad \sum_{j=1}^{m} w_j = 1.$$
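The unweighted average can be sketched in a few lines of Python. The bootstrap sampling, the least-squares line fit, and the ensemble size `m=25` are illustrative assumptions, not part of the original formulation:

```python
import random

def bootstrap_sample(data):
    """Draw a sample of the same size as `data`, with replacement."""
    return [random.choice(data) for _ in data]

def fit_line(sample):
    """Least-squares line through (x, y) pairs; returns a predictor."""
    n = len(sample)
    mx = sum(x for x, _ in sample) / n
    my = sum(y for _, y in sample) / n
    sxx = sum((x - mx) ** 2 for x, _ in sample)
    sxy = sum((x - mx) * (y - my) for x, y in sample)
    b = sxy / sxx if sxx else 0.0
    a = my - b * mx
    return lambda x: a + b * x

def bagging_predict(data, x, m=25):
    """Fit m models on bootstrap samples and average their predictions."""
    models = [fit_line(bootstrap_sample(data)) for _ in range(m)]
    return sum(f(x) for f in models) / m
```

Each bootstrap sample leaves out roughly a third of the data points, so the $m$ fitted models differ, and their average is the bagged prediction.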

The weights, in both the classification and the regression case, could for example depend on the quality of the model predictions, i.e. "good" models receive more weight than "bad" ones.
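One possible choice of such quality-dependent weights, purely as an illustration: weight each model inversely to an error estimate (e.g. a validation error) and normalize so the weights sum to one. The error values here are hypothetical:

```python
def quality_weights(errors):
    """Turn (hypothetical) validation errors into weights w_j >= 0
    with sum(w_j) == 1: smaller error -> larger weight."""
    raw = [1.0 / e for e in errors]
    total = sum(raw)
    return [r / total for r in raw]

def weighted_prediction(predictions, errors):
    """Weighted combination: sum of w_j * f_j(x)."""
    return sum(w * p for w, p in zip(quality_weights(errors), predictions))
```

With equal errors this reduces to the plain average; a model with one third of another model's error receives three times its weight.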

In the case of *unstable* models, i.e. models whose structure changes significantly with the sample data (see e.g. classification and regression trees), bagging usually leads to significantly improved predictions.
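This effect can be made visible with a deliberately unstable model. The following sketch uses 1-nearest-neighbour regression on simulated noisy data (both illustrative assumptions) and compares the variance of a single model's prediction with that of a bagged ensemble over repeated data sets:

```python
import random
import statistics

def nn_predict(sample, x):
    """1-nearest-neighbour regression: an intentionally unstable model."""
    return min(sample, key=lambda p: abs(p[0] - x))[1]

def bagged_nn(sample, x, m=25):
    """Average m nearest-neighbour models fit on bootstrap samples."""
    preds = []
    for _ in range(m):
        boot = [random.choice(sample) for _ in sample]
        preds.append(nn_predict(boot, x))
    return sum(preds) / m

def prediction_variances(reps=200, x=0.95):
    """Variance of single vs. bagged predictions over fresh noisy data."""
    single, bagged = [], []
    for _ in range(reps):
        data = [(i / 10, i / 10 + random.gauss(0, 0.5)) for i in range(20)]
        single.append(nn_predict(data, x))
        bagged.append(bagged_nn(data, x))
    return statistics.variance(single), statistics.variance(bagged)
```

Because each bootstrap sample may miss the query point's nearest neighbour, the bagged ensemble averages over several nearby points, which smooths the noise and typically lowers the prediction variance markedly.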

## References

- Leo Breiman: *Bagging predictors*. In: *Machine Learning*, Vol. 24, No. 2, 1996, pp. 123–140. doi:10.1007/BF00058655.

## Literature

- Ian H. Witten, Eibe Frank, Mark A. Hall: *Data Mining: Practical Machine Learning Tools and Techniques*. Third Edition, Morgan Kaufmann, 2011.