Answer: Ensemble methods are based on the idea of combining predictions from many so-called base models. They can be seen as meta-algorithms, in the sense that they are methods composed of other methods. Bagging and boosting are two key examples of the ensemble approach, and the Random Forest algorithm is one of its most widely used applications.
- Bagging – The idea behind bagging is to train multiple models of the same type in parallel, each on a different bootstrap resample of the training data. By averaging the predictions of the resulting ensemble of models, it is possible to reduce variance compared to using a single model. A key example is the Random Forest algorithm: random forests use classification or regression trees as base models, and each tree is additionally perturbed at random (each split considers only a random subset of the features), which enables further variance reduction. A minimal sketch appears after this list.
- Boosting – Boosting differs from bagging and random forests in that its base models are learned sequentially, one after the other: each new model tries to correct the mistakes made by the models before it. By taking a weighted average of the base models' predictions, boosting turns an ensemble of "weak" models into a single "strong" model (see the second sketch below).
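
To make the bagging idea concrete, here is a minimal sketch using scikit-learn's BaggingClassifier and RandomForestClassifier. The synthetic dataset and hyperparameter values are illustrative assumptions, not part of the original answer.

```python
# Sketch of bagging vs. random forest; dataset and settings are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic classification data (an assumption for demonstration only).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Bagging: the default base model (a decision tree) is trained on 100
# bootstrap resamples of the data; predictions are combined by voting.
bagging = BaggingClassifier(n_estimators=100, random_state=0)

# Random forest: bagged trees with the extra random perturbation described
# above, where each split considers only a random subset of the features.
forest = RandomForestClassifier(n_estimators=100, random_state=0)

for name, model in [("bagging", bagging), ("random forest", forest)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

The extra feature subsampling in the random forest decorrelates the trees, which is what allows averaging to cut variance further than plain bagging.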
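For boosting, here is a comparable sketch using scikit-learn's AdaBoostClassifier, whose weighted vote over sequentially fitted weak learners matches the description above; again, the data and parameter choices are illustrative assumptions.

```python
# Sketch of boosting with AdaBoost; dataset and settings are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# AdaBoost fits shallow "weak" trees one after another; each new tree
# upweights the examples the previous trees misclassified, and the final
# prediction is a weighted vote over all trees.
boosting = AdaBoostClassifier(n_estimators=100, random_state=0)

scores = cross_val_score(boosting, X, y, cv=5)
print(f"AdaBoost: mean accuracy = {scores.mean():.3f}")
```

Note the contrast with bagging: the base models here cannot be trained in parallel, because each one depends on the errors of its predecessors.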