Answer» Detecting overfitting at an early stage is very useful when training a model. There are several methods up our sleeves that can be used to avoid overfitting:
- Cross-validation: Cross-validation is a resampling technique for evaluating machine learning models when the available data is limited; the model is repeatedly trained and validated on different partitions of the data.
- Remove features: We can remove irrelevant or redundant features to reduce model complexity, making the model less likely to fit noise in the training data.
- Early stopping: Early stopping is a form of regularization used in machine learning to reduce overfitting when training a learner with an iterative method such as gradient descent. An early stopping criterion specifies how many iterations may run (for example, how many epochs without improvement on a validation set) before training halts.
- Training with more data: Training the model on more data helps it learn the underlying signal rather than noise or outliers.
- Regularization: In machine learning, regularization is a method for mitigating overfitting by adding a penalty term to the cost function, discouraging overly complex models.
- Ensembling: Ensemble learning combines the predictions of two or more models, which typically reduces variance and therefore overfitting.
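
The cross-validation idea above can be sketched in plain Python. This is a minimal illustration, not any particular library's API; the function name `k_fold_splits` is made up for this example:

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, val_indices) pairs for k roughly equal folds."""
    indices = list(range(n_samples))
    fold_size = n_samples // k
    for i in range(k):
        start = i * fold_size
        # The last fold absorbs any remainder samples
        end = (i + 1) * fold_size if i < k - 1 else n_samples
        val = indices[start:end]
        train = indices[:start] + indices[end:]
        yield train, val

# Each sample serves as validation data exactly once across the k folds
for train_idx, val_idx in k_fold_splits(10, 5):
    print(len(train_idx), len(val_idx))  # prints "8 2" five times
```

In practice you would train the model once per fold and average the k validation scores to get a more reliable estimate than a single train/validation split.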
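
Early stopping can likewise be sketched with stdlib Python only. This is a toy example on made-up synthetic data (a line `y = 2x + 1` plus noise): gradient descent fits a linear model on a training split while a held-out validation split decides when to stop. The `patience` counter is one common early-stopping criterion, not the only one:

```python
import random

random.seed(0)

# Synthetic data: y = 2x + 1 + noise (a hypothetical toy problem)
data = [(i / 100, 2 * (i / 100) + 1 + random.gauss(0, 0.1)) for i in range(100)]
random.shuffle(data)
train, val = data[:80], data[80:]

def mse(w, b, pts):
    return sum((w * x + b - y) ** 2 for x, y in pts) / len(pts)

w, b = 0.0, 0.0
lr = 0.1
best_val, best_params = float("inf"), (w, b)
patience, bad = 10, 0  # stop after 10 epochs without validation improvement

for epoch in range(1000):
    # One full-batch gradient-descent step on the training set
    gw = sum(2 * (w * x + b - y) * x for x, y in train) / len(train)
    gb = sum(2 * (w * x + b - y) for x, y in train) / len(train)
    w, b = w - lr * gw, b - lr * gb

    # Early stopping: track validation loss, keep the best parameters seen
    v = mse(w, b, val)
    if v < best_val:
        best_val, best_params, bad = v, (w, b), 0
    else:
        bad += 1
        if bad >= patience:
            break

# Restore the parameters that generalized best, not the final ones
w, b = best_params
```

The key design choice is to keep a copy of the best-so-far parameters and restore them at the end, so the returned model is the one that performed best on unseen data rather than the one that fit the training set longest.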