Improve Accuracy with Ensemble Predictions

Another way that you can improve the performance of your models is to combine the predictions from multiple models. This technique is called ensemble prediction.

Some models provide this capability built in, such as random forest for bagging and stochastic gradient boosting for boosting. Another type of ensembling, called voting, can be used to combine the predictions from multiple different models.
In today’s lesson, you will practice using ensemble methods.

  1. Practice bagging ensembles with the Random Forest and Extra Trees algorithms.
  2. Practice boosting ensembles with the Gradient Boosting Machine and AdaBoost algorithms.
  3. Practice voting ensembles by combining the predictions from multiple models.
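As a starting point for steps 2 and 3, here is a minimal sketch using scikit-learn, assuming it is installed. The dataset here is synthetic (from make_classification) purely for illustration; the specific models, parameters, and random seeds are my own choices, not prescribed by the lesson.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, GradientBoostingClassifier,
                              VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data as a stand-in dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=7)

results = {}

# Boosting: each ensemble builds models sequentially, with later models
# focusing on the examples earlier models got wrong
for name, model in [("GBM", GradientBoostingClassifier(random_state=7)),
                    ("AdaBoost", AdaBoostClassifier(random_state=7))]:
    results[name] = cross_val_score(model, X, y, cv=5).mean()
    print("%s: %.3f" % (name, results[name]))

# Voting: combine predictions from different model types by majority vote
voting = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("cart", DecisionTreeClassifier(random_state=7)),
])
results["Voting"] = cross_val_score(voting, X, y, cv=5).mean()
print("Voting: %.3f" % results["Voting"])
```

Swapping in your own dataset only requires replacing the make_classification call with your loaded X and y arrays.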

The snippet below demonstrates how you can use the Random Forest algorithm (a bagged ensemble of decision trees) on the Pima Indians onset of diabetes dataset.
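A minimal sketch of that idea, assuming scikit-learn is available. Because no file path for the Pima Indians dataset is given here, a synthetic stand-in with the same shape (8 input features, binary outcome) is generated instead; the number of estimators, folds, and seeds are illustrative choices.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, cross_val_score

# Stand-in for the Pima Indians diabetes data (8 features, binary outcome);
# with the real file you might use numpy.loadtxt(..., delimiter=',') instead
X, y = make_classification(n_samples=768, n_features=8, random_state=7)

# Random Forest: a bagged ensemble of decision trees, each trained on a
# bootstrap sample and a random subset of features
model = RandomForestClassifier(n_estimators=100, random_state=7)

# Evaluate with 10-fold cross-validation
kfold = KFold(n_splits=10, shuffle=True, random_state=7)
scores = cross_val_score(model, X, y, cv=kfold)
print("Accuracy: %.3f (%.3f)" % (scores.mean(), scores.std()))
```

The ExtraTreesClassifier from the same module is a drop-in replacement if you want to compare the two bagging approaches from step 1.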

In the next lesson, you will discover how to finalize your model.
