How to Tune the Number and Size of Decision Trees with XGBoost in Python

Last Updated on August 27, 2020

Gradient boosting involves the creation and addition of decision trees sequentially, each attempting to correct the mistakes of the learners that came before it.
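To make the idea concrete, here is a minimal sketch of boosting for squared error using scikit-learn's DecisionTreeRegressor: each new tree is fit to the residual errors of the ensemble built so far. The synthetic data and the learning rate of 0.1 are illustrative choices, not values from this post.

```python
# Minimal sketch of the boosting idea: each tree is fit to the
# residuals (mistakes) of the ensemble built so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(7)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

learning_rate = 0.1  # illustrative value
trees = []
prediction = np.zeros_like(y)
for _ in range(100):
    residual = y - prediction  # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    trees.append(tree)
    prediction += learning_rate * tree.predict(X)
```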

This raises the question of how many trees (weak learners or estimators) to configure in your gradient boosting model and how large each tree should be.
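In XGBoost's scikit-learn wrapper, these two choices map to the n_estimators and max_depth arguments; the values below are placeholders, not recommendations.

```python
# The number of trees and the size (depth) of each tree are the
# n_estimators and max_depth arguments of the scikit-learn wrapper.
from xgboost import XGBClassifier

model = XGBClassifier(n_estimators=100, max_depth=3)
```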

In this post you will discover how to design a systematic experiment to select the number and size of decision trees to use on your problem.

After reading this post you will know:

  • How to evaluate the effect of adding more decision trees to your XGBoost model.
  • How to evaluate the effect of creating larger decision trees in your XGBoost model.
  • How to investigate the relationship between the number and depth of trees on your problem (see the sketch after this list).
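As a preview, a systematic experiment along these lines can be framed as a grid search over both parameters with cross validation. This is only a sketch assuming the xgboost and scikit-learn packages; the synthetic dataset and the specific grid values are stand-ins for your own problem, not the data or settings used in this post.

```python
# Sketch of a systematic experiment: grid search over the number
# and depth of trees, scored with 10-fold cross validation.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from xgboost import XGBClassifier

# Synthetic stand-in for your own dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=7)

param_grid = {
    "n_estimators": [50, 100, 150, 200],  # number of trees
    "max_depth": [2, 4, 6, 8],            # size (depth) of each tree
}
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=7)
grid = GridSearchCV(XGBClassifier(), param_grid,
                    scoring="neg_log_loss", cv=kfold)
result = grid.fit(X, y)
print("Best: %f using %s" % (result.best_score_, result.best_params_))
```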

Kick-start your project with my new book XGBoost With Python, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.

  • Update Jan/2017: Updated to reflect changes in scikit-learn API version 0.18.1.