Feature Importance and Feature Selection With XGBoost in Python
Last Updated on August 27, 2020

A benefit of using ensembles of decision tree methods like gradient boosting is that they can automatically provide estimates of feature importance from a trained predictive model. In this post you will discover how you can estimate the importance of features for a predictive modeling problem using the XGBoost library in Python. After reading this post you will know:

- How feature importance is calculated using the gradient boosting algorithm.
- How to plot feature importance […]
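The post's own code listing is not part of this excerpt, but the core idea is compact. The sketch below is my own illustration rather than the article's exact listing: it fits an `XGBClassifier` on a synthetic dataset built with scikit-learn's `make_classification` (a placeholder assumption for whatever data the full post uses), prints the `feature_importances_` scores computed during training, and draws them with xgboost's `plot_importance` helper.

```python
from matplotlib import pyplot
from sklearn.datasets import make_classification
from xgboost import XGBClassifier, plot_importance

# Placeholder data: a synthetic binary classification problem stands in for
# whatever dataset the full post works through.
X, y = make_classification(n_samples=500, n_features=8, n_informative=4, random_state=7)

# Fit a gradient boosting model; importance scores are accumulated as the
# boosted trees are built.
model = XGBClassifier(n_estimators=100)
model.fit(X, y)

# One score per input feature; higher means the feature was used more
# effectively when constructing the trees.
print(model.feature_importances_)

# plot_importance() shows the same information as a bar chart.
plot_importance(model)
pyplot.show()
```

From there, feature selection typically amounts to choosing a threshold on these scores (for example with scikit-learn's `SelectFromModel`) and retraining on the reduced set of features.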