Complete Guide to Parameter Tuning in Gradient Boosting (GBM) in Python



While searching for steps or techniques to optimize gradient boosting performance, I came across the article mentioned in the link.

The article explains the trade-offs between the various parameters and how to evaluate performance very well, but I feel it nowhere addresses the issue of overfitting. How can we minimize overfitting?


Hi @nirvgandhi,

The article itself mentions that we can adjust min_samples_leaf and n_estimators to prevent overfitting.
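As a minimal sketch of that idea (the dataset, parameter values, and model comparison here are my own illustration, not from the article): raising `min_samples_leaf` constrains each tree so leaves cannot memorize individual points, and lowering `n_estimators` limits how long boosting keeps fitting residual noise. Comparing train vs. test accuracy makes the overfitting gap visible.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic data purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Overfit-prone baseline: many trees, leaves allowed to hold a single sample.
loose = GradientBoostingClassifier(
    n_estimators=500, min_samples_leaf=1, random_state=42
)

# Regularized variant: fewer trees, each leaf must cover at least 20 samples.
regularized = GradientBoostingClassifier(
    n_estimators=100, min_samples_leaf=20, random_state=42
)

for name, model in [("loose", loose), ("regularized", regularized)]:
    model.fit(X_train, y_train)
    gap = model.score(X_train, y_train) - model.score(X_test, y_test)
    print(f"{name}: train-test accuracy gap = {gap:.3f}")
```

In practice you would search over these values with `GridSearchCV` (or monitor a validation set) rather than hand-pick them; the point is simply that both parameters trade training fit for generalization.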