Big change in accuracy in XGBoost classification

machine_learning
xgboost
python

#1

# Why is there such a big change in accuracy when changing learning_rate from 0.1 to 0.7?

import xgboost as xgb

xgb2 = xgb.XGBClassifier(
    learning_rate=0.1,
    n_estimators=1000,
    max_depth=5,
    min_child_weight=12,
    gamma=0.3,
    subsample=0.8,
    colsample_bytree=0.8,
    objective='binary:logistic',
    nthread=4,
    scale_pos_weight=1,
    seed=27)

## Accuracy for model: 75.32 (learning_rate=0.1)
## Accuracy for model: 71.43 (learning_rate=0.7)
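
For reference, something along these lines was presumably used to produce the accuracies above. This is only a minimal sketch: the synthetic make_classification data, the 80/20 split (mentioned in post #3), and the variable names are assumptions, not the actual dataset or evaluation code.

from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Placeholder data; the real dataset is not shown in the thread.
X, y = make_classification(n_samples=5000, n_features=43, random_state=27)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=27)

xgb2.fit(X_train, y_train)
preds = xgb2.predict(X_test)
print("Accuracy for model: %.2f" % (100 * accuracy_score(y_test, preds)))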


#2

Hey,
can you give us some information about your dataset (num feats, num samples, data split, type of variables, etc)?

Normally, larger values of learning_rate make the algorithm learn faster, but there is a chance it overshoots an optimal point. Lower values help it converge to a better minimum, but more slowly, so you need more boosting rounds.
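
As a rough illustration of that trade-off (the synthetic data and parameter values below are assumptions, not taken from your run), you can cross-validate both learning rates and look at where the best round lands:

import xgboost as xgb
from sklearn.datasets import make_classification

# Placeholder data just to illustrate the eta / n_estimators trade-off.
X, y = make_classification(n_samples=2000, n_features=43, random_state=27)
dtrain = xgb.DMatrix(X, label=y)

for eta in (0.1, 0.7):
    params = {"objective": "binary:logistic", "eta": eta,
              "max_depth": 5, "subsample": 0.8, "colsample_bytree": 0.8}
    cv = xgb.cv(params, dtrain, num_boost_round=1000, nfold=5,
                metrics="error", early_stopping_rounds=50, seed=27)
    best_round = cv["test-error-mean"].idxmin()
    print("eta=%.1f  best round=%d  cv error=%.4f"
          % (eta, best_round, cv["test-error-mean"][best_round]))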

In your case, it is hard to estimate where the problem is without more info about your data and training/evaluation process.

Jose


#3

The data split is 80/20. There are about 43 variables, roughly 60% numerical and 40% categorical; the remaining parameters are left at their default values.

The point you mentioned about the learning rate applies mostly when learning_rate changes from 0.1 to 0.01. I think mine is due to some other reason.
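
One way to check whether something else is going on is to fit both configurations with a validation set and early stopping and compare where each one peaks; with 1000 trees and no early stopping, learning_rate=0.7 can easily overfit. The sketch below uses placeholder data and the older xgboost sklearn-API signature (eval_metric and early_stopping_rounds passed to fit), so treat it as an assumption-laden template rather than the exact setup from this thread.

import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Placeholder data; substitute your own 80/20 split.
X, y = make_classification(n_samples=5000, n_features=43, random_state=27)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=27)

for lr in (0.1, 0.7):
    clf = xgb.XGBClassifier(learning_rate=lr, n_estimators=1000, max_depth=5,
                            min_child_weight=12, gamma=0.3, subsample=0.8,
                            colsample_bytree=0.8, objective="binary:logistic",
                            nthread=4, scale_pos_weight=1, seed=27)
    clf.fit(X_train, y_train, eval_set=[(X_val, y_val)],
            eval_metric="error", early_stopping_rounds=50, verbose=False)
    print("learning_rate=%.1f  best iteration=%d  validation error=%.4f"
          % (lr, clf.best_iteration, clf.best_score))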