How to use xgb.importance



This may be a very basic question -

I am using XGBoost on data with 100+ features. When I use xgb.importance, I only get to see the top 5 and bottom 5 features. How can I get the top 50, or better, a list of all features?

Also, how much should one rely on the ranking it provides?

Thanks for your time


Hello @sadashivb ,

You can do the following:

Get the real feature names

names <- dimnames(trainMatrix)[[2]]

Compute the feature importance matrix

importance_matrix <- xgb.importance(names, model = bst)

Plot a nice graph of the top features

xgb.plot.importance(importance_matrix)

Hope this helps,


Thanks @nilesh_borade .

Sharing more details on the same topic. There are also inputs from @binga on how to find feature importance in Python.


Hi @sadashivb,

To add to the discussion: to get the top 50 features, we can subset the importance matrix after computing it, using importance_matrix[1:50, ]

To view all important features for the model, we can use data.frame(importance_matrix$Feature); this will list all of them in decreasing order of gain.