What is the difference between softprob and softmax in XGBoost




While trying to implement XGBoost in R, I came across this code:

param <- list("objective" = "multi:softprob",    # multiclass classification
              "num_class" = 12,    # number of classes
              "eval_metric" = "merror",    # evaluation metric
              "nthread" = 6,    # number of threads to be used
              "max_depth" = 15,    # maximum depth of tree
              "eta" = 0.07,    # step size shrinkage
              "subsample" = 0.8,    # part of data instances to grow tree
              "colsample_bytree" = 0.9)    # subsample ratio of columns when constructing each tree

For multiclass classification, the objective can be set to either multi:softprob or multi:softmax.
Can someone help me understand the difference between the two?


Hi @hackers,

The difference is explained here.

Basically, both multi:softmax and multi:softprob are objectives for multiclass classification; what separates them is the output of predict. With multi:softmax, you get a single value per observation: the class with the maximum predicted probability. With multi:softprob, you get the full probability distribution: a matrix with nrows × num_class entries, containing the predicted probability of each class for each observation.
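To make the difference concrete, here is a minimal sketch in plain Python (not XGBoost itself) of what the two objectives return for one observation, assuming some vector of raw per-class scores from the trees:

```python
import math

def softprob(scores):
    """What multi:softprob outputs: the full probability vector over classes."""
    # Subtract the max score for numerical stability before exponentiating.
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def softmax_class(scores):
    """What multi:softmax outputs: only the index of the most likely class."""
    probs = softprob(scores)
    return probs.index(max(probs))

# Hypothetical raw scores for one observation over 3 classes.
scores = [0.5, 2.0, 1.0]
print(softprob(scores))       # one probability per class, summing to 1
print(softmax_class(scores))  # -> 1 (only the argmax class survives)
```

So multi:softprob keeps the whole distribution (useful if you need probabilities, e.g. for log-loss or custom thresholds), while multi:softmax discards everything except the argmax label.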

Hope this helps!