How are the prediction scores at the leaves of each tree in XGBoost calculated?
Suppose your problem is to predict a person's income from the other features provided. In an XGBoost model, the first prediction is taken to be a constant, such as the mean income. The error (residual) for each row is then the difference between the actual value and this mean.
The next tree uses these errors as its target variable, so its leaf nodes hold predicted error values for each row. These predicted values (which may be positive or negative) are added to the mean, producing a new set of income predictions. These are again compared with the actual values, and new errors are calculated.
The whole process is repeated until the error values stop improving (or a fixed number of trees is reached).
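The loop described above can be sketched with plain regression trees as the weak learners. This is a simplified illustration on hypothetical toy data, not XGBoost itself (which also uses second-order gradients and regularization when scoring leaves):

```python
# Minimal sketch of the residual-fitting loop: start from the mean,
# fit each new tree to the current errors, and add its leaf predictions
# (scaled by a learning rate) to the running prediction.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))                    # e.g. years of experience
y = 30000 + 4000 * X[:, 0] + rng.normal(0, 2000, 200)    # hypothetical income

prediction = np.full_like(y, y.mean())   # step 1: predict the mean income for everyone
learning_rate = 0.1
trees = []

for _ in range(50):
    residuals = y - prediction                      # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=3)
    tree.fit(X, residuals)                          # next tree targets the errors
    prediction += learning_rate * tree.predict(X)   # add predicted errors back in
    trees.append(tree)

print(np.mean((y - prediction) ** 2))  # training error shrinks round after round
```

Each leaf of each fitted tree stores a predicted residual, and the final score for a row is the mean plus the (scaled) sum of the leaf values it falls into across all trees.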
You should also read up on boosting and bagging (ensemble methods).
Bagging trains each model independently and combines their results by voting or averaging; every step is isolated from the others when errors are calculated.
Boosting uses the errors of every previous step to improve the next model, so in the end you have one strong combined model.
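The two styles can be contrasted directly with scikit-learn on hypothetical data: `BaggingRegressor` fits its trees independently on bootstrap samples and averages them, while `GradientBoostingRegressor` fits each tree to the previous ensemble's errors, as described above.

```python
# Bagging: independent trees, results averaged.
# Boosting: sequential trees, each correcting the last one's errors.
import numpy as np
from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 1))
y = 5 * np.sin(X[:, 0]) + rng.normal(0, 0.5, 300)

bagging = BaggingRegressor(DecisionTreeRegressor(max_depth=3),
                           n_estimators=100).fit(X, y)
boosting = GradientBoostingRegressor(n_estimators=100,
                                     max_depth=3).fit(X, y)

print(bagging.predict(X[:3]))
print(boosting.predict(X[:3]))
```

For classification the bagging combiner is a majority vote instead of an average, but the independence-vs-sequential distinction is the same.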