How does bootstrap sampling work during boosting?




While learning about boosting I have noticed a few things:
There are several variants of boosting: AdaBoost, gradient boosting, and stochastic gradient boosting.

While AdaBoost implements boosting by re-weighting the records after each classifier's output, stochastic gradient boosting uses bootstrap samples to implement boosting.

So while AdaBoost uses a single dataset with varied weights each time, stochastic GBM uses a fresh bootstrapped sample each time, as can be seen in the output above.
My doubt is: if bootstrapped samples are used each time, how is the weight update done?
I mean, if I do bootstrap sampling again after round 1, some records whose weights have changed might be left out of the new sample.
Is this the case, or am I missing something here?
Can someone please help me with this?


Hello @pagal_guy

AdaBoost : This method re-weights the training records after every weak learner's output (each weak learner also receives a weight in the final vote). So the record weights are altered after every iteration, and no resampling is needed.
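As a rough sketch of that re-weighting step (a toy illustration with made-up labels and one hand-written prediction vector, not a full AdaBoost implementation):

```python
import numpy as np

# Toy illustration: how AdaBoost re-weights the training records
# after one weak learner's output. Labels and predictions are made up.
y = np.array([1, 1, -1, -1, 1])        # true labels in {-1, +1}
pred = np.array([1, -1, -1, -1, 1])    # weak learner's output (record index 1 is wrong)
w = np.full(len(y), 1 / len(y))        # start with uniform record weights

err = np.sum(w[pred != y])             # weighted error of this learner
alpha = 0.5 * np.log((1 - err) / err)  # this learner's weight in the final vote

w = w * np.exp(-alpha * y * pred)      # misclassified records get up-weighted
w = w / w.sum()                        # renormalize to a distribution

print(w)                               # the misclassified record now dominates
```

The key point for the question above: these weights live on the one fixed dataset, so nothing is ever left out between rounds.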

Gradient Boosting : This method does not update record weights but updates the model itself, by adding a new learner fitted to the current residuals:

F_{m+1}(x) = F_m(x) + h(x), where h is fitted to the residual y - F_m(x) so that F_{m+1}(x) ≈ y
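A minimal sketch of this update, under assumed choices (squared loss, one-split stump learners, a made-up sine dataset — an illustration, not any library's implementation). The subsampling line is what makes it the stochastic variant the question asks about: no record weights are stored, so leaving records out of a round costs nothing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic gradient boosting for squared loss.
x = rng.uniform(-1, 1, size=200)
y = np.sin(3 * x)                # made-up target
F = np.zeros_like(y)             # F_0(x) = 0
nu = 0.5                         # learning rate (shrinkage)

def fit_stump(xs, rs):
    """Fit a one-split regression stump to residuals rs over a threshold grid."""
    best = None
    for t in np.linspace(-0.9, 0.9, 19):
        mask = xs <= t
        if mask.all() or not mask.any():
            continue                       # skip degenerate splits
        left, right = rs[mask].mean(), rs[~mask].mean()
        sse = ((rs - np.where(mask, left, right)) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left, right)
    _, t, left, right = best
    return lambda q: np.where(q <= t, left, right)

for _ in range(50):
    # Stochastic step: each round sees only a random half of the records.
    idx = rng.choice(len(x), size=len(x) // 2, replace=False)
    h = fit_stump(x[idx], (y - F)[idx])    # h fitted to residuals y - F_m(x)
    F = F + nu * h(x)                      # F_{m+1}(x) = F_m(x) + nu * h(x)

mse = ((y - F) ** 2).mean()
print(mse)                                 # much smaller than the initial error
```

Records omitted from a round's subsample simply do not influence that round's h; their residuals are still there for later rounds, which is why no weight bookkeeping is needed.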

Does this help with your query? The above is the fundamental difference in how these two algorithms work. In stochastic gradient boosting there are no record weights to carry between rounds: each round fits h to the residuals of a fresh random subsample, so records left out of a sample simply do not contribute to that round's learner.