What is the difference between Gradient Boosting and AdaBoost?



What are the main differences between AdaBoost and Gradient Boosting?


In simple terms, both are boosting methods that combine weak learners (usually decision trees) into a strong one, but they differ in how each new learner is trained. AdaBoost reweights the training samples, increasing the weight of misclassified points so the next weak learner focuses on them. Gradient Boosting instead fits each new tree to the residual errors (the negative gradient of the loss) of the current ensemble's predictions.
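To make the residual idea concrete, here is a minimal sketch of Gradient Boosting for regression with squared loss. The toy data, tree depth, learning rate, and number of rounds are my own illustrative choices, not defaults from any library:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data (illustrative assumption, not from the thread)
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# Gradient Boosting with squared loss: each new tree is fit to the
# residuals (negative gradient) of the current ensemble's prediction.
n_trees, lr = 50, 0.1
pred = np.full_like(y, y.mean())      # start from a constant prediction
trees = []
for _ in range(n_trees):
    residual = y - pred               # residual = negative gradient of squared loss
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    pred += lr * tree.predict(X)      # shrink the new learner and add it
    trees.append(tree)

mse = np.mean((y - pred) ** 2)
```

Each round reduces the training error because the new tree explains part of what the ensemble so far got wrong; the learning rate `lr` shrinks each contribution to avoid overfitting too quickly.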


Thank you…
In GB, while building the next weak learner, we fit it to the residual errors of the previous ensemble… Am I correct?


Hi @raviteja1993,

Along with what @mahesh51 has mentioned, you can also refer to the article below, which explains both algorithms with examples:
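For contrast with the residual-fitting loop above, here is a minimal sketch of AdaBoost's sample-reweighting step (discrete AdaBoost with decision stumps). The toy data and all parameter choices are illustrative assumptions:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy binary classification data with labels in {-1, +1} (illustrative)
rng = np.random.RandomState(0)
X = rng.uniform(-2, 2, size=(300, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

# AdaBoost: after each round, upweight the misclassified samples so the
# next weak learner concentrates on them.
n_stumps = 30
w = np.full(len(y), 1 / len(y))       # start with uniform sample weights
stumps, alphas = [], []
for _ in range(n_stumps):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = w[pred != y].sum()                          # weighted error
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10)) # learner's vote weight
    w *= np.exp(-alpha * y * pred)                    # mistakes get larger weight
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

# Final prediction: sign of the weighted vote of all stumps
F = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
accuracy = np.mean(np.sign(F) == y)
```

Note the difference from Gradient Boosting: here the data weights change between rounds while the loss is implicit, whereas GB keeps the data fixed and changes the regression target (the residuals) each round.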


@PulkitS Thank you