Gradient Boosting is an ensemble technique that combines weak learners into a stronger model by adding them one at a time, using a procedure similar to gradient descent to minimize the loss function. Each new learner is fit to the negative gradient of the loss with respect to the current ensemble's predictions (for squared-error loss, simply the residuals), so that adding it moves the overall prediction a small step toward lower loss.
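The sketch below illustrates this idea for squared-error regression, where the negative gradient is just the residual. It is a minimal, from-scratch illustration rather than a reference implementation; the function names (`fit_gbm`, `predict_gbm`), the choice of shallow `DecisionTreeRegressor` trees as weak learners, and the `n_rounds`, `learning_rate`, and `max_depth` values are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbm(X, y, n_rounds=100, learning_rate=0.1, max_depth=2):
    """Fit a sequence of shallow trees, each trained on the current residuals."""
    base = y.mean()                      # start from a constant prediction
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_rounds):
        residuals = y - pred             # negative gradient of 0.5 * (y - F(x))^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        pred += learning_rate * tree.predict(X)   # small gradient-descent-like step
        trees.append(tree)
    return base, learning_rate, trees

def predict_gbm(model, X):
    """Sum the base prediction and the scaled contributions of every tree."""
    base, lr, trees = model
    return base + lr * sum(tree.predict(X) for tree in trees)

# Example usage on synthetic data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
model = fit_gbm(X, y)
print(predict_gbm(model, X[:5]))
```

The small `learning_rate` shrinks each tree's contribution, which is what makes the additive updates behave like cautious gradient steps rather than one large correction.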