Improve Your Classification with CART and Gradient Boosting

In this webinar we'll introduce you to two tree-based machine learning algorithms: CART decision trees and gradient boosting. Both methods can be used for either regression or classification (e.g., predicting Y = “Application Denied” vs. “Application Accepted”), and this presentation will focus on classification. Gradient boosting often outperforms linear regression, Random Forests, and CART, and boosted trees automatically handle variable selection, variable interactions, nonlinear relationships, outliers, and missing values.
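As a rough sketch of the comparison described above, the following uses scikit-learn as a stand-in for the tools named in the webinar (an assumption; the webinar itself uses TreeNet® Software): a single CART-style decision tree versus a gradient boosted ensemble on a synthetic binary target.

```python
# Hedged sketch: scikit-learn stand-ins for the methods discussed.
# CART -> DecisionTreeClassifier; gradient boosting -> GradientBoostingClassifier.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic binary target standing in for "Application Denied" / "Application Accepted".
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

cart = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
gbm = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

print(f"CART accuracy:     {cart.score(X_te, y_te):.3f}")
print(f"Boosting accuracy: {gbm.score(X_te, y_te):.3f}")
```

On most datasets of this kind the boosted ensemble will score at least as well as the single tree, which is the pattern the webinar explores in more depth.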
We'll see that CART decision trees are the foundation of gradient boosting and discuss some of the advantages of boosting over a Random Forest. We will walk through the gradient boosting algorithm and its most important modeling parameters: the learning rate, the number of terminal nodes per tree, the number of trees, the loss function, and more. Finally, we will use an implementation of gradient boosting (TreeNet® Software) to fit a model and compare its performance to a linear regression model, a CART tree, and a Random Forest.