Improve Your Regression with CART and Gradient Boosting

On-Demand: Fill out the registration form to receive an on-demand recording of this webinar

Duration: 55 minutes

Speaker: Charles Harrison, Marketing Statistician, Salford Systems

Cost: Free 


Abstract: In this webinar we'll introduce you to gradient boosting, a powerful tree-based machine learning algorithm that often outperforms linear regression, Random Forests, and single CART trees on regression problems. Boosted trees automatically handle variable selection, variable interactions, nonlinear relationships, outliers, and missing values.

We'll see that CART decision trees are the foundation of gradient boosting and discuss some of the advantages of boosting over a Random Forest. We'll then walk through the gradient boosting algorithm and its most important modeling parameters, including the learning rate, the number of terminal nodes per tree, the number of trees, and the choice of loss function. Finally, we will use TreeNet® software, an implementation of gradient boosting, to fit a model and compare its performance to a linear regression model, a single CART tree, and a Random Forest.
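The webinar demonstration uses TreeNet® software. For readers who want to reproduce a similar comparison in code, below is a minimal sketch that assumes scikit-learn as an open-source stand-in (not the tool shown in the webinar). It fits a gradient boosting regressor using the parameters discussed above (learning rate, number of trees, terminal nodes per tree, loss function) and compares it to a linear regression, a single CART-style tree, and a Random Forest with cross-validation; the dataset is an arbitrary example choice.

```python
# Minimal sketch (assumption: scikit-learn stands in for TreeNet, which is
# the implementation actually demonstrated in the webinar).
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = fetch_california_housing(return_X_y=True)

models = {
    "Linear regression": LinearRegression(),
    "CART tree": DecisionTreeRegressor(max_depth=6, random_state=0),
    "Random Forest": RandomForestRegressor(n_estimators=500, random_state=0),
    "Gradient boosting": GradientBoostingRegressor(
        learning_rate=0.05,   # shrinkage applied to each tree's contribution
        n_estimators=500,     # number of trees in the boosted series
        max_leaf_nodes=6,     # terminal nodes per tree
        loss="huber",         # robust loss; "squared_error" is the default
        random_state=0,
    ),
}

for name, model in models.items():
    # 5-fold cross-validated R^2 as a simple performance comparison
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name:>18}: mean R^2 = {scores.mean():.3f}")
```

As discussed in the webinar, the learning rate and the number of trees trade off against each other: a smaller learning rate generally requires more trees but tends to generalize better.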

Tags: CART, Gradient Boosting, TreeNet, Random Forest, linear regression, machine learning, variable selection, variable interactions, nonlinear relationships, outliers, missing values, loss functions, learning rate