In this course, you'll learn how to use tree-based models and ensembles for regression and classification.
In this course, you'll learn how to work with tree-based models in R. This course covers everything from using a single tree for regression or classification to more advanced ensemble methods. You'll learn to implement bagged trees, Random Forests, and boosted trees using the Gradient Boosting Machine, or GBM. These powerful techniques will allow you to create high-performance regression and classification models for your data.
Classification Trees
-This chapter covers supervised machine learning with classification trees.
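As a flavor of what this looks like, here is a minimal sketch of a classification tree, assuming the rpart package and R's built-in iris data (the course's own packages and datasets may differ):

    # Load the rpart package for recursive partitioning trees
    library(rpart)

    # Fit a classification tree predicting species from the flower measurements
    tree_model <- rpart(Species ~ ., data = iris, method = "class")

    # Predict class labels and check training accuracy
    pred <- predict(tree_model, newdata = iris, type = "class")
    mean(pred == iris$Species)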
Regression Trees
-In this chapter you'll learn how to use a single tree for regression, instead of classification.
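A minimal regression-tree sketch, again assuming rpart and a built-in dataset (mtcars) rather than the course's own data:

    library(rpart)

    # Fit a regression tree predicting fuel efficiency (a numeric outcome)
    reg_model <- rpart(mpg ~ ., data = mtcars, method = "anova")

    # Predict and compute the training root mean squared error
    pred <- predict(reg_model, newdata = mtcars)
    sqrt(mean((pred - mtcars$mpg)^2))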
Bagged Trees
-In this chapter, you will learn about bagged trees, an ensemble method that uses a combination of trees (instead of only one).
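A minimal sketch of bagging, assuming the ipred package (one common choice; the course may use a different interface):

    library(ipred)

    # Fit a bagged ensemble of classification trees (25 bootstrap replicates by default)
    set.seed(1)
    bag_model <- bagging(Species ~ ., data = iris, coob = TRUE)

    # coob = TRUE requests an out-of-bag estimate of the misclassification error
    print(bag_model)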
Random Forests
-In this chapter, you will learn about the Random Forest algorithm, another tree-based ensemble method. Random Forest is a modified version of bagged trees with better performance. Here you'll learn how to train, tune and evaluate Random Forest models in R.
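A minimal sketch, assuming the classic randomForest package (the course's tooling may differ):

    library(randomForest)

    # Fit a random forest, sampling 2 candidate variables at each split (mtry)
    set.seed(1)
    rf_model <- randomForest(Species ~ ., data = iris, ntree = 500, mtry = 2)

    # Out-of-bag error estimate and variable importance scores
    print(rf_model)
    importance(rf_model)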
Boosted Trees
-In this chapter, you will learn about the boosting methodology, with a focus on the Gradient Boosting Machine (GBM) algorithm, another popular tree-based ensemble method. Here you'll learn how to train, tune and evaluate GBM models in R.
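A minimal GBM sketch, assuming the gbm package and the Boston housing data from MASS (illustrative choices, not necessarily the course's):

    library(gbm)

    # Fit a gradient boosting machine for regression on the Boston housing data
    set.seed(1)
    boston <- MASS::Boston
    gbm_model <- gbm(medv ~ ., data = boston,
                     distribution = "gaussian",
                     n.trees = 2000,
                     interaction.depth = 3,
                     shrinkage = 0.01,
                     cv.folds = 5)

    # Pick the number of trees by cross-validation, then predict and score
    best_iter <- gbm.perf(gbm_model, method = "cv")
    pred <- predict(gbm_model, newdata = boston, n.trees = best_iter)
    sqrt(mean((pred - boston$medv)^2))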