- Start: Monday, February 8
- End: Friday, February 12
This week we will continue discussing the regression task. We will introduce two new methods, k-nearest neighbors and decision trees, which will serve as examples of nonparametric modeling techniques. We will also discuss model flexibility and how it relates to overfitting and the bias-variance tradeoff.
- Keywords: Nonparametric Regression, k-Nearest Neighbors, Decision Trees, Model Flexibility, Tuning Parameters, Bias-Variance Tradeoff, Overfitting, No Free Lunch, Curse of Dimensionality
After completing this week, you are expected to be able to:
- Differentiate between parametric and nonparametric regression.
- Understand how model flexibility relates to the bias-variance tradeoff and thus model performance.
- Use R packages and functions to fit KNN and decision tree models and make predictions or estimate conditional means.
- Select models by manipulating their flexibility through the use of a tuning parameter.
- Avoid overfitting by selecting a model of appropriate flexibility through the use of a validation set.
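The objectives above can be sketched in R. This is a minimal illustration, not course-provided code: it assumes the `FNN` and `rpart` packages are available and uses simulated data. It fits a KNN regression and a decision tree, and uses a validation set to choose the KNN tuning parameter `k`, which controls flexibility (small `k` is flexible and risks overfitting; large `k` is rigid and risks high bias).

```r
library(FNN)    # for knn.reg(); assumed installed
library(rpart)  # for rpart(); ships with base R distributions

set.seed(42)

# Simulated regression data (purely illustrative)
n <- 200
x <- runif(n, 0, 2 * pi)
y <- sin(x) + rnorm(n, sd = 0.3)

# Train / validation split
idx   <- sample(n, size = 150)
x_trn <- x[idx];  y_trn <- y[idx]
x_val <- x[-idx]; y_val <- y[-idx]

rmse <- function(actual, predicted) sqrt(mean((actual - predicted) ^ 2))

# KNN: evaluate several values of the tuning parameter k on the validation set
k_vals <- c(1, 5, 10, 25, 50)
val_rmse <- sapply(k_vals, function(k) {
  pred <- knn.reg(train = matrix(x_trn), test = matrix(x_val),
                  y = y_trn, k = k)$pred
  rmse(y_val, pred)
})
best_k <- k_vals[which.min(val_rmse)]

# Decision tree: cp (cost-complexity parameter) plays the tuning role here
tree_mod  <- rpart(y ~ x, data = data.frame(x = x_trn, y = y_trn), cp = 0.01)
tree_pred <- predict(tree_mod, newdata = data.frame(x = x_val))
tree_rmse <- rmse(y_val, tree_pred)
```

Predictions from either model estimate the conditional mean of `y` given `x` at the requested points; picking `k` (or `cp`) by validation RMSE, rather than training RMSE, is what guards against overfitting.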