4 Classification Introduction
- Start: Monday, February 15
- End: Friday, February 19
4.1 Summary
This week we will introduce our second machine learning task: classification. After introducing the task, we will see how to reuse methods we have already learned to perform it. We will focus on nonparametric classification techniques, in particular k-nearest neighbors (KNN) and decision trees.
- Keywords: Classification, Bayes Classifier, Bayes Error, Nonparametric Classification, k-Nearest Neighbors, Decision Trees, Misclassification Rate, Accuracy
4.2 Learning Objectives
After completing this week, you are expected to be able to:
- Differentiate between regression and classification tasks.
- Estimate and calculate conditional probabilities.
- Understand how conditional probabilities relate to classifications.
- Use R packages and functions to fit KNN and decision tree models and make classifications or estimate conditional probabilities (see the sketch after this list).
- Calculate classification metrics such as accuracy and misclassification rate.
- Select models by manipulating their flexibility through the use of a tuning parameter.
- Avoid overfitting by selecting a model of appropriate flexibility through the use of a validation set.
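
The following is a minimal sketch, not the course's official code, of how these objectives might look in R: fitting a decision tree with `rpart` and a KNN classifier with `class::knn()`, computing accuracy and misclassification rate on a validation set, and choosing the tuning parameter k by validation error. The dataset (`iris`), split proportion, and candidate k values are arbitrary choices for illustration.

```r
library(class)   # knn()
library(rpart)   # rpart()

# split the data into a training (estimation) set and a validation set
set.seed(42)
idx <- sample(nrow(iris), size = 0.7 * nrow(iris))
trn <- iris[idx, ]
val <- iris[-idx, ]

# decision tree: fit on the training data, classify the validation data
tree_fit  <- rpart(Species ~ ., data = trn, method = "class")
tree_pred <- predict(tree_fit, newdata = val, type = "class")
mean(tree_pred == val$Species)  # accuracy
mean(tree_pred != val$Species)  # misclassification rate

# KNN: try several values of the tuning parameter k and keep the one
# with the lowest validation misclassification rate
k_vals  <- c(1, 5, 10, 25, 50)
val_err <- sapply(k_vals, function(k) {
  pred <- knn(train = trn[, 1:4], test = val[, 1:4], cl = trn$Species, k = k)
  mean(pred != val$Species)
})
k_vals[which.min(val_err)]  # chosen k
```

Smaller k (more flexible) risks overfitting the training data, while larger k (less flexible) risks underfitting; the validation set is what lets us compare these choices on data not used to fit the models.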
4.4 Video
| Title | Link | Mirror |
|---|---|---|
| 4.1 - Welcome to Week 04 | 4.1 - YouTube | 4.1 - ClassTranscribe |
| 4.2 - Classification Introduction | 4.2 - YouTube | 4.2 - ClassTranscribe |
| 4.3 - Nonparametric Classification | 4.3 - YouTube | 4.3 - ClassTranscribe |
| 4.4 - Classification in R | 4.4 - YouTube | 4.4 - ClassTranscribe |
4.6 Office Hours
| Staff and Link | Day | Time |
|---|---|---|
| Zoom with David | Monday | 8:00 PM - 9:00 PM |
| Zoom with Tianyi | Monday | 9:00 PM - 10:00 PM |
| Zoom with David | Thursday | 8:00 PM - 9:00 PM |
| Zoom with Tianyi | Thursday | 9:00 PM - 10:00 PM |
| Piazza | Any! | Any! |