
Commit 5e73d67

reg1 reg2 learning objs
1 parent cf8a274 commit 5e73d67

2 files changed: +7 -6 lines changed

source/regression1.Rmd

Lines changed: 5 additions & 5 deletions
@@ -63,13 +63,13 @@ however that is beyond the scope of this book.
 ## Chapter learning objectives
 By the end of the chapter, readers will be able to do the following:

-* Recognize situations where a simple regression analysis would be appropriate for making predictions.
+* Recognize situations where a regression analysis would be appropriate for making predictions.
 * Explain the K-nearest neighbors (K-NN) regression algorithm and describe how it differs from K-NN classification.
 * Interpret the output of a K-NN regression.
-* In a data set with two or more variables, perform K-nearest neighbors regression in R using a `tidymodels` workflow.
-* Execute cross-validation in R to choose the number of neighbors.
-* Evaluate K-NN regression prediction accuracy in R using a test data set and the root mean squared prediction error (RMSPE).
-* In the context of K-NN regression, compare and contrast goodness of fit and prediction properties (namely RMSE vs RMSPE).
+* In a data set with two or more variables, perform K-nearest neighbors regression in R.
+* Evaluate K-NN regression prediction quality in R using the root mean squared prediction error (RMSPE).
+* Estimate the RMSPE in R using cross-validation or a test set.
+* Choose the number of neighbors in K-nearest neighbors regression by minimizing estimated cross-validation RMSPE.
 * Describe the advantages and disadvantages of K-nearest neighbors regression.

 ## The regression problem
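
For context, the revised objectives describe performing K-NN regression in R with a `tidymodels` workflow, estimating RMSPE via cross-validation or a test set, and choosing the number of neighbors by minimizing the estimated cross-validation RMSPE. A minimal sketch of such a workflow follows, using hypothetical placeholder names (`sacramento_train`, `price`, `sqft`) that are not part of this commit:

```r
library(tidymodels)

# K-NN regression model specification; the number of neighbors is left to be tuned
# (the "kknn" engine requires the kknn package to be installed)
knn_spec <- nearest_neighbor(weight_func = "rectangular", neighbors = tune()) |>
  set_engine("kknn") |>
  set_mode("regression")

# Recipe: predict price from square footage (placeholder variables), standardizing the predictor
knn_recipe <- recipe(price ~ sqft, data = sacramento_train) |>
  step_scale(all_predictors()) |>
  step_center(all_predictors())

knn_workflow <- workflow() |>
  add_recipe(knn_recipe) |>
  add_model(knn_spec)

# 5-fold cross-validation to estimate RMSPE across a grid of neighbor counts
sacr_folds <- vfold_cv(sacramento_train, v = 5, strata = price)
grid_vals <- tibble(neighbors = seq(from = 1, to = 200, by = 10))

knn_results <- knn_workflow |>
  tune_grid(resamples = sacr_folds, grid = grid_vals) |>
  collect_metrics() |>
  filter(.metric == "rmse")

# Choose the number of neighbors that minimizes the estimated cross-validation RMSPE
best_k <- knn_results |>
  slice_min(mean, n = 1) |>
  pull(neighbors)
```

The chosen number of neighbors would then be used to refit the model on the full training set and to report RMSPE on a held-out test set.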

source/regression2.Rmd

Lines changed: 2 additions & 1 deletion
@@ -51,9 +51,10 @@ predictor.
 ## Chapter learning objectives
 By the end of the chapter, readers will be able to do the following:

-* Use R and `tidymodels` to fit a linear regression model on training data.
+* Use R to fit simple and multivariable linear regression models on training data.
 * Evaluate the linear regression model on test data.
 * Compare and contrast predictions obtained from K-nearest neighbors regression to those obtained using linear regression from the same data set.
+* Describe how linear regression is affected by outliers and multicollinearity.

 ## Simple linear regression

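The regression2.Rmd objectives cover fitting simple and multivariable linear regression models on training data and evaluating them on test data. A minimal sketch under the same placeholder assumptions (`sacramento_train`, `sacramento_test`, and the `price`, `sqft`, and `beds` columns are illustrative, not part of this commit):

```r
library(tidymodels)

# Linear regression model specification using R's built-in lm engine
lm_spec <- linear_reg() |>
  set_engine("lm") |>
  set_mode("regression")

# Multivariable recipe: model price on square footage and number of bedrooms (placeholders)
lm_recipe <- recipe(price ~ sqft + beds, data = sacramento_train)

# Fit the workflow on the training data
lm_fit <- workflow() |>
  add_recipe(lm_recipe) |>
  add_model(lm_spec) |>
  fit(data = sacramento_train)

# Evaluate on the held-out test set; the RMSE computed on test predictions is the RMSPE
lm_test_results <- lm_fit |>
  predict(sacramento_test) |>
  bind_cols(sacramento_test) |>
  metrics(truth = price, estimate = .pred) |>
  filter(.metric == "rmse")
```
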
0 commit comments
