2 files changed: +7 −6 lines changed

@@ -63,13 +63,13 @@ however that is beyond the scope of this book.
## Chapter learning objectives
By the end of the chapter, readers will be able to do the following:

- * Recognize situations where a simple regression analysis would be appropriate for making predictions.
+ * Recognize situations where a regression analysis would be appropriate for making predictions.
* Explain the K-nearest neighbors (K-NN) regression algorithm and describe how it differs from K-NN classification.
* Interpret the output of a K-NN regression.
- * In a data set with two or more variables, perform K-nearest neighbors regression in R using a `tidymodels` workflow.
- * Execute cross-validation in R to choose the number of neighbors.
- * Evaluate K-NN regression prediction accuracy in R using a test data set and the root mean squared prediction error (RMSPE).
- * In the context of K-NN regression, compare and contrast goodness of fit and prediction properties (namely RMSE vs RMSPE).
+ * In a data set with two or more variables, perform K-nearest neighbors regression in R.
+ * Evaluate K-NN regression prediction quality in R using the root mean squared prediction error (RMSPE).
+ * Estimate the RMSPE in R using cross-validation or a test set.
+ * Choose the number of neighbors in K-nearest neighbors regression by minimizing estimated cross-validation RMSPE.
* Describe the advantages and disadvantages of K-nearest neighbors regression.
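The revised objectives describe the same pipeline in smaller steps: fit K-NN regression, estimate RMSPE by cross-validation or a test set, and pick the number of neighbors that minimizes the cross-validation estimate. The chapter carries this out in R with `tidymodels`; as a language-agnostic sketch only (all names and data below are illustrative, not from the book), the same steps in plain Python look like this:

```python
import math

def knn_predict(train_x, train_y, query, k):
    """Predict the response at `query` as the mean response of the
    k nearest training points (one predictor, absolute distance)."""
    nearest = sorted(zip(train_x, train_y), key=lambda p: abs(p[0] - query))[:k]
    return sum(resp for _, resp in nearest) / k

def rmspe(preds, actual):
    """Root mean squared prediction error."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(preds, actual)) / len(preds))

def cv_rmspe(x, y, k, folds=5):
    """Estimate RMSPE for a given k by `folds`-fold cross-validation:
    each fold is held out once and predicted from the remaining data."""
    fold_errors = []
    for f in range(folds):
        val_x, val_y = x[f::folds], y[f::folds]
        tr_x = [v for i, v in enumerate(x) if i % folds != f]
        tr_y = [v for i, v in enumerate(y) if i % folds != f]
        preds = [knn_predict(tr_x, tr_y, q, k) for q in val_x]
        fold_errors.append(rmspe(preds, val_y))
    return sum(fold_errors) / len(fold_errors)

# Toy data (hypothetical): e.g. house size vs. price in thousands
x = [5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]
y = [150, 160, 175, 180, 200, 210, 230, 240, 255, 265, 280, 300]

# Choose the number of neighbors by minimizing estimated CV RMSPE
best_k = min(range(1, 6), key=lambda k: cv_rmspe(x, y, k))
print("best k:", best_k)
print("prediction at x = 10.5:", knn_predict(x, y, 10.5, best_k))
```

Note the distinction the old objective drew explicitly: RMSPE computed on held-out data (as above) measures prediction error, while RMSE computed on the training data itself only measures goodness of fit.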
## The regression problem
@@ -51,9 +51,10 @@ predictor.
## Chapter learning objectives
By the end of the chapter, readers will be able to do the following:

- * Use R and `tidymodels` to fit a linear regression model on training data.
+ * Use R to fit simple and multivariable linear regression models on training data.
* Evaluate the linear regression model on test data.
* Compare and contrast predictions obtained from K-nearest neighbors regression to those obtained using linear regression from the same data set.
+ * Describe how linear regression is affected by outliers and multicollinearity.
## Simple linear regression
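As a minimal sketch of what this section's objective covers (in plain Python rather than the R/`tidymodels` workflow the chapter uses, with made-up data), simple linear regression estimates an intercept and slope by least squares on training data, and its predictions can then be scored on test data with RMSPE:

```python
import math

def fit_simple_linear(x, y):
    """Least-squares estimates (b0, b1) for the model y = b0 + b1 * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    b0 = my - b1 * mx
    return b0, b1

# Hypothetical training and test data (roughly y = 100 + 10x plus noise)
train_x, train_y = [1, 2, 3, 4, 5, 6], [112, 118, 131, 139, 152, 158]
test_x, test_y = [2.5, 4.5], [126, 147]

b0, b1 = fit_simple_linear(train_x, train_y)
preds = [b0 + b1 * xt for xt in test_x]

# Root mean squared prediction error on the held-out test set
rmspe = math.sqrt(sum((p - a) ** 2 for p, a in zip(preds, test_y)) / len(preds))
print(f"intercept={b0:.1f}, slope={b1:.1f}, test RMSPE={rmspe:.1f}")
```

Because the slope estimate sums over every residual, a single extreme point can pull the fitted line substantially, which is the sensitivity to outliers the new objective refers to; multicollinearity arises in the multivariable case, when strongly correlated predictors make the individual coefficient estimates unstable.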