2 files changed, +7 −6 lines changed

@@ -51,13 +51,13 @@ however that is beyond the scope of this book.
  ## Chapter learning objectives
  By the end of the chapter, readers will be able to do the following:

- * Recognize situations where a simple regression analysis would be appropriate for making predictions.
+ * Recognize situations where a regression analysis would be appropriate for making predictions.
  * Explain the K-nearest neighbors (K-NN) regression algorithm and describe how it differs from K-NN classification.
  * Interpret the output of a K-NN regression.
- * In a data set with two or more variables, perform K-nearest neighbors regression in Python using a `scikit-learn` workflow.
- * Execute cross-validation in Python to choose the number of neighbors.
- * Evaluate K-NN regression prediction accuracy in Python using a test data set and the root mean squared prediction error (RMSPE).
- * In the context of K-NN regression, compare and contrast goodness of fit and prediction properties (namely RMSE vs RMSPE).
+ * In a data set with two or more variables, perform K-nearest neighbors regression in Python.
+ * Evaluate K-NN regression prediction quality in Python using the root mean squared prediction error (RMSPE).
+ * Estimate the RMSPE in Python using cross-validation or a test set.
+ * Choose the number of neighbors in K-nearest neighbors regression by minimizing estimated cross-validation RMSPE.
  * Describe the advantages and disadvantages of K-nearest neighbors regression.

  +++
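The revised objectives above (perform K-NN regression, estimate RMSPE via cross-validation, and choose the number of neighbors by minimizing it) can be sketched together in a few lines. This is a hypothetical from-scratch illustration on synthetic 1-D data, not the chapter's `scikit-learn` workflow; the function names `knn_regress`, `rmspe`, and `cv_rmspe` are invented for this sketch.

```python
import math
import random

def knn_regress(train, query_x, k):
    """Predict by averaging the targets of the k nearest training points."""
    # train: list of (x, y) pairs; distance is plain 1-D absolute difference
    nearest = sorted(train, key=lambda p: abs(p[0] - query_x))[:k]
    return sum(y for _, y in nearest) / k

def rmspe(pairs, predict):
    """Root mean squared prediction error of `predict` over held-out (x, y) pairs."""
    return math.sqrt(sum((predict(x) - y) ** 2 for x, y in pairs) / len(pairs))

def cv_rmspe(data, k, folds=5):
    """Estimate RMSPE for a given k by averaging over cross-validation folds."""
    scores = []
    for f in range(folds):
        valid = data[f::folds]  # every folds-th point is held out
        train = [p for i, p in enumerate(data) if i % folds != f]
        scores.append(rmspe(valid, lambda x: knn_regress(train, x, k)))
    return sum(scores) / folds

random.seed(0)
data = [(x, 2 * x + random.gauss(0, 1)) for x in range(50)]  # noisy line
# choose the number of neighbors by minimizing estimated cross-validation RMSPE
best_k = min(range(1, 11), key=lambda k: cv_rmspe(data, k))
```

In practice the book's `scikit-learn` tools handle the neighbor search and cross-validation loop; the sketch only makes the logic of each objective explicit.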
@@ -38,9 +38,10 @@ predictor.
  ## Chapter learning objectives
  By the end of the chapter, readers will be able to do the following:

- * Use Python and `scikit-learn` to fit a linear regression model on training data.
+ * Use Python to fit simple and multivariable linear regression models on training data.
  * Evaluate the linear regression model on test data.
  * Compare and contrast predictions obtained from K-nearest neighbors regression to those obtained using linear regression from the same data set.
+ * Describe how linear regression is affected by outliers and multicollinearity.

  +++
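The two additions in this file (fitting a simple linear regression and describing its sensitivity to outliers) can be illustrated with a minimal from-scratch sketch. The closed-form least-squares fit below and its toy data are hypothetical; the chapter itself works through `scikit-learn`.

```python
def fit_line(xs, ys):
    """Least-squares simple linear regression: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

xs = list(range(10))
ys = [2 * x + 1 for x in xs]        # data that lies exactly on y = 2x + 1
slope, intercept = fit_line(xs, ys)  # recovers slope 2, intercept 1

# a single extreme outlier pulls the least-squares slope well away from 2
slope_out, _ = fit_line(xs + [10], ys + [100])
```

Because least squares minimizes squared residuals, one far-off point dominates the loss, which is exactly the outlier sensitivity the new objective asks readers to describe.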