Commit 8f72187

reg1 reg2 learning objs
1 parent 10f85d1 commit 8f72187

2 files changed: +7 −6 lines


source/regression1.md

Lines changed: 5 additions & 5 deletions
@@ -51,13 +51,13 @@ however that is beyond the scope of this book.
 ## Chapter learning objectives
 By the end of the chapter, readers will be able to do the following:
 
-* Recognize situations where a simple regression analysis would be appropriate for making predictions.
+* Recognize situations where a regression analysis would be appropriate for making predictions.
 * Explain the K-nearest neighbors (K-NN) regression algorithm and describe how it differs from K-NN classification.
 * Interpret the output of a K-NN regression.
-* In a data set with two or more variables, perform K-nearest neighbors regression in Python using a `scikit-learn` workflow.
-* Execute cross-validation in Python to choose the number of neighbors.
-* Evaluate K-NN regression prediction accuracy in Python using a test data set and the root mean squared prediction error (RMSPE).
-* In the context of K-NN regression, compare and contrast goodness of fit and prediction properties (namely RMSE vs RMSPE).
+* In a data set with two or more variables, perform K-nearest neighbors regression in Python.
+* Evaluate K-NN regression prediction quality in Python using the root mean squared prediction error (RMSPE).
+* Estimate the RMSPE in Python using cross-validation or a test set.
+* Choose the number of neighbors in K-nearest neighbors regression by minimizing the estimated cross-validation RMSPE.
 * Describe the advantages and disadvantages of K-nearest neighbors regression.
 
 +++
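The revised objectives above (perform K-NN regression in Python, estimate RMSPE by cross-validation, choose the number of neighbors by minimizing it, and evaluate on a test set) can be sketched with a `scikit-learn` workflow. This is an illustrative sketch, not code from the book: the synthetic data, the parameter grid of 1–50 neighbors, and the 5-fold split are all assumptions.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic data (illustrative only): one predictor, noisy linear response.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X.ravel() + rng.normal(0, 2.0, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Choose the number of neighbors by minimizing cross-validation RMSPE.
# scikit-learn maximizes a score, so we use negative RMSE as the scorer.
pipe = make_pipeline(StandardScaler(), KNeighborsRegressor())
grid = GridSearchCV(
    pipe,
    param_grid={"kneighborsregressor__n_neighbors": range(1, 51)},
    scoring="neg_root_mean_squared_error",
    cv=5,
)
grid.fit(X_train, y_train)
best_k = grid.best_params_["kneighborsregressor__n_neighbors"]

# Evaluate prediction quality on the held-out test set with RMSPE.
pred = grid.predict(X_test)
rmspe = np.sqrt(np.mean((y_test - pred) ** 2))
print(best_k, round(rmspe, 2))
```

Standardizing predictors before K-NN (the `StandardScaler` step) matters whenever variables are on different scales, since K-NN distances are scale-sensitive.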

source/regression2.md

Lines changed: 2 additions & 1 deletion
@@ -38,9 +38,10 @@ predictor.
 ## Chapter learning objectives
 By the end of the chapter, readers will be able to do the following:
 
-* Use Python and `scikit-learn` to fit a linear regression model on training data.
+* Use Python to fit simple and multivariable linear regression models on training data.
 * Evaluate the linear regression model on test data.
 * Compare and contrast predictions obtained from K-nearest neighbors regression to those obtained using linear regression from the same data set.
+* Describe how linear regression is affected by outliers and multicollinearity.
 
 +++
 
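The updated objectives (fit a multivariable linear regression on training data, evaluate it on test data) could look like the following `scikit-learn` sketch. The synthetic two-predictor data set and its coefficients are assumptions for illustration, not material from the book.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic data (illustrative only): two predictors with known coefficients.
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(200, 2))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 1.0, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Fit a multivariable linear regression on the training data.
lm = LinearRegression()
lm.fit(X_train, y_train)

# Evaluate on the held-out test set with RMSPE, as in the K-NN chapter.
pred = lm.predict(X_test)
rmspe = np.sqrt(np.mean((y_test - pred) ** 2))
print(lm.coef_, round(rmspe, 2))
```

Unlike K-NN regression, the fitted model here is a plane whose coefficients (`lm.coef_`) can be read directly, which is what makes outlier and multicollinearity effects on those coefficients worth discussing in the chapter.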

0 commit comments
