Commit 70b4238 (parent 8f72187)

added under/overfitting to learning objs in cls2, reg1

2 files changed: +2 additions, −0 deletions

source/classification2.md (1 addition, 0 deletions)

@@ -39,6 +39,7 @@ By the end of the chapter, readers will be able to do the following:
 - Evaluate classification accuracy, precision, and recall in Python using a test set, a single validation set, and cross-validation.
 - Produce a confusion matrix in Python.
 - Choose the number of neighbors in a K-nearest neighbors classifier by maximizing estimated cross-validation accuracy.
+- Describe underfitting and overfitting, and relate them to the number of neighbors in K-nearest neighbors classification.
 - Describe the advantages and disadvantages of the K-nearest neighbors classification algorithm.

 +++
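The new objective ties the number of neighbors to under/overfitting: a very small k memorizes the training set (overfitting), while a very large k flattens predictions toward the majority class (underfitting). A minimal sketch of choosing k by maximizing estimated cross-validation accuracy, not taken from the book's materials, using scikit-learn on synthetic data (the dataset and the 1–30 search range are illustrative assumptions):

```python
# Sketch: pick k for K-NN classification by maximizing 5-fold CV accuracy.
# Synthetic data and the k range are assumptions, not from the textbook.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=200, n_features=4, random_state=42)

# Search k = 1..30; small k overfits, large k underfits.
search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": range(1, 31)},
    cv=5,
    scoring="accuracy",
)
search.fit(X, y)

print("best k:", search.best_params_["n_neighbors"])
print("estimated CV accuracy:", round(search.best_score_, 3))
```

Plotting accuracy against k typically shows the characteristic rise-then-fall curve that motivates the underfitting/overfitting discussion.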

source/regression1.md (1 addition, 0 deletions)

@@ -58,6 +58,7 @@ By the end of the chapter, readers will be able to do the following:
 * Evaluate K-NN regression prediction quality in Python using the root mean squared prediction error (RMSPE).
 * Estimate the RMSPE in Python using cross-validation or a test set.
 * Choose the number of neighbors in K-nearest neighbors regression by minimizing estimated cross-validation RMSPE.
+* Describe underfitting and overfitting, and relate them to the number of neighbors in K-nearest neighbors regression.
 * Describe the advantages and disadvantages of K-nearest neighbors regression.

 +++
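For the regression chapter the same trade-off is framed in terms of RMSPE rather than accuracy. A minimal sketch, not from the book, of choosing k by minimizing cross-validation RMSPE with scikit-learn on synthetic data (dataset, noise level, and k range are all illustrative assumptions):

```python
# Sketch: pick k for K-NN regression by minimizing 5-fold CV RMSPE.
# Synthetic data and the k range are assumptions, not from the textbook.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor

X, y = make_regression(n_samples=200, n_features=3, noise=10.0, random_state=1)

ks = range(1, 31)
rmspe = []
for k in ks:
    scores = cross_val_score(
        KNeighborsRegressor(n_neighbors=k), X, y,
        cv=5, scoring="neg_root_mean_squared_error",
    )
    rmspe.append(-scores.mean())  # negate: sklearn reports negated errors

best_k = ks[int(np.argmin(rmspe))]
print("best k:", best_k, "estimated RMSPE:", round(min(rmspe), 2))
```

Here k = 1 tracks noise in the training data (overfitting), while k near the sample size predicts roughly the mean response everywhere (underfitting); the CV RMSPE curve is minimized in between.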
