@@ -456,8 +456,8 @@ the model and returns the RMSPE for each number of neighbors. In the output of t
results data frame, we see that the `neighbors` variable contains the value of $K$,
the mean (`mean`) contains the value of the RMSPE estimated via cross-validation,
and the standard error (`std_err`) contains a value corresponding to a measure of how uncertain we are in the mean value. A detailed treatment of this
- is beyond the scope of this chapter; but roughly, if your estimated mean is 100,000 and standard
- error is 1,000, you can expect the *true* RMSPE to be somewhere roughly between 99,000 and 101,000 (although it may
+ is beyond the scope of this chapter; but roughly, if your estimated mean RMSPE is \$100,000 and standard
+ error is \$1,000, you can expect the *true* RMSPE to be somewhere roughly between \$99,000 and \$101,000 (although it may
fall outside this range). You may ignore the other columns in the metrics data frame,
as they do not provide any additional insight.
\index{cross-validation!collect\_metrics}
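
For context on where a results data frame with `neighbors`, `mean`, and `std_err` columns comes from, the following is a minimal sketch of the tidymodels tuning pattern this passage describes; the object names (`sacramento_train`, `sacr_recipe`, `sacr_spec`, `sacr_vfold`, `gridvals`) are placeholders, not necessarily the chapter's exact code.

```r
library(tidymodels)

# assumed setup: `sacramento_train` is a training split of the housing data
# (placeholder name), with `price` as the response and `sqft` as the predictor
sacr_recipe <- recipe(price ~ sqft, data = sacramento_train) |>
  step_scale(all_predictors()) |>
  step_center(all_predictors())

sacr_spec <- nearest_neighbor(weight_func = "rectangular", neighbors = tune()) |>
  set_engine("kknn") |>
  set_mode("regression")

sacr_vfold <- vfold_cv(sacramento_train, v = 5)
gridvals <- tibble(neighbors = seq(from = 1, to = 200, by = 3))

# tune over K and keep the cross-validation RMSPE rows; each row of the
# result has `neighbors` (the value of K), `mean` (the estimated RMSPE),
# and `std_err` (a measure of uncertainty in that mean)
sacr_results <- workflow() |>
  add_recipe(sacr_recipe) |>
  add_model(sacr_spec) |>
  tune_grid(resamples = sacr_vfold, grid = gridvals) |>
  collect_metrics() |>
  filter(.metric == "rmse")

# the number of neighbors with the smallest estimated RMSPE
sacr_min <- sacr_results |>
  slice_min(mean, n = 1)
```

Note that tidymodels labels this metric `rmse`; the text calls the cross-validation estimate RMSPE because it is computed on held-out data.
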
@@ -763,9 +763,9 @@ predictor *as part of the model tuning process* (e.g., if we are running forward
in the chapter on evaluating and tuning classification models),
then we must compare the accuracy estimated using only the training data via cross-validation.
Looking back, the estimated cross-validation accuracy for the single-predictor
- model was `r format(round(sacr_min$mean), big.mark=",", nsmall=0, scientific = FALSE)`.
+ model was \$`r format(round(sacr_min$mean), big.mark=",", nsmall=0, scientific = FALSE)`.
The estimated cross-validation accuracy for the multivariable model is
- `r format(round(sacr_multi$mean), big.mark=",", nsmall=0, scientific = FALSE)`.
+ \$`r format(round(sacr_multi$mean), big.mark=",", nsmall=0, scientific = FALSE)`.
Thus in this case, we did not improve the model
by a large amount by adding this additional predictor.
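
A short, hedged sketch of how the `sacr_min` and `sacr_multi` objects referenced by the inline code above might be obtained and compared, assuming `sacr_results` and `sacr_multi_results` are `collect_metrics()` outputs (filtered to the RMSE rows) for the single-predictor and two-predictor models respectively:

```r
# each input is assumed to hold one row per K with the estimated RMSPE in `mean`
sacr_min   <- sacr_results |> slice_min(mean, n = 1)
sacr_multi <- sacr_multi_results |> slice_min(mean, n = 1)

# compare the best cross-validation RMSPE of the two candidate models;
# only the training data (via cross-validation) is used for this comparison
sacr_min |> pull(mean)
sacr_multi |> pull(mean)
```
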
@@ -797,7 +797,7 @@ knn_mult_mets
This time, when we performed KNN regression on the same data set, but also
included number of bedrooms as a predictor, we obtained a RMSPE test error
- of `r format(round(knn_mult_mets |> pull(.estimate)), big.mark=",", nsmall=0, scientific=FALSE)`.
+ of \$`r format(round(knn_mult_mets |> pull(.estimate)), big.mark=",", nsmall=0, scientific=FALSE)`.
Figure \@ref(fig:07-knn-mult-viz) visualizes the model's predictions overlaid on top of the data. This
time the predictions are a surface in 3D space, instead of a line in 2D space, as we have 2
predictors instead of 1.
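
The `knn_mult_mets` object named in this hunk's header is presumably the test-set metrics for the two-predictor model; the following is a sketch under that assumption, where `sacr_fit_multi` (the workflow refit on the full training set with the chosen K) and `sacramento_test` are placeholder names:

```r
# predict on the held-out test set with the refit two-predictor model,
# then compute the RMSPE of those predictions
sacr_preds_multi <- sacr_fit_multi |>
  predict(sacramento_test) |>
  bind_cols(sacramento_test)

knn_mult_mets <- sacr_preds_multi |>
  metrics(truth = price, estimate = .pred) |>
  filter(.metric == "rmse")

# the value reported inline in the text
knn_mult_mets |> pull(.estimate)
```
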