@@ -459,7 +459,7 @@ glue("sacr_RMSPE", "{0:,.0f}".format(RMSPE))
 ```
 
 Our final model's test error as assessed by RMSPE
-is {glue:text}`sacr_RMSPE`.
+is \${glue:text}`sacr_RMSPE`.
 Remember that this is in units of the response variable, and here that
 is US Dollars (USD). Does this mean our model is "good" at predicting house
 sale price based off of the predictor of home size? Again, answering this is
@@ -752,7 +752,7 @@ glue("sacr_mult_RMSPE", "{0:,.0f}".format(lm_mult_test_RMSPE))
 ```
 
 Our model's test error as assessed by RMSPE
-is {glue:text}`sacr_mult_RMSPE`.
+is \${glue:text}`sacr_mult_RMSPE`.
 In the case of two predictors, the predictions made by our linear regression form a *plane* of best fit, as
 shown in {numref}`fig:08-3DlinReg`.
@@ -889,12 +889,12 @@ lm_mult_test_RMSPE
 ```
 
 We obtain an RMSPE for the multivariable linear regression model
-of {glue:text}`sacr_mult_RMSPE`. This prediction error
+of \${glue:text}`sacr_mult_RMSPE`. This prediction error
 is less than the prediction error for the multivariable KNN regression model,
 indicating that we should likely choose linear regression for predictions of
 house sale price on this data set. Revisiting the simple linear regression model
 with only a single predictor from earlier in this chapter, we see that the RMSPE for that model was
-{glue:text}`sacr_RMSPE`,
+\${glue:text}`sacr_RMSPE`,
 which is slightly higher than that of our more complex model. Our model with two predictors
 provided a slightly better fit on test data than our model with just one.
 As mentioned earlier, this is not always the case: sometimes including more
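For context on the values being patched above: the glued RMSPE numbers come from computing the root mean squared prediction error on a test set and formatting it with thousands separators. The following is a minimal sketch of that computation; the prices here are hypothetical stand-ins, not the Sacramento housing data used in the chapter.

```python
import numpy as np

# Hypothetical observed and predicted sale prices (USD) on a test set;
# the chapter's real values come from the Sacramento data and a fitted model.
y_test = np.array([250_000.0, 310_000.0, 195_000.0, 420_000.0])
y_pred = np.array([240_000.0, 325_000.0, 210_000.0, 400_000.0])

# Root mean squared prediction error, in the units of the response (USD).
RMSPE = np.sqrt(np.mean((y_test - y_pred) ** 2))

# The diff's glue calls format the value with a thousands separator and
# no decimal places, e.g. glue("sacr_RMSPE", "{0:,.0f}".format(RMSPE)).
print("{0:,.0f}".format(RMSPE))  # prints "15,411" for these sample numbers
```

Because the formatted string carries no units, the diff prepends an escaped `\$` so the rendered text reads as a dollar amount.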