
Commit 3298017

wordsmithing text lead-in to prec/rec
1 parent c1c2b9c

1 file changed: +4, -6 lines


source/classification2.Rmd

Lines changed: 4 additions & 6 deletions
@@ -160,12 +160,10 @@ classifier can make, corresponding to the four entries in the confusion matrix:
 - **True Negative:** A benign observation that was classified as benign (bottom right in Table \@ref(tab:confusion-matrix)).
 - **False Negative:** A malignant observation that was classified as benign (bottom left in Table \@ref(tab:confusion-matrix)).
 
-A perfect classifier would have zero False Negatives and False Positives.
-However,
-in practice there will always be some error in the model,
-so it is important to think about what type of errors we think are more harmful,
-which we discuss in detail in section \@ref(critically-analyze-performance)
-Two commonly used metrics that capture part of what the confusion matrix is telling us are the **precision** and **recall** of the classifier which are often reported together with accuracy.
+A perfect classifier would have zero false negatives and false positives (and therefore, 100% accuracy).
+However, real classifiers in practice will almost always make some mistakes, so it is important to think
+about what type of error is more harmful. Two commonly used metrics that we can compute using the confusion matrix
+are the **precision** and **recall** of the classifier. These are often reported together with accuracy.
 *Precision* quantifies how many of the positive predictions the classifier made were actually positive. Intuitively,
 we would like a classifier to have a *high* precision: for a classifier with high precision, if the
 classifier reports that a new observation is positive, we can trust that that
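As a quick illustration of the two metrics the new wording leads into, here is a minimal sketch in R (the language of this .Rmd source). The counts below are hypothetical, not taken from the book's example; "malignant" is treated as the positive label, matching the confusion matrix entries above.

```r
# Hypothetical confusion matrix counts ("malignant" = positive class).
tp <- 39  # true positives:  malignant observations classified as malignant
fp <- 6   # false positives: benign observations classified as malignant
fn <- 4   # false negatives: malignant observations classified as benign

precision <- tp / (tp + fp)  # fraction of positive predictions that were correct
recall    <- tp / (tp + fn)  # fraction of actual positives that were found

precision  # 0.8666667
recall     # 0.9069767
```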
