classification_metrics: Calculate classification metrics on a confusion matrix
Description
In some cases, overall class correctness or the proportion of correctly classified
individuals is not informative enough, so more detailed metrics are provided when working on classification.
Usage
classification_metrics(x)
Value
a list with the following components is returned:
accuracy the fraction of instances that are correctly classified
macro_prf data.frame containing precision
(the fraction of correct predictions for a certain class),
recall (the fraction of instances of a class that were correctly predicted),
and f1 (the harmonic mean, or weighted average, of precision and recall)
macro_avg the average of the three macro_prf indices
ova a list of one-vs-all confusion matrices, one per class
ova_sum a single confusion matrix, the sum of all ova matrices
kappa a measure of agreement between the predictions and the actual labels
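As a sketch of what these components measure, the following base-R snippet computes accuracy, per-class precision/recall/f1, and Cohen's kappa by hand on a toy confusion matrix (rows as actual classes, columns as predicted). The matrix and class labels are made up for illustration; this is not Momocs' internal implementation.

```r
# toy confusion matrix: rows = actual, columns = predicted
cm <- matrix(c(5, 1, 0,
               2, 6, 1,
               0, 1, 4),
             nrow = 3, byrow = TRUE,
             dimnames = list(actual    = c("A", "B", "C"),
                             predicted = c("A", "B", "C")))
n <- sum(cm)

# accuracy: fraction of instances on the diagonal
accuracy <- sum(diag(cm)) / n

# per-class precision, recall and f1 (the macro_prf components)
precision <- diag(cm) / colSums(cm)  # correct predictions per predicted class
recall    <- diag(cm) / rowSums(cm)  # correct predictions per actual class
f1        <- 2 * precision * recall / (precision + recall)

# Cohen's kappa: observed agreement corrected for chance agreement
p_o   <- accuracy
p_e   <- sum(rowSums(cm) * colSums(cm)) / n^2
kappa <- (p_o - p_e) / (1 - p_e)
```

On this toy matrix, `accuracy` is 15/20 = 0.75 and `kappa` is around 0.62; a kappa near 0 would indicate agreement no better than chance.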
Arguments
x
a table or an LDA object
See Also
The pages below are of great interest to understand these metrics. The code
used is partly derived from a Revolution Analytics blog post (with their authorization). Thanks to them!
Examples
# some morphometrics on 'hearts'
hearts %>% fgProcrustes(tol=1) %>%
  coo_slide(ldk=1) %>% efourier(norm=FALSE) %>% PCA() %>%
  # now the LDA and its summary
  LDA(~aut) %>% classification_metrics()