rtemis (version 0.79)

auc: Area under the ROC Curve

Description

Get the Area under the ROC Curve to assess classifier performance.

Usage

auc(prob, labels, method = c("auc_pairs", "pROC", "ROCR"),
  verbose = FALSE)

Arguments

prob

Numeric vector: Probabilities or model scores (e.g. c(.32, .75, .63))

labels

Factor: True labels of outcomes (e.g. c(0, 1, 1)); the first level is treated as the positive case. See Details.

method

String: "auc_pairs" (default), "pROC", or "ROCR": Method to use; these call auc_pairs, pROC::roc, and ROCR::performance, respectively. All three should give the same result and are included for peace of mind (see the sketch below and Details).

verbose

Logical: If TRUE, print messages to the console
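
The name "auc_pairs" suggests the standard pairwise definition of AUC: the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative, counting ties as one half. A base-R sketch of that definition on made-up data (pairwise_auc and the values below are hypothetical illustrations, not part of rtemis):

pairwise_auc <- function(prob, labels, positive = levels(labels)[1]) {
  # Scores of positive-class and negative-class cases
  pos <- prob[labels == positive]
  neg <- prob[labels != positive]
  # Fraction of (positive, negative) pairs ranked correctly; ties count .5
  mean(outer(pos, neg, ">") + .5 * outer(pos, neg, "=="))
}
pairwise_auc(c(.9, .4, .8, .3), factor(c(1, 0, 1, 0), levels = c(1, 0)))
# 1, since both positives outscore both negatives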

Details

Consider looking at Balanced Accuracy and F1 as well.

Important Note: We always assume that true labels are a factor in which the first level is the "positive" case, i.e. the event. All methods used here ("pROC", "auc_pairs", "ROCR") have been set up to expect this. This goes against the default setting of both "pROC" and "ROCR", which reorder levels and therefore never return an AUC below .5. We avoid that behavior because you CAN build a classifier that performs worse than chance (for research or otherwise), and it is confusing when levels are reordered automatically and different functions report different AUCs. Finally, although AUC is very popular, I strongly recommend reporting Balanced Accuracy instead.
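
Examples

A minimal sketch of the level-ordering convention described above. The data are made up for illustration; the calls use only the arguments documented under Usage:

# Hypothetical scores and true labels; the first factor level ("1")
# is treated as the positive case, i.e. the event
prob <- c(.9, .8, .7, .4, .3, .2)
labels <- factor(c(1, 1, 1, 0, 0, 0), levels = c(1, 0))

# All three methods should return the same AUC (here, 1)
auc(prob, labels, method = "auc_pairs")
auc(prob, labels, method = "pROC")
auc(prob, labels, method = "ROCR")

# Reversing the level order makes "0" the positive case; because levels
# are not reordered automatically, the same scores now yield an AUC
# below .5 (here, 0), i.e. worse than chance
labels_flipped <- factor(c(1, 1, 1, 0, 0, 0), levels = c(0, 1))
auc(prob, labels_flipped, method = "auc_pairs")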