Given a function for computing a metric in metric_func, this function smooths the metric value per cutpoint using smoothing splines, then selects the cutpoint that optimizes the smoothed metric. For further details on the smoothing spline, see ?stats::smooth.spline.

The metric function should accept the following inputs:
tp: vector of the number of true positives
fp: vector of the number of false positives
tn: vector of the number of true negatives
fn: vector of the number of false negatives
maximize_spline_metric(
data,
x,
class,
metric_func = youden,
pos_class = NULL,
neg_class = NULL,
direction,
w = NULL,
df = NULL,
spar = 1,
nknots = cutpoint_knots,
df_offset = NULL,
penalty = 1,
control_spar = list(),
tol_metric,
use_midpoints,
...
)

minimize_spline_metric(
data,
x,
class,
metric_func = youden,
pos_class = NULL,
neg_class = NULL,
direction,
w = NULL,
df = NULL,
spar = 1,
nknots = cutpoint_knots,
df_offset = NULL,
penalty = 1,
control_spar = list(),
tol_metric,
use_midpoints,
...
)
data: A data frame or tibble in which the columns given in x and class can be found.

x: (character) The variable name to be used for classification, e.g. predictions or test values.

class: (character) The variable name indicating class membership.

metric_func: (function) A function that computes a metric to be optimized. See description.

pos_class: The value of class that indicates the positive class.

neg_class: The value of class that indicates the negative class.

direction: (character) Use ">=" or "<=" to select whether an x value >= or <= the cutoff predicts the positive class.

w: Optional vector of weights of the same length as x; defaults to all 1.

df: The desired equivalent number of degrees of freedom (trace of the smoother matrix). Must be in (1, nx], where nx is the number of unique x values.

spar: Smoothing parameter, typically (but not necessarily) in (0, 1]. When spar is specified, the coefficient lambda of the integral of the squared second derivative in the fit (penalized log likelihood) criterion is a monotone function of spar.

nknots: Integer or function giving the number of knots. The function should accept data and x (the name of the predictor variable) as inputs. By default, nknots = 0.1 * log(n_dat / n_cut) * n_cut, where n_dat is the number of observations and n_cut the number of unique predictor values.

df_offset: Allows the degrees of freedom to be increased by df_offset in the GCV criterion.

penalty: The coefficient of the penalty for degrees of freedom in the GCV criterion.

control_spar: Optional list with named components controlling the root finding when the smoothing parameter spar is computed, i.e., when spar is not specified. See help("smooth.spline") for further information.

tol_metric: All cutpoints will be returned that lead to a metric value in the interval [m_max - tol_metric, m_max + tol_metric], where m_max is the maximum achievable metric value. This can be used to return multiple decent cutpoints and to avoid floating-point problems.

use_midpoints: (logical) If TRUE (default FALSE), the returned optimal cutpoint will be the mean of the optimal cutpoint and the next highest observation (for direction = ">=") or the next lowest observation (for direction = "<="), which avoids biasing the optimal cutpoint.

...: Further arguments that will be passed to metric_func.
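For illustration, the default knot count stated for nknots can be computed directly. This is a sketch of the documented formula only (assuming the natural logarithm); the package's actual cutpoint_knots helper may additionally round the result:

```r
# Documented default: nknots = 0.1 * log(n_dat / n_cut) * n_cut,
# where n_dat = number of observations, n_cut = number of unique
# predictor values.
default_nknots <- function(n_dat, n_cut) {
    0.1 * log(n_dat / n_cut) * n_cut
}

# e.g. 1000 observations with 100 unique predictor values:
default_nknots(n_dat = 1000, n_cut = 100)  # about 23.03
```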
A tibble with the columns optimal_cutpoint, the corresponding metric value, and roc_curve, a nested tibble that includes all possible cutoffs along with the corresponding numbers of true and false positives / negatives and all corresponding metric values.

The above inputs are arrived at by using all unique values in x, Inf, and -Inf as possible cutpoints for classifying the variable in class.
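The candidate cutpoints described above can be sketched in base R (a minimal illustration of the idea, not the package's internal code):

```r
# Candidate cutpoints: all unique predictor values plus -Inf and Inf,
# so "classify everything as positive" and "classify everything as
# negative" are always among the candidates.
x <- c(2.5, 1.0, 3.0, 2.5, 4.5)
cutpoints <- sort(unique(c(x, -Inf, Inf)))
cutpoints  # -Inf 1.0 2.5 3.0 4.5 Inf
```

The counts tp, fp, tn, and fn are then computed for each candidate and passed to metric_func.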
Other method functions: maximize_boot_metric(), maximize_gam_metric(), maximize_loess_metric(), maximize_metric(), oc_manual(), oc_mean(), oc_median(), oc_youden_kernel(), oc_youden_normal()
# NOT RUN {
oc <- cutpointr(suicide, dsi, suicide, gender, method = maximize_spline_metric,
                df = 5, metric = accuracy)
plot_metric(oc)
# }