ddalpha (version 1.3.16)

ddalpha.classify: Classify using DD-Classifier

Description

Classifies data using the DD-classifier and a specified outsider treatment.

Usage

ddalpha.classify(ddalpha, objects, subset, outsider.method = NULL, use.convex = NULL)

# S3 method for ddalpha
predict(object, objects, subset, outsider.method = NULL, use.convex = NULL, ...)

Value

List containing the assigned class labels; outsiders are labelled with the character string "Ignored" if "Ignore" was specified as the outsider treatment method.
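
As a minimal sketch of handling the return value (assuming classes holds the result of ddalpha.classify and trueLabels is a hypothetical vector of the true class labels), ignored outsiders can be filtered out before scoring:

labels <- unlist(classes)              # flatten the returned list of labels
kept <- labels != "Ignored"            # drop objects ignored as outsiders
mean(labels[kept] != trueLabels[kept]) # error rate on the classified objects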

Arguments

ddalpha, object

DD\(\alpha\)-classifier (obtained by ddalpha.train).

objects

Matrix containing objects to be classified; each row is one \(d\)-dimensional object.

subset

An optional vector specifying a subset of the observations to be classified.

outsider.method

Character string, the name of the treatment to be applied to outsiders; it must be one of the treatments trained by ddalpha.train. If several treatments were trained via the argument outsider.methods, pass the name of the desired method (see the sketch after these arguments).

use.convex

Logical variable indicating whether outsiders should be determined as the points not contained in any of the convex hulls of the classes from the training sample (TRUE) or as those having zero depth w.r.t. each class from the training sample (FALSE). For depth = "zonoid" both values give the same result. If NULL, the value specified when training the DD\(\alpha\)-classifier (in ddalpha.train) is used.

...

Additional parameters are ignored.
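
A short sketch of how outsider.method and use.convex play together, reusing the data constructed in the Examples below; it assumes that outsider.methods in ddalpha.train accepts a vector of treatment names:

# Train with two outsider treatments and select one by name when classifying
ddalpha <- ddalpha.train(data$train, depth = "zonoid",
                         outsider.methods = c("depth.Mahalanobis", "Ignore"))
classesM <- ddalpha.classify(ddalpha, data$test[,1:2],
                             outsider.method = "depth.Mahalanobis",
                             use.convex = TRUE)  # override the trained setting
classesI <- ddalpha.classify(ddalpha, data$test[,1:2],
                             outsider.method = "Ignore")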

Details

Only one outsider treatment can be specified per classification call.

See Lange, Mosler and Mozharovskyi (2014) for details and additional information.

References

Dyckerhoff, R., Koshevoy, G., and Mosler, K. (1996). Zonoid data depth: theory and computation. In: Prat A. (ed), COMPSTAT 1996. Proceedings in computational statistics, Physica-Verlag (Heidelberg), 235--240.

Lange, T., Mosler, K., and Mozharovskyi, P. (2014). Fast nonparametric classification based on data depth. Statistical Papers 55 49--69.

Li, J., Cuesta-Albertos, J.A., and Liu, R.Y. (2012). DD-classifier: Nonparametric classification procedure based on DD-plot. Journal of the American Statistical Association 107 737--753.

Mozharovskyi, P. (2015). Contributions to Depth-based Classification and Computation of the Tukey Depth. Verlag Dr. Kovac (Hamburg).

Mozharovskyi, P., Mosler, K., and Lange, T. (2015). Classifying real-world data with the DD\(\alpha\)-procedure. Advances in Data Analysis and Classification 9 287--314.

Vasil'ev, V.I. (2003). The reduction principle in problems of revealing regularities I. Cybernetics and Systems Analysis 39 686--694.

See Also

ddalpha.train to train the DD-classifier.

Examples

# Generate a bivariate normal location-shift classification task
# containing 200 training objects and 200 to test with
library(MASS)  # provides mvrnorm
class1 <- mvrnorm(200, c(0,0), 
                  matrix(c(1,1,1,4), nrow = 2, ncol = 2, byrow = TRUE))
class2 <- mvrnorm(200, c(2,2), 
                  matrix(c(1,1,1,4), nrow = 2, ncol = 2, byrow = TRUE))
trainIndices <- c(1:100)
testIndices <- c(101:200)
propertyVars <- c(1:2)
classVar <- 3
trainData <- rbind(cbind(class1[trainIndices,], rep(1, 100)), 
                   cbind(class2[trainIndices,], rep(2, 100)))
testData <- rbind(cbind(class1[testIndices,], rep(1, 100)), 
                  cbind(class2[testIndices,], rep(2, 100)))
data <- list(train = trainData, test = testData)

# Train the DDalpha-Classifier (zonoid depth, maximum Mahalanobis depth 
# classifier with defaults as outsider treatment)
ddalpha <- ddalpha.train(data$train, 
                         depth = "zonoid", 
                         outsider.methods = "depth.Mahalanobis")
# Get the classification error rate
classes <- ddalpha.classify(ddalpha, data$test[,propertyVars], 
                            outsider.method = "depth.Mahalanobis")
cat("Classification error rate: ", 
    sum(unlist(classes) != data$test[,classVar])/200, ".\n", sep="")
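
# The S3 predict method from the Usage section can be called equivalently
# on the trained object (a sketch using the same data as above)
classes2 <- predict(ddalpha, data$test[,propertyVars], 
                    outsider.method = "depth.Mahalanobis")
cat("Classification error rate (predict): ", 
    sum(unlist(classes2) != data$test[,classVar])/200, ".\n", sep="")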
