
CMA (version 1.30.0)

Planarplot: Visualize Separability of different classes

Description

Given two variables, the method trains a classifier (argument classifier) based on these two variables and plots the resulting class regions together with the learning and test observations in the plane.

Appropriate variables are usually found by GeneSelection; a sketch of that workflow follows below.

For S4 method information, see Planarplot-methods.
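
The sketch below illustrates that workflow: genes are ranked with GeneSelection and the two top-ranked ones are handed to Planarplot via predind. It is only a sketch; the use of GenerateLearningsets, the learnmatrix slot, and the "index" column of the toplist output are assumptions about the CMA interface and not part of this help page.

## Sketch: select two variables with GeneSelection, then visualize them.
## GenerateLearningsets()/toplist() usage and the "index" column are assumptions;
## consult their help pages.
library(CMA)
data(golub)
golubY <- golub[, 1]
golubX <- as.matrix(golub[, -1])
set.seed(321)
ls <- GenerateLearningsets(y = golubY, method = "MCCV", niter = 1,
                           ntrain = floor(2/3 * length(golubY)))
learnind <- ls@learnmatrix[1, ]                   # learning-set indices (slot access assumed)
gsel <- GeneSelection(X = golubX, y = golubY, learningsets = ls, method = "t.test")
top2 <- toplist(gsel, k = 2, show = FALSE)$index  # indices of the two top-ranked genes
Planarplot(X = golubX, y = golubY, learnind = learnind,
           predind = top2, classifier = ldaCMA)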

Usage

Planarplot(X, y, f, learnind, predind, classifier, gridsize = 100, ...)

Arguments

X
Gene expression data. Can be one of the following:
  • A matrix. Rows correspond to observations, columns to variables.
  • A data.frame, when f is not missing (see below).
  • An object of class ExpressionSet.

y
Class labels. Can be one of the following:
  • A numeric vector.
  • A factor.
  • A character string specifying the phenotype variable, if X is an ExpressionSet.
  • missing, if X is a data.frame and a proper formula f is provided.

f
A two-sided formula, if X is a data.frame. The left-hand side corresponds to the class labels, the right-hand side to the variables.
learnind
An index vector specifying the observations that belong to the learning set. May be missing; in that case, the learning set consists of all observations and predictions are made on the learning set.
predind
A vector containing exactly two indices that denote the two variables used for classification.
classifier
Name of the classification function (ending in CMA) to be used as the classifier.
gridsize
The gridsize used for two-dimensional plotting.

For each of the two variables specified in predind, an equidistant grid of size gridsize is created. The two grids are then combined to obtain gridsize^2 points in the plane, which are used to draw the class regions (see the sketch after this argument list). Defaults to 100, which is usually a reasonable choice but takes some computation time.

...
Further arguments passed to classifier.
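
To make the gridsize description concrete, the following base-R sketch builds such a grid for two toy variables. It only illustrates the idea described above and is not the internal code of Planarplot.

## Not Planarplot's internal code: shows how gridsize^2 evaluation points
## arise from one equidistant grid per variable.
gridsize <- 100
x1 <- rnorm(50)                                   # toy values of the first plotted variable
x2 <- rnorm(50)                                   # toy values of the second plotted variable
g1 <- seq(min(x1), max(x1), length.out = gridsize)
g2 <- seq(min(x2), max(x2), length.out = gridsize)
grid <- expand.grid(g1, g2)                       # gridsize^2 rows, one per point in the plane
nrow(grid)                                        # 10000 for the default gridsize = 100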

Value

No return value; the function is called for its side effect of producing a plot.

See Also

GeneSelection, compBoostCMA, dldaCMA, ElasticNetCMA, fdaCMA, flexdaCMA, gbmCMA, knnCMA, ldaCMA, LassoCMA, nnetCMA, pknnCMA, plrCMA, pls_ldaCMA, pls_lrCMA, pls_rfCMA, pnnCMA, qdaCMA, rfCMA, scdaCMA, shrinkldaCMA, svmCMA

Examples

### simple linear discrimination for the golub data:
library(CMA)
data(golub)
golubY <- golub[,1]                  # first column: class labels
golubX <- as.matrix(golub[,-1])      # remaining columns: gene expression values
golubn <- nrow(golubX)
set.seed(111)
learnind <- sample(golubn, size=floor(2/3*golubn))  # 2/3 of the observations form the learning set
Planarplot(X=golubX, y=golubY, learnind=learnind, predind=c(2,4),
           classifier=ldaCMA)
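
Classifier-specific arguments can be forwarded through ...; the variant below is a sketch that assumes k is passed on to knnCMA (see its help page).

## Sketch: same plot with a nearest-neighbour classifier; 'k' is assumed to be
## forwarded through '...' to knnCMA.
Planarplot(X=golubX, y=golubY, learnind=learnind, predind=c(2,4),
           classifier=knnCMA, k=5)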
