kknn (version 1.4.0)

predict.train.kknn: Training kknn

Description

Training of kknn method via leave-one-out (train.kknn) or k-fold (cv.kknn) cross-validation.

Usage

# S3 method for train.kknn
predict(object, newdata, ...)

train.kknn(formula, data, kmax = 11, ks = NULL, distance = 2,
  kernel = "optimal", ykernel = NULL, scale = TRUE,
  contrasts = c(unordered = "contr.dummy", ordered = "contr.ordinal"), ...)

# S3 method for train.kknn
print(x, ...)

# S3 method for train.kknn
summary(object, ...)

# S3 method for train.kknn
plot(x, ...)

cv.kknn(formula, data, kcv = 10, ...)

Value

train.kknn returns a list object of class train.kknn including the following components.

MISCLASS

Matrix of misclassification errors.

MEAN.ABS

Matrix of mean absolute errors.

MEAN.SQU

Matrix of mean squared errors.

fitted.values

List of predictions for all combinations of kernel and k.

best.parameters

List containing the best parameter value for kernel and k.

response

Type of response variable, one of continuous, nominal or ordinal.

distance

Parameter of Minkowski distance.

call

The matched call.

terms

The 'terms' object used.

Arguments

object

A model object for which prediction is desired.

newdata

A data frame in which to look for variables with which to predict.

...

Further arguments passed to or from other methods.

formula

A formula object.

data

Matrix or data frame.

kmax

Maximum value of k, if ks is not specified.

ks

A vector specifying values of k. If not null, this takes precedence over kmax.

distance

Parameter of Minkowski distance.

kernel

Kernel to use. Possible choices are "rectangular" (which is standard unweighted knn), "triangular", "epanechnikov" (or beta(2,2)), "biweight" (or beta(3,3)), "triweight" (or beta(4,4)), "cos", "inv", "gaussian" and "optimal".

ykernel

Window width of a y-kernel, especially for prediction of ordinal classes.

scale

Logical; if TRUE, scale the variables to have equal standard deviation.

contrasts

A vector containing the 'unordered' and 'ordered' contrasts to use.

x

An object of class train.kknn.

kcv

Number of partitions for k-fold cross validation.

Author

Klaus P. Schliep klaus.schliep@gmail.com

Details

train.kknn performs leave-one-out cross-validation and is computationally very efficient. cv.kknn performs k-fold cross-validation; it is generally slower and does not yet include testing of different models.
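To sketch the difference described above, the following uses the glass data shipped with kknn (also used in Examples below): train.kknn tunes k and kernel via leave-one-out cross-validation, while cv.kknn evaluates a single model via k-fold cross-validation. The printed components are illustrative; exact output layout may vary by package version.

```r
library(kknn)

data(glass)
glass <- glass[, -1]

# Leave-one-out CV over a grid of k values; tunes k and kernel:
fit <- train.kknn(Type ~ ., glass, kmax = 15)
fit$best.parameters

# 5-fold CV for one model (kcv = 5); no tuning across models:
set.seed(1)
cv <- cv.kknn(Type ~ ., glass, kcv = 5)
```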

References

Hechenbichler K. and Schliep K.P. (2004) Weighted k-Nearest-Neighbor Techniques and Ordinal Classification, Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi:10.5282/ubm/epub.1769

Hechenbichler K. (2005) Ensemble-Techniken und ordinale Klassifikation, PhD thesis.

Samworth, R.J. (2012) Optimal weighted nearest neighbour classifiers. Annals of Statistics, 40, 2733-2763. (available from http://www.statslab.cam.ac.uk/~rjs57/Research.html)

See Also

kknn

Examples

library(kknn)
if (FALSE) {
data(miete)
(train.con <- train.kknn(nmqm ~ wfl + bjkat + zh, data = miete, 
	kmax = 25, kernel = c("rectangular", "triangular", "epanechnikov",
	"gaussian", "rank", "optimal")))
plot(train.con)
(train.ord <- train.kknn(wflkat ~ nm + bjkat + zh, miete, kmax = 25,
 	kernel = c("rectangular", "triangular", "epanechnikov", "gaussian", 
 	"rank", "optimal")))
plot(train.ord)
(train.nom <- train.kknn(zh ~ wfl + bjkat + nmqm, miete, kmax = 25, 
	kernel = c("rectangular", "triangular", "epanechnikov", "gaussian", 
	"rank", "optimal")))
plot(train.nom)
}
data(glass)
glass <- glass[,-1]
(fit.glass1 <- train.kknn(Type ~ ., glass, kmax = 15, kernel = 
	c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 1))
(fit.glass2 <- train.kknn(Type ~ ., glass, kmax = 15, kernel = 
	c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 2))
plot(fit.glass1)
plot(fit.glass2)
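The examples above fit and plot tuned models but never call the predict method this page documents. A minimal, self-contained sketch of predict.train.kknn, reusing the first rows of glass as stand-in new data (a smaller kernel grid is used here only to keep the example quick):

```r
library(kknn)

data(glass)
glass <- glass[, -1]

fit <- train.kknn(Type ~ ., glass, kmax = 15,
                  kernel = c("triangular", "rectangular"))

# predict.train.kknn applies the best k and kernel found during training
pred <- predict(fit, glass[1:5, ])
pred
```

Because Type is a factor, the predictions are class labels; for a continuous response they would be numeric fitted values.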