chemometrics (version 1.4.4)

nnetEval: Neural network evaluation by CV

Description

Evaluation for Artificial Neural Network (ANN) classification by cross-validation

Usage

nnetEval(X, grp, train, kfold = 10, decay = seq(0, 10, by = 1), size = 30, 
maxit = 100, plotit = TRUE, legend = TRUE, legpos = "bottomright", ...)

Value

trainerr

training error rate

testerr

test error rate

cvMean

mean of CV errors

cvSe

standard error of CV errors

cverr

all errors from CV

decay

value(s) for weight decay, taken from input

size

value(s) for number of hidden units, taken from input

Arguments

X

standardized complete X data matrix (training and test data)

grp

factor with groups for complete data (training and test data)

train

row indices of X indicating training data objects

kfold

number of folds for cross-validation

decay

weight decay (see nnet); can be a vector of several values, but then "size" must be a single value

size

number of hidden units (see nnet); can be a vector of several values, but then "decay" must be a single value; see the sketch after this argument list

maxit

maximum number of iterations for the ANN, see nnet

plotit

if TRUE a plot will be generated

legend

if TRUE a legend will be added to the plot

legpos

positioning of the legend in the plot

...

additional plot arguments
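
Only one of "decay" and "size" can be a vector in a single call. The following minimal sketch shows the two resulting tuning modes; the data preparation mirrors the Examples section below, and the parameter grids are chosen purely for illustration.

library(chemometrics)
library(nnet)

data(fgl, package = "MASS")
X <- scale(fgl[, 1:9])            # standardized complete X data matrix
grp <- fgl$type                   # group memberships
set.seed(123)
train <- sample(1:nrow(X), round(nrow(X) * 2/3))

# Mode 1: several decay values, a single number of hidden units
res_decay <- nnetEval(X, grp, train,
                      decay = c(0, 0.01, 0.1, 0.5, 1), size = 5, maxit = 20)

# Mode 2: several numbers of hidden units, a single decay value
res_size <- nnetEval(X, grp, train,
                     size = c(2, 5, 10, 20), decay = 0.1, maxit = 20)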

Author

Peter Filzmoser <P.Filzmoser@tuwien.ac.at>

Details

The data are split into a calibration set and a test set (the calibration rows are given by "train"). Within the calibration set, "kfold"-fold CV is performed by applying the classification method to "kfold"-1 parts and evaluating it on the remaining part. The misclassification error is then computed for the training data, for the CV test data (CV error), and for the test data.
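
The sketch below illustrates this scheme for a single combination of "decay" and "size" using nnet directly. It is not the package implementation; the function eval_scheme and all variable names are chosen only for illustration.

library(nnet)

# Illustration of the evaluation scheme for one (decay, size) combination.
# X: complete data matrix, grp: factor of groups, train: calibration row indices.
eval_scheme <- function(X, grp, train, kfold = 10, decay = 0.1, size = 5, maxit = 100) {
  dat <- data.frame(grp = grp, X)
  misclass <- function(fit, newdata, truth) {
    mean(predict(fit, newdata, type = "class") != truth)
  }

  # training and test error of a net fitted on the whole calibration set
  fit_all  <- nnet(grp ~ ., data = dat[train, ], size = size, decay = decay,
                   maxit = maxit, trace = FALSE)
  trainerr <- misclass(fit_all, dat[train, ], grp[train])
  testerr  <- misclass(fit_all, dat[-train, ], grp[-train])

  # "kfold"-fold CV within the calibration set: fit on kfold-1 parts,
  # evaluate on the remaining part
  folds <- sample(rep(1:kfold, length.out = length(train)))
  cverr <- sapply(1:kfold, function(i) {
    fit <- nnet(grp ~ ., data = dat[train[folds != i], ], size = size,
                decay = decay, maxit = maxit, trace = FALSE)
    misclass(fit, dat[train[folds == i], ], grp[train[folds == i]])
  })

  list(trainerr = trainerr, testerr = testerr,
       cvMean = mean(cverr), cvSe = sd(cverr) / sqrt(kfold), cverr = cverr)
}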

References

K. Varmuza and P. Filzmoser: Introduction to Multivariate Statistical Analysis in Chemometrics. CRC Press, Boca Raton, FL, 2009.

See Also

nnet (package nnet)

Examples

library(chemometrics)
library(nnet)

data(fgl, package = "MASS")            # forensic glass data
grp <- fgl$type                        # glass type = group membership
X <- scale(fgl[, 1:9])                 # standardized complete X data matrix
k <- length(unique(grp))               # number of groups
dat <- data.frame(grp, X)              # combined data frame (not used below)
n <- nrow(X)
ntrain <- round(n * 2/3)               # 2/3 of the objects for calibration
set.seed(123)
train <- sample(1:n, ntrain)           # row indices of the training objects
resnnet <- nnetEval(X, grp, train,
                    decay = c(0, 0.01, 0.1, 0.15, 0.2, 0.3, 0.5, 1),
                    size = 20, maxit = 20)
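
The components listed under Value can then be inspected directly; for example, the decay value with the smallest mean CV error can be picked as follows (this assumes cvMean is ordered like the decay vector supplied above):

resnnet$trainerr     # training error rate
resnnet$testerr      # test error rate
resnnet$cvMean       # mean CV error for each decay value
resnnet$cvSe         # standard error of the CV errors

best <- which.min(resnnet$cvMean)   # position of the smallest mean CV error
resnnet$decay[best]                 # corresponding weight decay value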
