
DiceEval (version 1.6.1)

modelComparison: Comparison of different types of metamodels

Description

modelComparison fits several types of metamodels and returns the R2 and RMSE criteria for each of them.

Usage

modelComparison(X, Y, type = "all", K = 10, test = NULL, ...)

Value

A list containing two fields if the argument test is NULL and three fields otherwise:

Learning

R2 and RMSE criteria evaluated on the learning set,

CV

Q2 and RMSE_CV criteria using K-fold cross-validation,

Test

R2 and RMSE criteria on the test set.

A plot comparing the values of these criteria across the fitted models is also produced.
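
For instance, a minimal sketch of inspecting the result, assuming the returned list exposes the components named above (X and Y are defined as in the Examples section below):

crit <- modelComparison(X, Y, type = c("Linear", "Additive"), K = 10)
crit$Learning   # R2 and RMSE on the learning set, one value per model
crit$CV         # Q2 and RMSE_CV from the 10-fold cross-validation
# crit$Test is present only when a test set is supplied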

Arguments

X

a data.frame containing the design of experiments

Y

a vector containing the response variable

type

a vector containing the types of models to compare.

The default value "all" corresponds to c("Linear", "StepLinear", "Additive", "PolyMARS", "MARS", "Kriging")

K

the number of folds for the K-fold cross-validation (the default value is 10)

test

a data.frame containing both the design and the response of a test set, when available; the prediction criteria are then evaluated on this test design (the default NULL corresponds to no test set)

...

additional parameters passed to the fitting procedure selected by the type argument (for example, formula and penalty for a stepwise procedure)

Author

D. Dupuy

See Also

modelFit and crossValidation
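
For a single metamodel, the learning and cross-validation criteria can be reproduced by hand. This is a minimal sketch, assuming the DiceEval helpers modelFit, modelPredict, R2, RMSE and crossValidation behave as documented in this package version:

mod  <- modelFit(X, Y, type = "Linear")   # fit one metamodel
pred <- modelPredict(mod, X)              # predictions on the learning design
R2(Y, pred)                               # learning R2 criterion
RMSE(Y, pred)                             # learning RMSE criterion
crossValidation(mod, K = 10)              # Q2 and RMSE_CV by 10-fold cross-validation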

Examples

# Load the 5D IRSN data set shipped with DiceEval
data(dataIRSN5D)
X <- dataIRSN5D[, 1:5]   # design of experiments
Y <- dataIRSN5D[, 6]     # response variable
data(testIRSN5D)         # test set (design and response)

# Packages required by the Additive, MARS and PolyMARS metamodels
library(gam)
library(mda)
library(polspline)

# Compare all available metamodels on the learning set,
# by cross-validation and on the test set
crit <- modelComparison(X, Y, type = "all", test = testIRSN5D)

# Compare five stepwise linear models with different penalty values
crit2 <- modelComparison(X, Y, type = rep("StepLinear", 5), test = testIRSN5D,
                         penalty = c(1, 2, 5, 10, 20), formula = Y ~ .^2)
