
kernDeepStackNet (version 2.0.2)

tuneMboLevelGcvKDSN: Tuning of KDSN with efficient global optimization, given the number of levels, by generalized cross-validation

Description

Implements the efficient global optimization (EGO) algorithm based on Kriging for kernel deep stacking networks (KDSN). The out-of-sample error is approximated by generalized cross-validation (GCV), so no external validation data are required. Preselection of variables is based on the randomized dependence coefficient (RDC).
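
A minimal sketch of the intended usage, assuming y is an n x 1 response matrix and X a numeric design matrix (illustrative only, not run):

# tunedKDSN <- tuneMboLevelGcvKDSN(y=y, X=X, levels=1)
# All remaining tuning parameters have defaults; see Usage below.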

Usage

tuneMboLevelGcvKDSN(y, X, levels=1, alpha=rep(0, levels), fineTuneIt=100, 
                    nStepMult=20, designMult=10, 
                    dimMax=round(sqrt(dim(X)[1])/2), addInfo=TRUE,
                    maxRunsMult=1, repMult=1, 
                    tol_input=.Machine$double.eps^0.25, 
                    EIopt="1Dmulti", GenSAmaxCall=100,
                    varSelect=rep(FALSE, levels), rdcRep=1, 
                    dropHidden=rep(FALSE, levels),
                    standX=TRUE, standY=FALSE, timeAlloc="constant", 
                    varPreSelect=FALSE, varPreSelpopSize=100, varPreSelMaxiter=100,
                    EItype="EQI")

Arguments

y

Response matrix with one column.

X

Design matrix. All factors must already be encoded.

levels

Maximum number of levels for the kernel deep stacking network (integer scalar).

alpha

Weight parameter between the lasso and ridge penalties (numeric vector, one entry per level). The default 0 corresponds to a pure ridge penalty; 1 corresponds to the lasso.
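
A short sketch of the per-level encoding (illustrative, not run):

# Two-level network: ridge penalty at level 1, equal lasso/ridge mix at level 2
# tuneMboLevelGcvKDSN(y=y, X=X, levels=2, alpha=c(0, 0.5))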

fineTuneIt

Number of random weight matrices drawn during fine-tuning (integer scalar). If set to zero, no fine-tuning is performed.

nStepMult

Multiplier that scales the number of steps of the EGO algorithm with the number of parameters to estimate.

designMult

Multiplier that scales the number of initial design points evaluated in the loss function with the number of parameters to estimate.

dimMax

Maximal dimension of the random Fourier transformation. The effective number of parameters is dimMax*2. The default heuristic depends on the sample size.
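
The default heuristic can be reproduced directly; for example, with 100 observations:

n <- 100          # number of rows of X
round(sqrt(n)/2)  # default dimMax = 5, i.e. 2*5 = 10 effective parameters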

addInfo

Should additional information be printed during estimation? Default is TRUE.

maxRunsMult

Multiplies the base number of iterations in the conditional one-dimensional optimization. Default is to use the number of hyperparameters. See optimize1dMulti.

repMult

Multiplies the base number of random starting values in the conditional one-dimensional optimization to avoid local optima. Default is to use the number of hyperparameters. See optimize1dMulti.

tol_input

Convergence criterion of each one-dimensional sub-optimization. Lower values are more accurate, but require many more function evaluations. Default is the fourth root of the machine double accuracy. See optimize1dMulti.

EIopt

Specifies which algorithm is used to optimize the expected improvement criterion. Two alternatives are available: "1Dmulti" and "GenSA". The former uses the conditional one-dimensional algorithm and the latter generalized simulated annealing.

GenSAmaxCall

Maximum number of function calls per parameter to estimate in generalized simulated annealing. Higher values give more accurate estimates, but slow down the optimization.
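
A sketch of switching the inner optimizer (illustrative, not run):

# Generalized simulated annealing with a reduced call budget per parameter
# tuneMboLevelGcvKDSN(y=y, X=X, levels=1, EIopt="GenSA", GenSAmaxCall=50)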

varSelect

Specifies whether variables should be preselected using the randomized dependence coefficient (logical vector, one entry per level). Default is no variable selection at any level; a combined sketch follows the dropHidden entry below.

rdcRep

Number of repetitions of the randomized dependence coefficient; see rdcVarOrder.

dropHidden

Should dropout be applied to the random Fourier transformation? Each entry corresponds to one level (logical vector). Default is no dropout.
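
A combined sketch of the per-level switches varSelect, rdcRep and dropHidden (illustrative, not run):

# Two-level network: RDC variable selection on level 1 only, dropout on
# level 2 only; 5 RDC repetitions stabilize the variable ranking
# tuneMboLevelGcvKDSN(y=y, X=X, levels=2,
#                     varSelect=c(TRUE, FALSE), rdcRep=5,
#                     dropHidden=c(FALSE, TRUE))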

standX

Should the design matrix be standardized by median and median absolute deviation? Default is TRUE.

standY

Should the response be standardized by median and median absolute deviation? Default is FALSE.

timeAlloc

Specifies how the new noise variance is influenced by the iteration progress. Default is "constant" allocation. The other available option, "zero", always sets the future noise variance to zero.

varPreSelect

Should variables be pre-selected using the RDC and a genetic algorithm? Default is no. May consume considerable start-up time.

varPreSelpopSize

Population size of the genetic algorithm (integer scalar).

varPreSelMaxiter

Maximum number of generations of the genetic algorithm (integer scalar).

EItype

Defines the type of improvement criterion. The default "EQI" corresponds to the expected quantile improvement; the alternative "EI" uses the plain expected improvement.
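
A sketch (illustrative, not run):

# Plain expected improvement instead of the quantile-based default
# tuneMboLevelGcvKDSN(y=y, X=X, levels=1, EItype="EI")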

Value

Returns the best tuned kernel deep stacking network of class k_DSN_rft for the given number of levels (see fitKDSN).

References

Lopez-Paz, D., Hennig, P. and Schoelkopf, B. (2013). The Randomized Dependence Coefficient. Max Planck Institute for Intelligent Systems, Germany.

Picheny, V., Ginsbourger, D. and Richet, Y. (2012). Quantile-based Optimization of Noisy Computer Experiments with Tunable Precision. HAL archives-ouvertes.fr, hal-00578550v3.

See Also

km, leaveOneOut.km, maximinLHS, mboAll, mbo1d

Examples

# Generate a small sample of 20 observations for a binary classification task.
# To keep the example as fast as possible, the tuning parameters are set for
# low accuracy. Higher values of tol_input, fineTuneIt, maxRunsMult and
# repMult will improve performance considerably.
library(kernDeepStackNet)
library(pROC)

# Generate design matrix
sampleSize <- 20
X <- matrix(0, nrow=sampleSize, ncol=5)
for(j in 1:5) {
  set.seed(j)
  X[, j] <- rnorm(sampleSize)
}

# Generate binary response: 1 if rowSums(X) plus noise is positive, else 0.
# The function expects a response matrix with one column.
set.seed(-1)
error <- rnorm(sampleSize)
y <- matrix(ifelse((rowSums(X) + error) > 0, 1, 0), ncol=1)

# Generate test data
Xtest <- matrix(0, nrow=sampleSize, ncol=5)
for(j in 1:5) {
  set.seed(j*2+1)
  Xtest[, j] <- rnorm(sampleSize)
}

# Generate test response
set.seed(-10)
error <- rnorm(sampleSize)
ytest <- ifelse((rowSums(Xtest) + error) > 0, 1, 0)

# Tune the kernel deep stacking network (tuning uses the internal GCV
# criterion) and evaluate it by AUC on the test data
tuned_KDSN_EGO_level <- tuneMboLevelGcvKDSN(y=y, X=X, levels=2,
                                            fineTuneIt=10, nStepMult=2,
                                            designMult=5)
preds <- predict(tuned_KDSN_EGO_level, newx=Xtest)
auc(response=ytest, predictor=c(preds))
