Fine-tuning function for deep architectures. This function uses the
function saved in the fineTuneFunction attribute to train the deep
architecture.
fineTuneDArch(darch, dataSet, dataSetValid = NULL, numEpochs = 1,
  isClass = TRUE, stopErr = -Inf, stopClassErr = 101,
  stopValidErr = -Inf, stopValidClassErr = 101, shuffleTrainData = T,
  debugMode = F, ...)

darch: The DArch instance to be trained.

dataSet: The DataSet containing the training data.

dataSetValid: Optional DataSet containing the validation data.
Default is NULL.

numEpochs: The number of training iterations. Default is 1.

isClass: Indicates whether the training is for a classification net.
When TRUE, classification statistics will be determined. Default is
TRUE.

stopErr: Stop criterion for the error on the training data. Default
is -Inf.

stopClassErr: Stop criterion for the correct classifications on the
training data. Default is 101.

stopValidErr: Stop criterion for the error on the validation data.
Default is -Inf.

stopValidClassErr: Stop criterion for the correct classifications on
the validation data. Default is 101.

shuffleTrainData: Whether to shuffle the training data before each
epoch. Default is T (TRUE).

debugMode: Whether to enable debug mode; internal parameter. Default
is F (FALSE).

...: Additional parameters passed on to the training function.
The function trains the given network darch with the function saved
in the fineTuneFunction attribute of the DArch object. The data and
classes for validation and testing are optional. If they are
provided, the network will be executed on these datasets and
statistics will be calculated. These statistics are saved in the
stats attribute (see Net). It is also possible to set stop criteria
for the training on the error (stopErr, stopValidErr) or on the
correct classifications (stopClassErr, stopValidClassErr) of the
training or validation dataset.
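Because training delegates entirely to whatever function is stored in
the fineTuneFunction attribute, that attribute must be set before
calling fineTuneDArch. A minimal sketch; the replacement generic
setFineTuneFunction<- is assumed here to be the darch package's setter
for this attribute, and the choice of backpropagation is only an
example:

```r
library(darch)

# Assumed: `darch` is an existing DArch instance created with the
# package's constructor. Any of the fine-tuning functions listed under
# "See also" (backpropagation, rpropagation, minimizeAutoencoder,
# minimizeClassifier) could be stored in the fineTuneFunction
# attribute; backpropagation is used here for illustration.
setFineTuneFunction(darch) <- backpropagation
```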
See also: DArch, Net, backpropagation, rpropagation,
minimizeAutoencoder, minimizeClassifier
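A sketch of a typical call, in the spirit of an Examples section. It
assumes `darch` is a DArch instance whose fineTuneFunction attribute
has already been set (e.g. to backpropagation), and that `dataSet` and
`dataSetValid` are existing DataSet objects; the slot access at the
end follows from the stats attribute described above:

```r
library(darch)

# Fine-tune for at most 100 epochs; isClass = TRUE enables the
# classification statistics, so training stops early once 100% of the
# training patterns, or 95% of the validation patterns, are classified
# correctly.
darch <- fineTuneDArch(darch, dataSet,
                       dataSetValid      = dataSetValid,
                       numEpochs         = 100,
                       isClass           = TRUE,
                       stopClassErr      = 100,
                       stopValidClassErr = 95)

# The statistics gathered during training are stored in the stats
# attribute inherited from Net.
str(darch@stats)
```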