
deeplearning (version 0.1.0)

finetune_SGD_bn: Updates a deep neural network's parameters using the stochastic gradient descent method and batch normalization

Description

This function fine-tunes a DArch network using the stochastic gradient descent (SGD) approach.

Usage

finetune_SGD_bn(darch, trainData, targetData, learn_rate_weight = exp(-10),
  learn_rate_bias = exp(-10), learn_rate_gamma = exp(-10),
  errorFunc = meanSquareErr, with_BN = T)

Arguments

darch
a darch instance
trainData
training input
targetData
training target
learn_rate_weight
learning rate for the weight matrices
learn_rate_bias
learning rate for the biases
learn_rate_gamma
learning rate for the gammas
errorFunc
the error function to minimize during training
with_BN
logical value; TRUE to train the neural network with batch normalization

Value

The darch instance with its parameters updated by stochastic gradient descent.
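
Examples

The following is a minimal sketch of fine-tuning a small network on toy regression data. It assumes new_dnn() from the same package is used to construct the network; the layer sizes, learning rates, and data are illustrative only.

library(deeplearning)

set.seed(1)
# toy regression data: 100 samples, 10 inputs, one target column
x <- matrix(runif(1000), nrow = 100, ncol = 10)
y <- matrix(rowSums(x), ncol = 1)

# build a small 10-5-1 network (new_dnn() is an assumed helper from this package)
dnn <- new_dnn(c(10, 5, 1))

# one SGD fine-tuning pass with batch normalization enabled
dnn <- finetune_SGD_bn(dnn,
                       trainData  = x,
                       targetData = y,
                       learn_rate_weight = exp(-8),
                       learn_rate_bias   = exp(-8),
                       learn_rate_gamma  = exp(-8),
                       errorFunc = meanSquareErr,
                       with_BN   = TRUE)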