Description

A function to test the gradient evaluation of a neural network by comparing the exact gradient, computed via backpropagation, with a central finite-differencing approximation (a standalone sketch of this check appears after the Arguments below).
Usage

NNgrad_test(net, loss = Qloss(), eps = 1e-05)
Value

The exact gradients (computed via backpropagation) and the approximate gradients (computed via central finite differencing), along with a plot of one against the other.
Arguments

net     an object of class network; see ?network
loss    a loss function to compute; see ?Qloss and ?multinomial
eps     small value used in the central finite-differencing computation; default 1e-05
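
Central finite differencing approximates each partial derivative of the loss as (L(theta + eps*e_i) - L(theta - eps*e_i)) / (2*eps), where e_i is the i-th unit vector; the truncation error is O(eps^2), which is why a small step such as the default 1e-05 suffices. As an illustration only, a gradient check of this kind can be sketched in plain R as follows (fd_grad_check is a hypothetical helper, not deepNN's implementation):

# Standalone sketch of a central finite-differencing gradient check
# (illustrative only; not the deepNN implementation).
# f: a scalar loss of a parameter vector; grad: its claimed exact gradient.
fd_grad_check <- function(f, grad, theta, eps = 1e-05) {
  approx <- numeric(length(theta))
  for (i in seq_along(theta)) {
    up <- theta; up[i] <- up[i] + eps
    dn <- theta; dn[i] <- dn[i] - eps
    approx[i] <- (f(up) - f(dn)) / (2 * eps)  # central difference
  }
  exact <- grad(theta)
  plot(exact, approx); abline(0, 1)           # points should fall on the line y = x
  max(abs(exact - approx))                    # largest discrepancy
}

# Check a quadratic loss f(theta) = sum(theta^2), whose exact gradient is 2*theta
fd_grad_check(function(th) sum(th^2), function(th) 2 * th, rnorm(5))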
References

Goodfellow, I., Bengio, Y. and Courville, A. Deep Learning. MIT Press (2016).

Sejnowski, T. J. The Deep Learning Revolution. MIT Press (2018).

Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi

Nielsen, M. Neural Networks and Deep Learning: http://neuralnetworksanddeeplearning.com/
See Also

network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation
Examples

net <- network(dims = c(5, 10, 2), activ = list(ReLU(), softmax()))
NNgrad_test(net)
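
The loss and eps arguments shown in the usage above can also be varied, for instance to see the finite-differencing approximation degrade with a larger step size (the second call assumes multinomial() constructs a loss object analogously to Qloss()):

NNgrad_test(net, loss = Qloss(), eps = 1e-03)
NNgrad_test(net, loss = multinomial())  # assumed: multinomial() builds a loss like Qloss()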