Function to initialize the weights and biases in a neural network. It uses the Nguyen-Widrow (1990) algorithm.
Usage
initnw(neurons,p,n,npar)
Value
A list containing the initial values for the weights and biases. The first \(s\) components of the list contain vectors with the initial values for
the weights and biases of the corresponding neurons; the \(k\)-th such component is \((\omega_k, b_k, \beta_1^{(k)},\dots,\beta_p^{(k)})'\).
Arguments
neurons
Number of neurons.
p
Number of predictors.
n
Number of cases.
npar
Number of parameters to be estimated (weights and biases only); it should be equal to \(neurons \times (1+1+p)+1\) (see the illustrative call below).
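As an illustration only, the call below uses arbitrary values (3 neurons, 2 predictors, 10 cases) chosen for this example, with npar computed from the formula above; it assumes the function is available in the current R session.

# Illustrative call: 3 neurons, 2 predictors, 10 cases
# npar = 3*(1+1+2)+1 = 13, as required by the formula above
initnw(neurons=3, p=2, n=10, npar=3*(1+1+2)+1)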
Details
The algorithm is described in Nguyen and Widrow (1990) and in other books, see for example Sivanandam and Sumathi (2005); it is briefly summarized below.
1. Compute the scaling factor \(\theta=0.7\, p^{1/n}\).
2. Initialize the weights and biases for each neuron at random, for example by generating random numbers from \(U(-0.5,0.5)\).
3. For each neuron \(k\): compute \(\eta_k=\sqrt{\sum_{j=1}^p \left(\beta_j^{(k)}\right)^2}\), rescale the weights as \(\beta_j^{(k)} \leftarrow \theta\,\beta_j^{(k)}/\eta_k\), \(j=1,\dots,p\), and update the bias \(b_k\) by generating a random number from \(U(-\theta,\theta)\).
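The R sketch below follows the three steps above. It is a minimal illustration under the formulas stated here, not the package's actual implementation; the helper name nw_init is hypothetical, and it ignores the npar argument, which only fixes the expected length of the parameter vector.

# Minimal sketch of the steps above (illustrative only; nw_init is a
# hypothetical helper, not the package's implementation)
nw_init <- function(neurons, p, n) {
  theta <- 0.7 * p^(1/n)                 # step 1: scaling factor
  lapply(1:neurons, function(k) {
    # step 2: random starting values for omega_k, b_k, beta_1,...,beta_p
    omega_k <- runif(1, -0.5, 0.5)
    b_k     <- runif(1, -0.5, 0.5)
    beta_k  <- runif(p, -0.5, 0.5)
    # step 3: rescale the beta's to Euclidean norm theta, redraw the bias
    eta_k  <- sqrt(sum(beta_k^2))
    beta_k <- theta * beta_k / eta_k
    b_k    <- runif(1, -theta, theta)
    c(omega_k, b_k, beta_k)              # (omega_k, b_k, beta_1,...,beta_p)'
  })
}

# Example: initial values for 3 neurons, 2 predictors, 10 cases
str(nw_init(neurons = 3, p = 2, n = 10))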
References
Nguyen, D. and Widrow, B. 1990. "Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights",
Proceedings of the IJCNN, 3, 21-26.
Sivanandam, S.N. and Sumathi, S. 2005. Introduction to Neural Networks Using MATLAB 6.0. Ed. McGraw Hill, First edition.