The error functions measure the goodness of fit of a neural network according to a certain criterion:
LMS: Least Mean Squares Error.
LMLS: Least Mean Log Squares minimization.
TAO: TAO error minimization.
The deltaE functions calculate the influence functions of their respective error criteria.
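As an illustration, plausible closed forms for the LMS and LMLS criteria and their influence functions can be sketched in R as follows; the scaling constants used internally may differ, and the TAO expressions, which depend on the net's S parameter, are omitted here.

  # Illustrative sketch only, not the package's exact implementation.
  # The influence functions are the derivatives of the errors with respect
  # to the prediction (up to a constant factor for LMS).
  lms.error   <- function(prediction, target) (prediction - target)^2
  lmls.error  <- function(prediction, target) log(1 + (prediction - target)^2 / 2)
  lms.deltaE  <- function(prediction, target) (prediction - target)
  lmls.deltaE <- function(prediction, target) {
     residual <- prediction - target
     residual / (1 + residual^2 / 2)
  }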
error.LMS(arguments)
error.LMLS(arguments)
error.TAO(arguments)
deltaE.LMS(arguments)
deltaE.LMLS(arguments)
deltaE.TAO(arguments)
List of arguments to pass to the functions:
The first element is the prediction of the neuron.
The second element is the corresponding component of the target vector.
The third element is the whole net. This allows the TAO criterion to know the value of the S parameter and will eventually (in a future minor update) allow the user to apply regularization criteria.
These functions return the value of the error or of the influence function according to the chosen criterion.
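A minimal usage sketch follows, assuming a network object (here called net, e.g. one created with newff) is available and that the functions are visible in the current namespace; in normal use they are invoked internally during training rather than called directly.

  # Minimal sketch: build the argument list described above and evaluate
  # the LMS error and its influence function for a single output component.
  prediction <- 0.8                             # output of one neuron
  target     <- 1.0                             # corresponding target component
  arguments  <- list(prediction, target, net)   # third element needed by the TAO criterion
  error.LMS(arguments)                          # error value under the LMS criterion
  deltaE.LMS(arguments)                         # corresponding influence function value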
Pernía Espinoza, A.V., Ordieres Meré, J.B., Martínez de Pisón, F.J., González Marcos, A. TAO-robust backpropagation learning algorithm. Neural Networks, Vol. 18, Issue 2, pp. 191--204, 2005.
Simon Haykin. Neural Networks -- A Comprehensive Foundation. Prentice Hall, New Jersey, 2nd edition, 1999. ISBN 0-13-273350-1.