maxLik (version 1.5-2.1)

numericGradient: Functions to Calculate Numeric Derivatives

Description

Calculate the (central) numeric gradient and Hessian, also of vector-valued functions.

Usage

numericGradient(f, t0, eps=1e-06, fixed, ...)
numericHessian(f, grad=NULL, t0, eps=1e-06, fixed, ...)
numericNHessian(f, t0, eps=1e-6, fixed, ...)

Value

Matrix. For numericGradient, the number of rows is equal to the length of the function value vector, and the number of columns is equal to the length of the parameter vector.

For numericHessian, both the number of rows and the number of columns equal the length of the parameter vector.
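
For instance (a small illustrative check, not part of the package examples):

f0 <- function(t0) exp(-t0[1]^2 - t0[2]^2)   # scalar-valued test function
dim(numericGradient(f0, t0=c(1, 2)))   # 1 2: one function value, two parameters
dim(numericHessian(f0, t0=c(1, 2)))    # 2 2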

Arguments

f

function to be differentiated. The first argument must be the parameter vector with respect to which it is differentiated. For the numeric gradient, f may return a (numeric) vector; for the Hessian, it must return a numeric scalar

grad

function, gradient of f

t0

vector, the parameter values

eps

numeric, the step for numeric differentiation

fixed

logical index vector of fixed parameters. The derivative is calculated only with respect to the parameters for which fixed == FALSE; NA is returned for the fixed parameters. If missing, all parameters are treated as active. See the illustrative example after this argument list.

...

further arguments to f
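
A small illustrative sketch of the fixed and ... arguments (the function fs and its extra argument a are made up for this example):

fs <- function(t0, a) a * exp(-t0[1]^2 - t0[2]^2)
# differentiate with respect to the first parameter only; 'a' is passed on through '...'
numericGradient(fs, t0=c(1, 2), fixed=c(FALSE, TRUE), a=2)
# the column for the second (fixed) parameter is NA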

Warning

Be careful when using numerical differentiation in optimization routines. Although quite precise in simple cases, numeric derivatives may perform poorly in more complicated settings.
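
For example (an illustrative sketch, not from the original documentation), the step eps matters: too small a step lets rounding error dominate a numeric second derivative.

fExp <- function(x) exp(x[1])            # true second derivative at 10 is exp(10)
numericHessian(fExp, t0=10)              # default eps: close to exp(10) (about 22026)
numericHessian(fExp, t0=10, eps=1e-10)   # rounding error swamps the estimate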

Author

Ott Toomet

Details

numericGradient numerically differentiates a (vector-valued) function with respect to its (vector-valued) argument. If the function value is a \(N_{val} \times 1\) vector and the argument is a \(N_{par} \times 1\) vector, the resulting gradient is a \(N_{val} \times N_{par}\) matrix.
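
For example (a minimal sketch with made-up functions), a function returning a length-2 value, differentiated with respect to a length-3 argument, gives a 2 x 3 gradient matrix:

fVec <- function(t0) c(sum(t0^2), prod(t0))
numericGradient(fVec, t0=c(1, 2, 3))   # 2 x 3 matrix
# row 1 is approximately 2*t0, row 2 approximately prod(t0)/t0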

numericHessian checks whether a gradient function is supplied. If it is, the Hessian is computed as the numeric gradient of the gradient; otherwise the full numeric Hessian is computed (numericNHessian).
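
A brief sketch of this dispatch (the functions mirror those in the Examples below; the equality should hold up to the numerical details of the implementation):

f0 <- function(t0) exp(-t0[1]^2 - t0[2]^2)
gradf0 <- function(t0) -2*t0*f0(t0)
all.equal(numericHessian(f0, grad=gradf0, t0=c(1, 2)),
          numericGradient(gradf0, t0=c(1, 2)))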

See Also

compareDerivatives, deriv

Examples

# A simple example with Gaussian bell surface
f0 <- function(t0) exp(-t0[1]^2 - t0[2]^2)
numericGradient(f0, c(1,2))
numericHessian(f0, t0=c(1,2))

# An example with the analytic gradient
gradf0 <- function(t0) -2*t0*f0(t0)
numericHessian(f0, gradf0, t0=c(1,2))
# The results should be similar to those in the previous case

# The central numeric derivatives are often quite precise
compareDerivatives(f0, gradf0, t0=1:2)
# The difference is around 1e-10
