Davidon-Fletcher-Powell method for function minimization.
The Davidon-Fletcher-Powell (DFP) and the Broyden-Fletcher-Goldfarb-Shanno
(BFGS) methods were the first quasi-Newton minimization methods to be
developed. The two methods differ only in some details; in general, the
BFGS approach is the more robust of the two.
Usage
fletcher_powell(x0, f, g = NULL,
maxiter = 1000, tol = .Machine$double.eps^(2/3))
Arguments
x0
start value.
f
function to be minimized.
g
gradient function of f;
if NULL, a numerical gradient will be calculated.
maxiter
max. number of iterations.
tol
relative tolerance, used as the stopping rule.
Value
List with the following components:
xmin
minimum solution found.
fmin
value of f at the minimum.
niter
number of iterations performed.
Details
The starting point is Newton's method in the multivariate case, where
the estimate of the minimum is updated by the following equation
$$x_{new} = x - H^{-1}(x) grad(f)(x)$$
where $H$ is the Hessian and $grad(f)$ the gradient of $f$.
The basic idea is to generate a sequence of good approximations to the
inverse Hessian matrix in such a way that each approximation remains
positive definite; a sketch of the underlying update formula follows.
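As a minimal sketch (the textbook DFP rank-two update, not the package's
internal code; the helper name dfp_update is hypothetical), the inverse
Hessian approximation B can be refreshed from the step s = x_new - x and
the gradient difference y = grad(f)(x_new) - grad(f)(x) like this:

dfp_update <- function(B, s, y) {
  # DFP rank-two update; requires the curvature condition t(s) %*% y > 0
  By <- B %*% y
  B + (s %*% t(s)) / c(t(s) %*% y) - (By %*% t(By)) / c(t(y) %*% By)
}
# Sanity check: the updated B satisfies the secant condition B %*% y == s.
s <- c(0.5, -0.2); y <- c(0.3, 0.1)
B <- dfp_update(diag(2), s, y)
all.equal(c(B %*% y), s)    # TRUE

Preserving positive definiteness under this update is what keeps the
search direction -B %*% grad(f)(x) a descent direction at every step.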
References
J. F. Bonnans, J. C. Gilbert, C. Lemaréchal, and C. A. Sagastizábal.
Numerical Optimization: Theoretical and Practical Aspects. Second Edition,
Springer-Verlag, Berlin Heidelberg, 2006.
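Examples
## A minimal usage sketch based on the call signature documented above:
## minimizing the Rosenbrock test function from the standard starting
## point c(-1.2, 1); the true minimum is at c(1, 1) with value 0.
rosenbrock <- function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
sol <- fletcher_powell(c(-1.2, 1), rosenbrock)
sol$xmin     # approximately c(1, 1)
sol$fmin     # approximately 0
sol$niter    # number of iterations performed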