
pracma (version 1.5.5)

fletcher_powell: Davidon-Fletcher-Powell Method

Description

Davidon-Fletcher-Powell method for function minimization.

The Davidon-Fletcher-Powell (DFP) and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) methods were the first quasi-Newton minimization methods to be developed. The two methods differ only in some details; in general, the BFGS approach is more robust.

Usage

fletcher_powell(x0, f, g = NULL,
                maxiter = 1000, tol = .Machine$double.eps^(2/3))

Arguments

x0
start value.
f
function to be minimized.
g
gradient function of f; if NULL, a numerical gradient will be calculated.
maxiter
max. number of iterations.
tol
relative tolerance, used as the stopping rule.

Value

List with following components:

xmin
minimum solution found.
fmin
value of f at the minimum.
niter
number of iterations performed.

Details

The starting point is Newton's method in the multivariate case, where the estimate of the minimum is updated by the following equation $$x_{new} = x - H^{-1}(x) \nabla f(x)$$ where $H$ is the Hessian of $f$ and $\nabla f$ its gradient.
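
For instance, a single Newton step can be computed with pracma's own grad() and hessian() functions (an illustrative sketch only; fletcher_powell itself never forms the full Hessian, and the quadratic test function below is made up):

library(pracma)
f  <- function(x) (x[1] - 1)^2 + 10*(x[2] + 2)^2   # hypothetical quadratic test function
x0 <- c(0, 0)
x0 - solve(hessian(f, x0), grad(f, x0))            # x - H^{-1}(x) grad(f)(x)
# for a quadratic, one step lands (up to finite-difference error) at the minimum c(1, -2)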

The basic idea is to generate a sequence of good approximations to the inverse Hessian matrix, in such a way that the approximations are again positive definite.
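
For illustration, the classical DFP rank-two update of the inverse-Hessian approximation B can be sketched as follows (the textbook formula, not pracma's internal code); s is the step x_new - x_old and y the corresponding change in the gradient:

dfp_update <- function(B, s, y) {
    # B: current inverse-Hessian approximation (symmetric positive definite)
    # s: x_new - x_old;  y: grad(f)(x_new) - grad(f)(x_old)
    By <- B %*% y
    B + (s %*% t(s)) / c(crossprod(s, y)) - (By %*% t(By)) / c(crossprod(y, By))
}

Positive definiteness is preserved as long as the curvature condition crossprod(s, y) > 0 holds.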

References

J. F. Bonnans, J. C. Gilbert, C. Lemarechal, and C. A. Sagastizabal. Numerical Optimization: Theoretical and Practical Aspects. Second Edition, Springer-Verlag, Berlin Heidelberg, 2006.

See Also

steep_descent

Examples

##  Rosenbrock function
rosenbrock <- function(x) {
    n <- length(x)
    x1 <- x[2:n]
    x2 <- x[1:(n-1)]
    sum(100*(x1-x2^2)^2 + (1-x2)^2)
}
fletcher_powell(c(0, 0), rosenbrock)
# $xmin
# [1] 1 1
# $fmin
# [1] 1.774148e-27
# $niter
# [1] 14
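
An analytic gradient can also be supplied through the g argument; a sketch for the two-dimensional case, with the gradient below derived by hand:

rosen_grad <- function(x) {    # gradient of the 2-D Rosenbrock function
    c(-400*x[1]*(x[2] - x[1]^2) - 2*(1 - x[1]),
       200*(x[2] - x[1]^2))
}
fletcher_powell(c(0, 0), rosenbrock, g = rosen_grad)
# expected to converge to the same minimum, xmin = c(1, 1)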
