
nloptr (version 2.0.3)

slsqp: Sequential Quadratic Programming (SQP)

Description

Sequential (least-squares) quadratic programming (SQP) algorithm for nonlinearly constrained, gradient-based optimization, supporting both equality and inequality constraints.

Usage

slsqp(
  x0,
  fn,
  gr = NULL,
  lower = NULL,
  upper = NULL,
  hin = NULL,
  hinjac = NULL,
  heq = NULL,
  heqjac = NULL,
  nl.info = FALSE,
  control = list(),
  ...
)
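
As a minimal sketch (not part of the package documentation), the call below applies slsqp to a made-up two-dimensional quadratic with box constraints only; fn.toy and its bounds are purely illustrative.

library(nloptr)

## Hypothetical toy problem: minimize (x1 - 1)^2 + (x2 - 2)^2
## subject to the box 0 <= x1, x2 <= 1.5.
fn.toy <- function(x) (x[1] - 1)^2 + (x[2] - 2)^2

S <- slsqp(c(0, 0), fn = fn.toy,
           lower = c(0, 0), upper = c(1.5, 1.5))
S$par     # expected near c(1, 1.5); the upper bound on x2 is active
S$value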

Arguments

x0

starting point for searching the optimum.

fn

objective function that is to be minimized.

gr

gradient of function fn; will be calculated numerically if not specified.

lower, upper

lower and upper bound constraints.

hin

function defining the inequality constraints, that is, hin >= 0 for all components; a sketch illustrating hin and heq follows this argument list.

hinjac

Jacobian of function hin; will be calculated numerically if not specified.

heq

function defining the equality constraints, that is, heq == 0 for all components.

heqjac

Jacobian of function heq; will be calculated numerically if not specified.

nl.info

logical; should the original NLopt info be shown.

control

list of options, see nl.opts for help.

...

additional arguments passed to the function.
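
To make the constraint arguments concrete, here is a hedged sketch with made-up functions (fn.demo, hin.demo, heq.demo and the extra argument a are all illustrative): hin must return a vector that is nonnegative at feasible points, heq one that is zero at feasible points, and extra arguments supplied through ... are forwarded to fn.

library(nloptr)

## Illustrative only: minimize sum((x - a)^2), with the target 'a'
## passed to fn via '...'; the constraints require x[1] + x[2] >= 1
## (inequality, hin >= 0) and x[3] = 0.5 (equality, heq == 0).
fn.demo  <- function(x, a) sum((x - a)^2)
hin.demo <- function(x, ...) x[1] + x[2] - 1   # >= 0 at feasible points
heq.demo <- function(x, ...) x[3] - 0.5        # == 0 at feasible points

S <- slsqp(c(1, 1, 1), fn = fn.demo,
           hin = hin.demo, heq = heq.demo,
           control = list(xtol_rel = 1e-8),
           a = c(0.2, 0.2, 0.2))
S$par     # expected near c(0.5, 0.5, 0.5)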

Value

List with components:

par

the optimal solution found so far.

value

the function value corresponding to par.

iter

number of (outer) iterations, see maxeval.

convergence

integer code indicating successful completion (> 0) or a possible error number (< 0).

message

character string produced by NLopt and giving additional information.
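
As an illustrative follow-up (assuming S holds the result of a successful slsqp() call, such as one of the sketches above), the components can be inspected directly:

S$par           # best parameter vector found
S$value         # objective value at S$par
S$iter          # number of (outer) iterations used
S$convergence   # positive on success, negative on error
S$message       # NLopt's textual status message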

Details

The algorithm optimizes successive second-order (quadratic/least-squares) approximations of the objective function (via BFGS updates), with first-order (affine) approximations of the constraints.
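
Because the method is gradient-based, an analytic gradient supplied via gr (and analytic Jacobians via hinjac/heqjac) replaces the finite-difference approximations used by default. A hedged sketch with a made-up objective (fn.cube and gr.cube are illustrative names):

library(nloptr)

## Illustrative only: a smooth objective with its analytic gradient.
fn.cube <- function(x) sum(x^2) + x[1] * x[2] * x[3]
gr.cube <- function(x) c(2 * x[1] + x[2] * x[3],
                         2 * x[2] + x[1] * x[3],
                         2 * x[3] + x[1] * x[2])

S <- slsqp(c(1, 2, 3), fn = fn.cube, gr = gr.cube,
           lower = rep(0.1, 3), upper = rep(5, 3),
           control = list(xtol_rel = 1e-10))
S$par     # expected near the lower bound c(0.1, 0.1, 0.1)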

References

Dieter Kraft, "A software package for sequential quadratic programming", Technical Report DFVLR-FB 88-28, Institut für Dynamik der Flugsysteme, Oberpfaffenhofen, July 1988.

See Also

alabama::auglag, Rsolnp::solnp, Rdonlp2::donlp2

Examples

##  Solve the Hock-Schittkowski problem no. 100
x0.hs100 <- c(1, 2, 0, 4, 0, 1, 1)
fn.hs100 <- function(x) {
    (x[1]-10)^2 + 5*(x[2]-12)^2 + x[3]^4 + 3*(x[4]-11)^2 + 10*x[5]^6 +
                  7*x[6]^2 + x[7]^4 - 4*x[6]*x[7] - 10*x[6] - 8*x[7]
}
hin.hs100 <- function(x) {
    h <- numeric(4)
    h[1] <- 127 - 2*x[1]^2 - 3*x[2]^4 - x[3] - 4*x[4]^2 - 5*x[5]
    h[2] <- 282 - 7*x[1] - 3*x[2] - 10*x[3]^2 - x[4] + x[5]
    h[3] <- 196 - 23*x[1] - x[2]^2 - 6*x[6]^2 + 8*x[7]
    h[4] <- -4*x[1]^2 - x[2]^2 + 3*x[1]*x[2] - 2*x[3]^2 - 5*x[6] + 11*x[7]
    return(h)
}

S <- slsqp(x0.hs100, fn = fn.hs100,     # no gradients and jacobians provided
           hin = hin.hs100,
           control = list(xtol_rel = 1e-8, check_derivatives = TRUE))
S
## Optimal value of objective function:  690.622270249131   *** WRONG ***

# Even the numerical derivatives seem to be too accurate (step too small) here.
# Let's try a less accurate Jacobian with a larger finite-difference step.

hinjac.hs100 <- function(x) nl.jacobian(x, hin.hs100, heps = 1e-2)
S <- slsqp(x0.hs100, fn = fn.hs100,
           hin = hin.hs100, hinjac = hinjac.hs100,
           control = list(xtol_rel = 1e-8))
S
## Optimal value of objective function:  680.630057392593   *** CORRECT ***
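
## As a further variation (not in the original example; outcome not
## asserted here), the Jacobian of hin.hs100 can also be written out
## analytically instead of using nl.jacobian:
hinjac2.hs100 <- function(x) {
    rbind(c(-4*x[1], -12*x[2]^3, -1, -8*x[4], -5, 0, 0),
          c(-7, -3, -20*x[3], -1, 1, 0, 0),
          c(-23, -2*x[2], 0, 0, 0, -12*x[6], 8),
          c(-8*x[1] + 3*x[2], 3*x[1] - 2*x[2], -4*x[3], 0, 0, -5, 11))
}
S2 <- slsqp(x0.hs100, fn = fn.hs100,
            hin = hin.hs100, hinjac = hinjac2.hs100,
            control = list(xtol_rel = 1e-8))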

