skedastic (version 2.0.2)

GSS: Golden Section Search for Minimising Univariate Function over a Closed Interval

Description

Golden Section Search (GSS) is a useful algorithm for minimising a continuous univariate function \(f(x)\) over an interval \(\left[a,b\right]\) in instances where the first derivative \(f'(x)\) is not easily obtainable, such as with loss functions that need to be minimised under cross-validation to tune a hyperparameter in a machine learning model. The method is described by Fox (2021).

Usage

GSS(f, a, b, tol = 1e-08, maxitgss = 100L, ...)

Value

A list object containing the following:

  • argmin, the argument of f that minimises f

  • funmin, the minimum value of f achieved at argmin

  • converged, a logical indicating whether the convergence tolerance was satisfied

  • iterations, an integer indicating the number of search iterations used

Arguments

f

A function of one variable that returns a numeric value

a

A numeric of length 1 representing the lower endpoint of the search interval

b

A numeric of length 1 representing the upper endpoint of the search interval; must be greater than a

tol

A numeric of length 1 representing the tolerance used to determine when the search interval has been narrowed sufficiently for convergence

maxitgss

An integer of length 1 representing the maximum number of iterations to use in the search

...

Additional arguments to pass to f
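
For comparison, base R's optimize() likewise minimises a univariate function over a closed interval (it combines golden section search with successive parabolic interpolation) and also forwards extra arguments to the objective function via ...:

```r
# Base R's optimize() accepts additional named arguments and forwards
# them to the objective function, just as GSS's `...` does.
g <- function(x, p) (x - p) ^ 2
res <- optimize(g, interval = c(0, 2), p = 1.5)
res$minimum   # close to 1.5
```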

Details

This function is modelled after a MATLAB function written by Zarnowiec (2022). The method assumes that there is one local minimum in this interval. The solution produced is also an interval, the width of which can be made arbitrarily small by setting a tolerance value. Since the desired solution is a single point, the midpoint of the final interval can be taken as the best approximation to the solution.

The premise of the method is to shorten the interval \(\left[a,b\right]\) by comparing the values of the function at two interior test points, \(x_1 < x_2\), where \(x_1 = b - r(b-a)\) and \(x_2 = a + r(b-a)\), and \(r=(\sqrt{5}-1)/2\approx 0.618\) is the reciprocal of the golden ratio. One compares \(f(x_1)\) to \(f(x_2)\) and updates the search interval \(\left[a,b\right]\) as follows:

  • If \(f(x_1)<f(x_2)\), the solution cannot lie in \(\left[x_2,b\right]\); thus, update the search interval to $$\left[a_\mathrm{new},b_\mathrm{new}\right]=\left[a,x_2\right]$$

  • If \(f(x_1)>f(x_2)\), the solution cannot lie in \(\left[a,x_1\right]\); thus, update the search interval to $$\left[a_\mathrm{new},b_\mathrm{new}\right]=\left[x_1,b\right]$$

One then chooses two new test points by replacing \(a\) and \(b\) with \(a_\mathrm{new}\) and \(b_\mathrm{new}\) and recalculating \(x_1\) and \(x_2\) based on these new endpoints. Because \(r^2 = 1-r\), one of the new test points always coincides with a test point from the previous iteration, so each iteration requires only one new evaluation of \(f\). One continues iterating in this fashion until \(b-a< \tau\), where \(\tau\) is the desired tolerance.
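
The iteration just described can be sketched in a few lines of base R. This is an illustrative implementation, not the package's internal code; the name gss_sketch and its argument names are hypothetical:

```r
# Illustrative golden section search, assuming f is unimodal on [a, b].
gss_sketch <- function(f, a, b, tol = 1e-8, maxit = 100L, ...) {
  r <- (sqrt(5) - 1) / 2            # reciprocal of the golden ratio, ~0.618
  x1 <- b - r * (b - a)             # interior test points, x1 < x2
  x2 <- a + r * (b - a)
  f1 <- f(x1, ...)
  f2 <- f(x2, ...)
  it <- 0L
  while (b - a >= tol && it < maxit) {
    if (f1 < f2) {                  # minimum cannot lie in [x2, b]
      b <- x2
      x2 <- x1; f2 <- f1            # old x1 is reused as the new x2
      x1 <- b - r * (b - a); f1 <- f(x1, ...)
    } else {                        # minimum cannot lie in [a, x1]
      a <- x1
      x1 <- x2; f1 <- f2            # old x2 is reused as the new x1
      x2 <- a + r * (b - a); f2 <- f(x2, ...)
    }
    it <- it + 1L
  }
  mid <- (a + b) / 2                # midpoint of the final interval
  list(argmin = mid, funmin = f(mid, ...),
       converged = (b - a < tol), iterations = it)
}
```

Only one new function evaluation occurs per loop pass, since the surviving test point and its function value are carried over.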

See Also

goldsect is similar to this function, but does not allow the user to pass additional arguments to f

Examples

f <- function(x) (x - 1) ^ 2
GSS(f, a = 0, b = 2)  # f is minimised at x = 1, so argmin should be close to 1
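
Because GSS forwards ... to f, objectives that take extra arguments can be minimised without writing a wrapper function. A short sketch, assuming the skedastic package is installed; the function g and its parameter p are illustrative:

```r
library(skedastic)

# The extra named argument p is passed through GSS's `...` to g.
g <- function(x, p) (x - p) ^ 2
GSS(g, a = 0, b = 2, p = 1.5)  # argmin should be close to 1.5
```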
