
smoothedLasso (version 1.6)

minimizeSmoothedSequence: Minimize the objective function of a smoothed regression operator with respect to \(betavector\) using the progressive smoothing algorithm.

Description

Minimize the objective function of a smoothed regression operator with respect to \(betavector\) using the progressive smoothing algorithm.
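
The progressive smoothing algorithm minimizes the smoothed objective for the largest smoothing parameter in muSeq first and re-uses each minimizer as the starting value for the next, smaller smoothing parameter. The following is a minimal sketch of that idea only, assuming a BFGS-based inner minimization via optim; the helper name progressiveSmoothingSketch is hypothetical, and the package's internal optimizer and starting value may differ.

# Sketch only: progressive smoothing with warm starts. Assumes obj and objgrad
# take the coefficient vector and the smoothing parameter mu as arguments.
progressiveSmoothingSketch <- function(p, obj, objgrad, muSeq = 2^seq(3, -6)) {
  betahat <- rep(0, p)                    # starting value for the largest mu
  for (mu in muSeq) {
    fit <- optim(par = betahat,
                 fn = function(z) obj(z, mu),
                 gr = function(z) objgrad(z, mu),
                 method = "BFGS")
    betahat <- fit$par                    # warm start for the next, smaller mu
  }
  betahat
}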

Usage

minimizeSmoothedSequence(p, obj, objgrad, muSeq = 2^seq(3, -6))

Arguments

p

The dimension of the unknown parameters (regression coefficients).

obj

The objective function of the regression operator. Note that in the case of the progressive smoothing algorithm, the objective function must be a function of both \(betavector\) and \(mu\).

objgrad

The gradient function of the regression operator. Note that in the case of the progressive smoothing algorithm, the gradient must be a function of both \(betavector\) and \(mu\).

muSeq

The sequence of Nesterov smoothing parameters. The default is \(2^{-n}\) for \(n \in \{-3,\ldots,6\}\).
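
For reference, the default sequence 2^seq(3, -6) evaluates in R to ten smoothing parameters decreasing from 8 to 2^-6:

2^seq(3, -6)
# 8, 4, 2, 1, 0.5, 0.25, 0.125, 0.0625, 0.03125, 0.015625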

Value

The estimator \(betavector\) (minimizer) of the regression operator.

References

Hahn, G., Lutz, S., Laha, N., and Lange, C. (2020). A framework to efficiently smooth L1 penalties for linear regression. bioRxiv:2020.09.17.301788.

Examples

library(smoothedLasso)

# Simulate a noiseless toy regression problem with n observations and p predictors.
n <- 100
p <- 500
betavector <- runif(p)
X <- matrix(runif(n * p), nrow = n, ncol = p)
y <- X %*% betavector
lambda <- 1

# Auxiliary quantities (u, v, w and their derivatives) defining the standard Lasso.
temp <- standardLasso(X, y, lambda)

# Smoothed objective and gradient as functions of both the coefficients z and the
# smoothing parameter m, as required by the progressive smoothing algorithm.
obj <- function(z, m) objFunctionSmooth(z, temp$u, temp$v, temp$w, mu = m)
objgrad <- function(z, m) objFunctionSmoothGradient(z, temp$w, temp$du, temp$dv, temp$dw, mu = m)

# Minimize over the default sequence of smoothing parameters and print the estimate.
print(minimizeSmoothedSequence(p, obj, objgrad))

