
HDPenReg (version 0.94.9)

HDlars: LARS algorithm

Description

It performs the LARS algorithm for solving the lasso problem: a linear regression problem with an l1-penalty on the estimated coefficients.

Usage

HDlars(
  X,
  y,
  maxSteps = 3 * min(dim(X)),
  intercept = TRUE,
  eps = .Machine$double.eps^0.5
)

Value

An object of class LarsPath.

Arguments

X

The matrix (of size n x p) of covariates.

y

A vector of length n containing the response.

maxSteps

Maximal number of steps of the LARS algorithm.

intercept

If TRUE, an intercept is added to the model.

eps

Numerical tolerance of the algorithm.

Author

Quentin Grimonprez

Details

The l1 penalty performs variable selection via shrinkage of the estimated coefficients. The amount of regularization is controlled by a penalty parameter called lambda. The objective function of the lasso is: $$||y-X\beta||_2^2 + \lambda||\beta||_1$$
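As a minimal illustration of this objective in base R (the `lassoObjective` helper and the simulated data below are illustrative only, not part of HDPenReg), note that the penalty term grows with the l1 norm of the coefficients:

```r
# Illustrative helper (not part of HDPenReg): evaluate the lasso
# objective ||y - X beta||_2^2 + lambda * ||beta||_1 for a given beta.
lassoObjective <- function(X, y, beta, lambda) {
  residual <- y - X %*% beta
  sum(residual^2) + lambda * sum(abs(beta))
}

set.seed(42)
n <- 20; p <- 5
X <- matrix(rnorm(n * p), n, p)
trueBeta <- c(1, -2, 0, 0, 0.5)          # sparse true coefficients
y <- X %*% trueBeta + rnorm(n, sd = 0.1)

# With lambda = 0 the objective reduces to the least-squares loss;
# increasing lambda adds the l1 penalty, which favours sparse solutions.
lassoObjective(X, y, trueBeta, lambda = 0)
lassoObjective(X, y, trueBeta, lambda = 1)  # larger by lambda * sum(abs(trueBeta))
```

Shrinking a coefficient to exactly zero removes its l1 contribution entirely, which is why the lasso performs variable selection rather than mere shrinkage.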

References

Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004), "Least Angle Regression" (with discussion), Annals of Statistics, 32(2), 407-499.

See Also

LarsPath, HDcvlars, listToMatrix

Examples

# Simulate a high-dimensional dataset (50 observations, 10000 covariates)
dataset <- simul(50, 10000, 0.4, 10, 50, matrix(c(0.1, 0.8, 0.02, 0.02), nrow = 2))
# Run the LARS algorithm on the simulated data
result <- HDlars(dataset$data, dataset$response)
# Obtain estimated coefficient in matrix format
coefficient <- listToMatrix(result)