LASSO Kullback-Leibler divergence based regression.
lasso.klcompreg(y, x, alpha = 1, lambda = NULL,
nlambda = 100, type = "grouped", xnew = NULL)
A numerical matrix with compositional data. Zero values are allowed.
A numerical matrix containing the predictor variables.
The elastic net mixing parameter, with \(0 \leq \alpha \leq 1\). The penalty is a weighted combination of the ridge and the LASSO penalties, \((1-\alpha)/2 \|\beta\|_2^2 + \alpha \|\beta\|_1\). When \(\alpha = 1\) the LASSO penalty is applied, while \(\alpha = 0\) yields ridge regression.
This information is copied from the package glmnet. A user supplied lambda sequence. Typical usage is to have the program compute its own lambda sequence based on nlambda and lambda.min.ratio. Supplying a value of lambda overrides this. WARNING: use with care. Avoid supplying a single value for lambda (for predictions after cross-validation use predict() instead). Supply instead a decreasing sequence of lambda values. glmnet relies on its warm starts for speed, and it is often faster to fit a whole path than to compute a single fit.
This information is copied from the package glmnet. The number of \(\lambda\) values; the default is 100.
This information is copied from the package glmnet. If "grouped", then a grouped lasso penalty is used on the multinomial coefficients of a variable. This ensures they are all in or out together. The default in our case is "grouped".
If you have new data, supply it here; otherwise leave it NULL.
A list including:
We decided to keep the same list that is returned by glmnet. So, see the function in that package for more information.
If you supply a matrix in the "xnew" argument, an array with the fitted values will also be returned, where each matrix in the array corresponds to one value of \(\lambda\).
The function uses the glmnet package to perform LASSO penalised regression. For more details see the function in that package.
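As a rough, hedged sketch of how such a fit maps onto glmnet (the exact internal call, weights and return structure of lasso.klcompreg may differ), the compositional response can be handed to glmnet's multinomial family with a grouped penalty:

library(glmnet)
y <- as.matrix(iris[, 1:4])
y <- y / rowSums(y)   ## compositional response, rows sum to 1
x <- matrix( rnorm(150 * 30), ncol = 30 )
fit <- glmnet(x, y, family = "multinomial", alpha = 1, nlambda = 100,
              type.multinomial = "grouped")
## predict() on new data returns an n x D x nlambda array, i.e. one fitted matrix per lambda
est <- predict(fit, newx = matrix( rnorm(10 * 30), ncol = 30 ), type = "response")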
Aitchison J. (1986). The statistical analysis of compositional data. Chapman & Hall.
Friedman, J., Hastie, T. and Tibshirani, R. (2010) Regularization Paths for Generalized Linear Models via Coordinate Descent. Journal of Statistical Software, Vol. 33(1), 1-22.
lassocoef.plot, cv.lasso.klcompreg, kl.compreg, lasso.compreg, ols.compreg, alfa.pcr, alfa.knn.reg
# NOT RUN {
y <- as.matrix(iris[, 1:4])
y <- y / rowSums(y)
x <- matrix( rnorm(150 * 30), ncol = 30 )
a <- lasso.klcompreg(y, x)
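## Hypothetical extension (not part of the original example): supply new data via
## "xnew" to get fitted compositions for every value of lambda; the names of the
## returned components follow glmnet, so inspect the list first.
xnew <- matrix( rnorm(20 * 30), ncol = 30 )
b <- lasso.klcompreg(y, x, xnew = xnew)
names(b)   ## check which component holds the array of fitted values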
# }