Cross-validation for the LASSO Kullback-Leibler divergence-based regression.
cv.lasso.klcompreg(y, x, alpha = 1, type = "grouped", nfolds = 10,
folds = NULL, seed = NULL, graph = FALSE)
y: A numerical matrix with compositional data, with or without zeros.
x: A matrix with the predictor variables.
alpha: The elastic net mixing parameter, with \(0 \leq \alpha \leq 1\). The penalty is defined as a weighted combination of the ridge and the LASSO penalties: \(\alpha = 1\) applies the LASSO, while \(\alpha = 0\) yields ridge regression (the penalty formula is shown after these argument descriptions).
type: This information is copied from the package glmnet. If "grouped", then a grouped LASSO penalty is used on the multinomial coefficients of a variable. This ensures they are all in or out together. The default in our case is "grouped".
nfolds: The number of folds for the K-fold cross-validation, set to 10 by default.
folds: If you have a list with the folds, supply it here. You can also leave it NULL and the folds will be created.
seed: You can specify your own seed number here or leave it NULL.
graph: If graph = TRUE, a plot of the cross-validated object will appear; the default value is FALSE.
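For reference, the elastic net objective for a multinomial model, as parameterized by glmnet (whose machinery this function relies on; the formula below is taken from the glmnet documentation rather than from this page), is

\[ \min_{\beta}\ -\frac{1}{N}\,\ell(\beta) \;+\; \lambda\left[\frac{1-\alpha}{2}\,\|\beta\|_2^2 \;+\; \alpha\,\|\beta\|_1\right], \]

where \(\ell(\beta)\) denotes the multinomial log-likelihood and \(\lambda\) is the penalty parameter selected by the cross-validation.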
The outcome is the same as in the R package glmnet. The extra addition is that if graph = TRUE, the plot of the cross-validated object is returned. The plot contains the logarithm of \(\lambda\) against the deviance. The numbers on top of the figure show, for each component, the number of sets of coefficients that are non-zero.
The K-fold cross-validation is performed in order to select the optimal value for \(\lambda\), the penalty parameter in the LASSO.
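A minimal sketch of how the selected penalty value might be used afterwards; the component names lambda.min and lambda.1se are assumed from glmnet's cv.glmnet output, which the result of this function is stated to match, and are not confirmed by this page:

mod <- cv.lasso.klcompreg(y, x)   ## y, x as described in the arguments above
mod$lambda.min   ## lambda with the smallest cross-validated deviance (assumed component)
mod$lambda.1se   ## largest lambda within one standard error of the minimum (assumed component)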
Friedman, J., Hastie, T. and Tibshirani, R. (2010) Regularization Paths for Generalized Linear Models via Coordinate Descent. Journal of Statistical Software, Vol. 33(1), 1-22.
lasso.klcompreg, lassocoef.plot, lasso.compreg, cv.lasso.compreg, kl.compreg
library(MASS)           ## for the fgl data set
library(Compositional)  ## for rdiri() and cv.lasso.klcompreg()
y <- rdiri( 214, runif(4, 1, 3) )   ## simulated compositional responses (fgl has 214 rows)
x <- as.matrix( fgl[, 2:9] )        ## predictor variables
mod <- cv.lasso.klcompreg(y, x)
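A further sketch using only the documented arguments: seed makes the folds reproducible and graph = TRUE draws the plot of the cross-validated object.

mod2 <- cv.lasso.klcompreg(y, x, alpha = 1, nfolds = 10, seed = 1234, graph = TRUE)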