An implementation of LARS: Least Angle Regression (Stagewise/laSso). This is
a stage-wise homotopy-based algorithm for L1-regularized linear regression
(LASSO) and L1+L2-regularized linear regression (Elastic Net).
This program is able to train a LARS/LASSO/Elastic Net model or load a model
from file, output regression predictions for a test set, and save the trained
model to a file. The LARS algorithm is described in more detail below:
Let X be a matrix where each row is a point and each column is a dimension,
and let y be a vector of targets.
The Elastic Net problem is to solve
  min_beta 0.5 ||X * beta - y||_2^2 + lambda1 ||beta||_1 +
      0.5 lambda2 ||beta||_2^2
If lambda1 > 0 and lambda2 = 0, the problem is the LASSO.
If lambda1 > 0 and lambda2 > 0, the problem is the Elastic Net.
If lambda1 = 0 and lambda2 > 0, the problem is ridge regression.
If lambda1 = 0 and lambda2 = 0, the problem is unregularized linear
regression.
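To make the objective and its special cases concrete, here is a minimal
Python sketch (not mlpack's implementation) that simply evaluates the
Elastic Net objective above for a given beta; the matrix X, targets y, and
coefficient vector beta below are made-up toy values:

```python
def elastic_net_objective(X, y, beta, lambda1, lambda2):
    """Evaluate 0.5*||X*beta - y||_2^2 + lambda1*||beta||_1
    + 0.5*lambda2*||beta||_2^2 for a dense X given as a list of rows."""
    # Residual r = X * beta - y.
    residual = [sum(x_ij * b_j for x_ij, b_j in zip(row, beta)) - y_i
                for row, y_i in zip(X, y)]
    sq_loss = 0.5 * sum(r * r for r in residual)
    l1 = lambda1 * sum(abs(b) for b in beta)
    l2 = 0.5 * lambda2 * sum(b * b for b in beta)
    return sq_loss + l1 + l2

X = [[1.0, 0.0], [0.0, 1.0]]   # 2 points, 2 dimensions
y = [1.0, 2.0]
beta = [1.0, 2.0]              # exact fit, so the residual is zero

# lambda1 = lambda2 = 0: unregularized least squares; objective is 0 here.
print(elastic_net_objective(X, y, beta, 0.0, 0.0))  # 0.0
# lambda1 = 1, lambda2 = 0: LASSO; adds 1 * (|1| + |2|) = 3.
print(elastic_net_objective(X, y, beta, 1.0, 0.0))  # 3.0
# lambda1 = 1, lambda2 = 2: Elastic Net; adds 3 + 0.5 * 2 * (1 + 4) = 8.
print(elastic_net_objective(X, y, beta, 1.0, 2.0))  # 8.0
```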
For efficiency reasons, it is not recommended to use this algorithm with
"lambda1" = 0. In that case, use the "linear_regression" program, which
implements both unregularized linear regression and ridge regression.
To train a LARS/LASSO/Elastic Net model, the "input" and "responses"
parameters must be given. The "lambda1", "lambda2", and "use_cholesky"
parameters control the training options. A trained model can be saved with
the "output_model" output parameter. If no training is desired at all, an
existing model can be passed via the "input_model" parameter.
The program can also provide predictions for test data using either the
trained model or the given input model. Test points can be specified with
the "test" parameter. Predicted responses to the test points can be saved
with the "output_predictions" output parameter.
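A typical train-then-predict workflow might look like the following sketch,
assuming the mlpack command-line binding "mlpack_lars" and its convention of
appending "_file" to matrix and model parameter names; the file names are
placeholders, and the exact flag names may differ in your installation:

```shell
# Train a LASSO model (lambda1 = 0.4, lambda2 = 0) on data.csv with
# targets in responses.csv, and save the model to lasso_model.bin.
mlpack_lars --input_file data.csv --responses_file responses.csv \
            --lambda1 0.4 --lambda2 0.0 \
            --output_model_file lasso_model.bin

# Load the saved model and write predictions for test.csv.
mlpack_lars --input_model_file lasso_model.bin \
            --test_file test.csv \
            --output_predictions_file predictions.csv
```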