plsdof (version 0.2-1)

pls.ic: Model selection for Partial Least Squares based on information criteria

Description

This function computes the optimal number of Partial Least Squares components using one of three model selection criteria (aic, bic, gmdl), based on either of two Degrees of Freedom estimates for PLS.

Usage

pls.ic(X, y, m, criterion = "bic", naive, use.kernel, compute.jacobian, verbose)

Arguments

X
matrix of predictor observations.
y
vector of response observations. The length of y is the same as the number of rows of X.
m
maximal number of Partial Least Squares components. Default is m=ncol(X).
criterion
Choice of the model selection criterion. One of the three options aic, bic, gmdl.
naive
Use the naive estimate for the Degrees of Freedom? Default is FALSE.
use.kernel
Use kernel representation? Default is use.kernel=FALSE.
compute.jacobian
Should the first derivative of the regression coefficients be computed as well? Default is FALSE.
verbose
If TRUE, the function prints a warning if the algorithms produce negative Degrees of Freedom. Default is TRUE.
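As a sketch of how these arguments combine (assuming the plsdof package is installed), the criterion and naive arguments can be varied independently of the data:

```r
# a minimal sketch, assuming the plsdof package is available
library(plsdof)

set.seed(1)
X <- matrix(rnorm(100 * 4), ncol = 4)  # 100 observations, 4 predictors
y <- rnorm(100)

# default: BIC with the (non-naive) Degrees of Freedom estimate
fit.bic <- pls.ic(X, y, m = ncol(X))

# AIC instead of BIC
fit.aic <- pls.ic(X, y, m = ncol(X), criterion = "aic")

# naive Degrees of Freedom estimate instead
fit.naive <- pls.ic(X, y, m = ncol(X), naive = TRUE)
```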

Value

  • The function returns an object of class "plsdof".
  • DoF: Degrees of Freedom
  • m.opt: optimal number of components
  • sigmahat: vector of estimated model errors
  • intercept: intercept
  • coefficients: vector of regression coefficients
  • covariance: if compute.jacobian=TRUE and use.kernel=FALSE, the covariance matrix of the optimal regression coefficients
  • m.crash: the number of components for which the algorithm returns negative Degrees of Freedom
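The returned components can be inspected directly from the fitted object; a short sketch, assuming the plsdof package is available:

```r
# a minimal sketch, assuming the plsdof package is available
library(plsdof)

set.seed(2)
X <- matrix(rnorm(60 * 5), ncol = 5)
beta <- rnorm(5)
y <- as.vector(X %*% beta) + rnorm(60)

fit <- pls.ic(X, y, m = ncol(X))

fit$m.opt         # optimal number of components under the chosen criterion
fit$DoF           # Degrees of Freedom of the sequence of PLS models
fit$intercept     # estimated intercept
fit$coefficients  # regression coefficients of the selected model
```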

References

Akaike, H. (1973). "Information Theory and an Extension of the Maximum Likelihood Principle". Second International Symposium on Information Theory, 267-281.

Hansen, M., Yu, B. (2001). "Model Selection and the Principle of Minimum Description Length". Journal of the American Statistical Association, 96, 746-774.

Kraemer, N., Sugiyama, M. (2010). "The Degrees of Freedom of Partial Least Squares Regression". preprint, http://arxiv.org/abs/1002.4112

Kraemer, N., Braun, M.L. (2007). "Kernelizing PLS, Degrees of Freedom, and Efficient Model Selection". Proceedings of the 24th International Conference on Machine Learning, Omni Press, 441-448.

Schwarz, G. (1978). "Estimating the Dimension of a Model". Annals of Statistics, 6(2), 461-464.

See Also

pls.model, pls.cv

Examples

n <- 50  # number of observations
p <- 5   # number of variables
X <- matrix(rnorm(n * p), ncol = p)
y <- rnorm(n)

# compute linear PLS with BIC-based model selection (the default criterion)
pls.object <- pls.ic(X, y, m = ncol(X))