These functions cross-validate the SVD of a matrix. They
assume a model $X = U D V' + E$, where $U D V'$ is the signal
and $E$ is the noise, and try to find the rank at which to truncate
the SVD of x so as to minimize prediction error. Here, prediction error
is measured as the sum of squared residuals between the truncated SVD
and the signal part.
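As a rough illustration of this error criterion, here is a minimal numpy sketch (in Python rather than R; the data, dimensions, and the svd_truncate helper are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: rank-2 signal plus Gaussian noise, X = U D V' + E.
n, p, true_rank = 50, 40, 2
signal = rng.standard_normal((n, true_rank)) @ rng.standard_normal((true_rank, p))
X = signal + 0.1 * rng.standard_normal((n, p))

def svd_truncate(X, k):
    """Best rank-k approximation of X from its SVD."""
    u, d, vt = np.linalg.svd(X, full_matrices=False)
    return u[:, :k] @ np.diag(d[:k]) @ vt[:k, :]

# Prediction error: sum of squared residuals between the truncated
# SVD and the signal part of the model.
errors = [np.sum((svd_truncate(X, k) - signal) ** 2) for k in range(6)]
best_rank = int(np.argmin(errors))
```

In practice the signal is unknown, which is why the functions below estimate this error by cross-validation instead of computing it directly.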
For both types of cross-validation, each replicate leaves out part
of the matrix, fits an SVD approximation to the part left in, and
measures prediction error on the part left out.
In Wold-style cross-validation, the holdout set is "speckled": a random
set of elements of the matrix. The missing elements are predicted using
impute.svd.
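A sketch of one Wold-style replicate in numpy (Python rather than R; impute_svd here is an EM-style stand-in for what impute.svd does, and the function names, data, and iteration count are assumptions for the example):

```python
import numpy as np

def impute_svd(X, k, mask, n_iter=50):
    """EM-style imputation, a sketch of impute.svd's approach:
    alternately refit a rank-k SVD and refill the missing entries."""
    Xhat = np.where(mask, X, X[mask].mean())  # initialize holes with the mean
    approx = Xhat
    for _ in range(n_iter):
        u, d, vt = np.linalg.svd(Xhat, full_matrices=False)
        approx = u[:, :k] @ np.diag(d[:k]) @ vt[:k, :]
        Xhat = np.where(mask, X, approx)  # keep observed, refill missing
    return approx

def wold_replicate_error(X, k, holdout_frac=0.1, seed=0):
    """One Wold-style replicate: speckle a random holdout set,
    impute it from the left-in entries, and score the prediction."""
    rng = np.random.default_rng(seed)
    mask = rng.random(X.shape) >= holdout_frac  # True = left in
    approx = impute_svd(X, k, mask)
    return np.sum((approx[~mask] - X[~mask]) ** 2)

# Hypothetical rank-2 example: the error should be smaller near the true rank.
rng = np.random.default_rng(1)
signal = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 30))
X = signal + 0.1 * rng.standard_normal((40, 30))
err1, err2 = wold_replicate_error(X, 1), wold_replicate_error(X, 2)
```

A full run would average this replicate error over many random speckle patterns for each candidate rank.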
In Gabriel-style cross-validation, the holdout set is "blocked": we
permute the rows and columns of the matrix and leave out the lower-right
block, using a modified Schur complement to predict the held-out block.
In Gabriel-style cross-validation there are krow*kcol total folds.
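One Gabriel-style fold can be sketched in numpy as follows (Python rather than R; the blocked prediction shown, C times a rank-k pseudoinverse of A times B, is one common form of the modified Schur complement, and the function name, split points, and data are assumptions for the example):

```python
import numpy as np

def gabriel_fold_error(X, k, row_split, col_split):
    """One Gabriel-style fold. Partition X (after any permutation) as
        [[A, B],
         [C, D]],
    hold out the lower-right block D, and predict it as
    C @ pinv_k(A) @ B, where pinv_k(A) is the pseudoinverse of the
    rank-k truncated SVD of A."""
    A, B = X[:row_split, :col_split], X[:row_split, col_split:]
    C, D = X[row_split:, :col_split], X[row_split:, col_split:]
    u, d, vt = np.linalg.svd(A, full_matrices=False)
    A_pinv_k = vt[:k].T @ np.diag(1.0 / d[:k]) @ u[:, :k].T
    Dhat = C @ A_pinv_k @ B
    return np.sum((Dhat - D) ** 2)

# Hypothetical rank-2 example; a full run would permute rows and
# columns and cycle the held-out block through all krow*kcol folds.
rng = np.random.default_rng(2)
signal = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 50))
X = signal + 0.05 * rng.standard_normal((60, 50))
err1 = gabriel_fold_error(X, 1, 40, 35)
err2 = gabriel_fold_error(X, 2, 40, 35)
```

Because the held-out block shares no rows or columns with the block used for fitting, this style of holdout avoids the information leakage that speckled holdouts can suffer from.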