The loadings matrix is rotated so that the \(k\) rows
indicated by reference
become the lower-triangular Cholesky factor given by
t(chol(L[reference,] %*% t(L[reference,]))).
This defines the rotation transformation, which is then applied to the
remaining rows to give the new loadings matrix.
The optimization is not iterative and does not use the GPA algorithm.
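A minimal sketch of this construction (the helper name is hypothetical, and the reference block is assumed nonsingular):

```r
# Rotate L so that the reference rows equal the lower-triangular
# Cholesky factor of L[reference,] %*% t(L[reference,]).
# Helper name is illustrative, not a package API.
echelonRotate <- function(L, reference = seq_len(ncol(L))) {
  B <- L[reference, , drop = FALSE]   # k x k reference block
  R <- t(chol(B %*% t(B)))            # lower-triangular Cholesky factor
  Rot <- solve(B, R)                  # rotation matrix: B %*% Rot equals R
  L %*% Rot                           # apply the same rotation to all rows
}
```

Since Rot %*% t(Rot) = B^{-1} (B %*% t(B)) t(B)^{-1} = I, the rotation is orthogonal and L %*% t(L) is unchanged, consistent with the assumed orthogonal solution.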
The function can be used directly, or its name can be passed to factor
analysis functions such as factanal.
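For example, assuming the rotation function is exported under the name "echelon" (as in the GPArotation package), it can be supplied to factanal by name:

```r
# Hedged example: assumes the rotation function is named "echelon",
# as in the GPArotation package.
library(GPArotation)
fa <- factanal(factors = 2, covmat = ability.cov, rotation = "echelon")
fa$loadings
```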
An orthogonal solution is assumed (so \(\Phi\) is identity).
The default uses the first \(k\) rows
as the reference. If the submatrix of L
indicated by reference is
singular, the rotation fails and the
user must supply a different choice of rows.
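A constructed illustration of this failure mode: when the reference rows are linearly dependent, the Cholesky factorization errors, and a different set of rows must be chosen.

```r
L <- rbind(c(1, 0),
           c(2, 0),   # rows 1-2 are collinear, so the default reference fails
           c(0, 1),
           c(1, 1))
B <- L[1:2, ]
res <- try(chol(B %*% t(B)), silent = TRUE)  # singular: factorization fails
inherits(res, "try-error")                   # TRUE
B2 <- L[c(1, 3), ]
chol(B2 %*% t(B2))                           # succeeds with rows 1 and 3
</imports>
```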
One use of this parameterization is
to obtain good starting values (so it may seem strange
to rotate towards this solution afterwards). It has several other purposes:
(1) It can be useful for comparison with
published results in this parameterization.
(2) The standard errors are more straightforward to compute, because this is the
solution to an unconstrained optimization (though it is not necessarily computed as such).
(3) The models with k and (k+1) factors are nested, so it
is more straightforward to test the k-factor model against the
(k+1)-factor model. In particular, in addition to the LR test
(which does not depend on the rotation), the Wald test
and the LM test can now be used as well. For these, the test of a
k-factor model against a (k+1)-factor model is a
joint test of whether all the free parameters (loadings) in the (k+1)st
column of L
are zero.
(4) For some purposes, only the subspace spanned by the factors
is important, not the specific parameterization within this subspace.
(5) The back-predicted indicators (the explained portion of the indicators)
do not depend
on the rotation method. Combined with the relative ease of obtaining
correct standard errors with this parameterization, this allows easier and more
accurate prediction standard errors.
(6) This parameterization and its standard errors can be used to
detect identification problems (McDonald, 1999, pp. 181-182).
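Point (3) above can be sketched as follows. The helper name is hypothetical, and theta and V are assumed to come from the unconstrained fit in this parameterization:

```r
# Wald test of the k-factor model within a (k+1)-factor fit:
# jointly test whether the free loadings in column k+1 are zero.
# Function name and inputs are illustrative, not a package API.
waldExtraColumn <- function(theta, V) {
  # theta: estimated free loadings in the (k+1)st column of L
  # V:     their estimated covariance matrix
  W  <- drop(t(theta) %*% solve(V, theta))
  df <- length(theta)
  c(statistic = W, df = df,
    p.value = pchisq(W, df, lower.tail = FALSE))
}
```

The LM test has the same form, with theta and V replaced by the score and information evaluated at the restricted (k-factor) estimate.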