Expectation Maximization applied to the linear discriminant classifier assuming Gaussian classes with a shared covariance matrix.
EMLinearDiscriminantClassifier(X, y, X_u, method = "EM", scale = FALSE,
eps = 1e-08, verbose = FALSE, max_iter = 100)
X: matrix; Design matrix for labeled data
y: factor or integer vector; Label vector
X_u: matrix; Design matrix for unlabeled data
method: character; Currently only "EM" is supported
scale: logical; Should the features be normalized? (default: FALSE)
eps: numeric; Stopping criterion for the likelihood maximization
verbose: logical; Controls the verbosity of the output
max_iter: integer; Maximum number of EM iterations
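A minimal usage sketch is given below (it assumes the RSSL package is installed; the simulated two-class Gaussian data, the labeled/unlabeled split, and the object names are illustrative only, not part of the documented interface):

library(RSSL)

set.seed(1)

# Simulate two Gaussian classes that share a covariance matrix
n <- 200
X <- rbind(matrix(rnorm(2 * n), ncol = 2),
           matrix(rnorm(2 * n, mean = 2), ncol = 2))
y <- factor(rep(c("A", "B"), each = n))

# Keep labels for a small subset; treat the remaining objects as unlabeled
lab <- sample(seq_len(2 * n), 20)
X_l <- X[lab, , drop = FALSE]
y_l <- y[lab]
X_u <- X[-lab, , drop = FALSE]

# Supervised baseline and its semi-supervised EM counterpart
g_sup <- LinearDiscriminantClassifier(X_l, y_l)
g_em  <- EMLinearDiscriminantClassifier(X_l, y_l, X_u)

# Accuracy on the objects that were left unlabeled
mean(predict(g_sup, X_u) == y[-lab])
mean(predict(g_em,  X_u) == y[-lab])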
Starting from the supervised solution, uses the Expectation Maximization algorithm (see Dempster et al. (1977)) to iteratively update the means and shared covariance of the classes (Maximization step) and the responsibilities for the unlabeled objects (Expectation step), until the change in the solution falls below eps or max_iter iterations have been performed.
Dempster, A., Laird, N. & Rubin, D., 1977. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society. Series B, 39(1), pp.1-38.
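To make the iteration concrete, the following standalone sketch implements the two-class EM updates described above. It is not the RSSL source: it assumes the mvtnorm package for the Gaussian density, handles only two classes, and all function and variable names are illustrative.

# Sketch of EM for linear discriminant analysis with a shared covariance matrix
em_lda_sketch <- function(X_l, y_l, X_u, eps = 1e-8, max_iter = 100) {
  y_l <- as.integer(factor(y_l))                  # classes coded 1 and 2
  R_l <- cbind(y_l == 1, y_l == 2) * 1            # fixed responsibilities for labeled rows
  X   <- rbind(X_l, X_u)

  # Supervised starting solution: priors, means and pooled covariance from labeled data
  prior <- colMeans(R_l)
  mu    <- t(sapply(1:2, function(k) colSums(R_l[, k] * X_l) / sum(R_l[, k])))
  Sigma <- Reduce(`+`, lapply(1:2, function(k) {
    D <- sweep(X_l, 2, mu[k, ]); t(D * R_l[, k]) %*% D
  })) / nrow(X_l)

  loglik_old <- -Inf
  for (iter in seq_len(max_iter)) {
    # Expectation step: responsibilities of the unlabeled objects
    dens_u <- sapply(1:2, function(k)
      prior[k] * mvtnorm::dmvnorm(X_u, mean = mu[k, ], sigma = Sigma))
    resp <- dens_u / rowSums(dens_u)

    # Maximization step: update priors, class means and the shared covariance
    R     <- rbind(R_l, resp)
    prior <- colMeans(R)
    mu    <- t(sapply(1:2, function(k) colSums(R[, k] * X) / sum(R[, k])))
    Sigma <- Reduce(`+`, lapply(1:2, function(k) {
      D <- sweep(X, 2, mu[k, ]); t(D * R[, k]) %*% D
    })) / nrow(X)

    # Observed-data log-likelihood: labeled terms use the known class,
    # unlabeled terms sum over both classes
    dens_l <- sapply(1:2, function(k)
      prior[k] * mvtnorm::dmvnorm(X_l, mean = mu[k, ], sigma = Sigma))
    dens_u <- sapply(1:2, function(k)
      prior[k] * mvtnorm::dmvnorm(X_u, mean = mu[k, ], sigma = Sigma))
    loglik <- sum(log(dens_l[cbind(seq_along(y_l), y_l)])) + sum(log(rowSums(dens_u)))
    if (abs(loglik - loglik_old) < eps) break
    loglik_old <- loglik
  }
  list(prior = prior, means = mu, sigma = Sigma, responsibilities = resp)
}

In RSSL itself the same stopping behaviour is controlled through the eps and max_iter arguments documented above.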
Other RSSL classifiers: EMLeastSquaresClassifier, GRFClassifier, ICLeastSquaresClassifier, ICLinearDiscriminantClassifier, KernelLeastSquaresClassifier, LaplacianKernelLeastSquaresClassifier(), LaplacianSVM, LeastSquaresClassifier, LinearDiscriminantClassifier, LinearSVM, LinearTSVM(), LogisticLossClassifier, LogisticRegression, MCLinearDiscriminantClassifier, MCNearestMeanClassifier, MCPLDA, MajorityClassClassifier, NearestMeanClassifier, QuadraticDiscriminantClassifier, S4VM, SVM, SelfLearning, TSVM, USMLeastSquaresClassifier, WellSVM, svmlin()