
poisson.glm.mix (version 1.4)

init1.2.jk.j: 2nd step of Initialization 1 for the \(\beta_{jk}\) (\(m=1\)) or \(\beta_{j}\) (\(m=2\)) parameterization.

Description

This function implements the second step of the two-step small-EM initialization procedure (Initialization 1), used for the parameterizations \(m=1\) or \(m=2\). First, init1.1.jk.j is called for each condition \(j=1,\ldots,J\). The values obtained from that first step initialize the second step, a small EM algorithm for fitting the overall mixture \(\sum_{k=1}^{K}\pi_k\prod_{j=1}^{J}\prod_{\ell=1}^{L_j}f(y_{ij\ell})\). The values selected at the second step are then used to initialize the main EM algorithm (bjkmodel or bjmodel) when \(K=K_{min}\).
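As a point of reference, the overall mixture density targeted by the second small EM can be evaluated directly. The following is a minimal sketch (not part of the package) that computes \(\sum_{k=1}^{K}\pi_k\prod_{j=1}^{J}\prod_{\ell=1}^{L_j}f(y_{ij\ell})\) for a single observation; the names mixture.density, pi and mu are illustrative, with the weights pi and the \(K \times d\) matrix mu of component-specific Poisson means assumed given, and the double product over \(j\) and \(\ell\) flattened into the \(d\) response columns.

## minimal sketch (not part of poisson.glm.mix): mixture density of one
## observation y.i (a vector of d counts), given hypothetical weights pi
## and a K x d matrix mu of component-specific Poisson means
mixture.density <- function(y.i, pi, mu) {
  comp <- sapply(seq_along(pi), function(k) prod(dpois(y.i, lambda = mu[k, ])))
  sum(pi * comp)
}
## toy call with d = 6 responses (L = c(3, 2, 1)) and K = 2 components
mixture.density(y.i = c(4, 0, 2, 1, 3, 5), pi = c(0.4, 0.6),
                mu = rbind(rep(2, 6), rep(4, 6)))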

Usage

init1.2.jk.j(reference, response, L, K, m1, m2, t1, t2, model, mnr)

Value

alpha

numeric array of dimension \(J \times K\) containing the selected values \(\alpha_{jk}^{(0)}\), \(j=1,\ldots,J\), \(k=1,\ldots,K\), that will be used to initialize the main EM.

beta

numeric array of dimension \(J \times K \times T\) (if model = 1) or \(J \times T\) (if model = 2) containing the selected values \(\beta_{jk\tau}^{(0)}\) (or \(\beta_{j\tau}^{(0)}\)), \(j=1,\ldots,J\), \(k=1,\ldots,K\), \(\tau=1,\ldots,T\), that will be used to initialize the main EM.

psim

numeric vector of length \(K\) containing the weights that will initialize the main EM.

ll

numeric, the value of the log-likelihood, as computed by the mylogLikePoisMix function.
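
A minimal illustration of how the returned components might be inspected, assuming start2 is the list produced by init1.2.jk.j as in the Examples below:

## start2 as returned by init1.2.jk.j (see the Examples section)
dim(start2$alpha)   ## J x K array of the selected alpha_{jk}^{(0)}
dim(start2$beta)    ## J x K x T (model = 1) or J x T (model = 2)
start2$psim         ## K mixing proportions used to start the main EM
start2$ll           ## log-likelihood value of the selected starting point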

Arguments

reference

a numeric array of dimension \(n\times V\) containing the \(V\) covariates for each of the \(n\) observations.

response

a numeric array of count data with dimension \(n\times d\) containing the \(d\) response variables for each of the \(n\) observations.

L

numeric vector of positive integers containing the partition of the \(d\) response variables into \(J\leq d\) blocks, with \(\sum_{j=1}^{J}L_j=d\); see the short sketch after this section.

K

positive integer denoting the number of mixture components.

m1

positive integer denoting the number of iterations for each run of init1.1.jk.j.

m2

positive integer denoting the number of iterations for each run of init1.2.jk.j.

t1

positive integer denoting the number of different runs of init1.1.jk.j.

t2

positive integer denoting the number of different runs of init1.2.jk.j.

model

binary variable denoting the parameterization of the model: 1 corresponds to the \(\beta_{jk}\) parameterization (\(m=1\)) and 2 to the \(\beta_{j}\) parameterization (\(m=2\)).

mnr

positive integer denoting the maximum number of Newton-Raphson iterations.
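
To make the argument structure concrete, here is a short sketch (mirroring the design used in the Examples below) of how L partitions the response variables; the values are illustrative only:

## d = 6 response variables split into J = 3 blocks via L = c(3, 2, 1):
## columns 1-3 form block j = 1, columns 4-5 block j = 2, column 6 block j = 3
L <- c(3, 2, 1)
sum(L)      ## equals d, the number of columns of `response`
length(L)   ## equals J, the number of conditions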

Author

Panagiotis Papastamoulis

See Also

init1.1.jk.j, bjkmodel, bjmodel

Examples

############################################################
#1.            Example with beta_jk (m=1) model            #
############################################################
## load a simulated dataset according to the b_jk model
## number of observations: 500
## design: L=(3,2,1)
data("simulated_data_15_components_bjk")
x <- sim.data[, 1]                     ## the single covariate (V = 1)
x <- array(x, dim = c(length(x), 1))   ## coerce to an n x 1 array
y <- sim.data[, -1]                    ## the d = 6 response count variables
## initialize the parameters for a 2-component mixture:
## the number of overall small runs is t2 = 2,
## each one consisting of m2 = 2 iterations of the EM.
## the number of small runs for the first-step small EM
## is t1 = 2, each one consisting of m1 = 2 iterations.
start2 <- init1.2.jk.j(reference = x, response = y, L = c(3, 2, 1),
                       K = 2, m1 = 2, m2 = 2, t1 = 2, t2 = 2, model = 1, mnr = 3)
summary(start2)

############################################################
#2.            Example with beta_j (m=2) model             #
############################################################

## using the same x, y and design L as in the first example
## initialize the parameters for a 2-component mixture:
## the number of overall small runs is t2 = 3,
## each one consisting of m2 = 2 iterations of the EM.
## the number of small runs for the first-step small EM
## is t1 = 2, each one consisting of m1 = 2 iterations.
start2 <- init1.2.jk.j(reference = x, response = y, L = c(3, 2, 1),
                       K = 2, m1 = 2, m2 = 2, t1 = 2, t2 = 3, model = 2, mnr = 5)
summary(start2)


