Perform Gibbs sampling simulation for a Bayesian mixture of Plackett-Luce models fitted to partial orderings.
gibbsPLMIX(pi_inv, K, G, init = list(z = NULL, p = NULL),
           n_iter = 1000, n_burn = 500,
           hyper = list(shape0 = matrix(1, nrow = G, ncol = K),
                        rate0 = rep(0.001, G), alpha0 = rep(1, G)),
           centered_start = FALSE)
pi_inv: An object of class top_ordering, collecting the numeric \(N \times K\) data matrix of partial orderings, or an object that can be coerced with as.top_ordering.
K: Number of possible items.
G: Number of mixture components.
init: List of named objects with initialization values: z is a numeric \(N \times G\) matrix of binary mixture component memberships; p is a numeric \(G \times K\) matrix of component-specific support parameters. If starting values are not supplied (NULL), they are randomly generated with a uniform distribution. Default is NULL. An example of supplying explicit starting values is sketched after the argument descriptions.
n_iter: Total number of MCMC iterations.
n_burn: Number of initial burn-in drawings removed from the returned MCMC sample.
hyper: List of named objects with hyperparameter values for the conjugate prior specification: shape0 is a numeric \(G \times K\) matrix of shape hyperparameters; rate0 is a numeric vector of \(G\) rate hyperparameters; alpha0 is a numeric vector of \(G\) Dirichlet hyperparameters. Default is a vague prior setting; the expected formats are illustrated in the sketch after the argument descriptions.
centered_start: Logical: whether the random start should have its support parameters and mixture weights centered around the observed relative frequency with which each item has been ranked top. Default is FALSE. Ignored when init is not NULL.
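The following sketch is not part of the original documentation; it illustrates the formats expected by init and hyper, building a binary membership matrix and positive support parameters for init and writing out the vague default hyperparameters explicitly. The use of rmultinom and rgamma to generate starting values is an illustrative assumption, not a package requirement.

# Sketch: explicit starting values and hyperparameters for gibbsPLMIX
library(PLMIX)
data(d_carconf)
N <- nrow(d_carconf)
K <- ncol(d_carconf)
G <- 3

set.seed(123)
# Binary N x G membership matrix: each row has exactly one 1
z0 <- t(rmultinom(N, size = 1, prob = rep(1/G, G)))
# G x K matrix of positive component-specific support parameters
p0 <- matrix(rgamma(G * K, shape = 1, rate = 1), nrow = G, ncol = K)

# Hyperparameters written out in the expected shapes (here the vague default)
shape0 <- matrix(1, nrow = G, ncol = K)
rate0  <- rep(0.001, G)
alpha0 <- rep(1, G)

GIBBS <- gibbsPLMIX(pi_inv = d_carconf, K = K, G = G,
                    init = list(z = z0, p = p0),
                    hyper = list(shape0 = shape0, rate0 = rate0, alpha0 = alpha0),
                    n_iter = 1000, n_burn = 500)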
A list of S3 class gsPLMIX with named elements:
W: Numeric \(L \times G\) matrix with MCMC samples of the mixture weights.
P: Numeric \(L \times (G*K)\) matrix with MCMC samples of the component-specific support parameters.
log_lik: Numeric vector of \(L\) posterior log-likelihood values.
deviance: Numeric vector of \(L\) posterior deviance values (\(-2\) * log_lik).
objective: Numeric vector of \(L\) objective function values (that is, the kernel of the log-posterior distribution).
call: The matched call.
The size \(L\) of the final MCMC sample is equal to n_iter - n_burn.
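A minimal post-processing sketch (not part of the package documentation), assuming the d_carconf data that ships with PLMIX and using base R only to summarize the returned components:

library(PLMIX)
data(d_carconf)
GIBBS <- gibbsPLMIX(pi_inv = d_carconf, K = ncol(d_carconf), G = 3,
                    n_iter = 2000, n_burn = 1000)

# Posterior means of the G mixture weights (one column of W per component)
colMeans(GIBBS$W)

# Posterior means of the G*K support parameters; inspect colnames(GIBBS$P)
# to see how components and items are arranged along the columns
colMeans(GIBBS$P)

# Traceplot of the posterior log-likelihood over the L = n_iter - n_burn draws
plot(GIBBS$log_lik, type = "l",
     xlab = "Iteration (after burn-in)", ylab = "Posterior log-likelihood")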
Mollica, C. and Tardella, L. (2017). Bayesian Plackett-Luce mixture models for partially ranked data. Psychometrika, 82(2), pages 442--458, ISSN: 0033-3123, DOI: 10.1007/s11336-016-9530-0.
data(d_carconf)
GIBBS <- gibbsPLMIX(pi_inv=d_carconf, K=ncol(d_carconf), G=3, n_iter=30, n_burn=10)
str(GIBBS)
GIBBS$P
GIBBS$W