This function juxtaposes the inefficiency of MCMC algorithms in
LaplacesDemon for applied use, and is a valuable tool for selecting
the algorithm that is likely to be least inefficient for the user's
current model, prior to updating the final, intended model.
Juxtapose(x)
This is a list with multiple components. Each component must be an
object of class demonoid.
This function returns an object of class juxtapose. It is a
\(9 \times J\) matrix with nine results for \(J\) MCMC algorithms.
Laplace's Demon recommends using the Juxtapose function on the
user's model (or, more likely, a simplified version of it) with a
smaller, simulated data set to select the least inefficient MCMC
algorithm before using real data and updating the model for numerous
iterations. The least inefficient MCMC algorithm differs across
models and data sets. Using Juxtapose in this way does not
guarantee that the selected algorithm will remain the best choice with
real data, but it should be better than selecting an algorithm
without any comparison.
The user must make a decision regarding their model and data. The more
similar the model and data are to the final, intended model and data,
the more appropriate the results of the Juxtapose function will
be. However, if the full model and data are used, then the user
may as well skip Juxtapose and proceed directly to
LaplacesDemon. Replacing the actual data set with a
smaller, simulated set is fairly straightforward, but the
decision-making will most likely focus on the best way to reduce the
full model specification. A simple approach may be merely to reduce
the number of predictors. However, complicated models may have several
components that slow down estimation time, and extend the amount of
time until global stationarity is estimated. Laplace's Demon offers no
guidance here, and leaves it in the realm of user discretion.
First, the user should simulate a smaller data set, and if best,
reduce the model specification. Next, the user must select candidate
algorithms. Then, the user must update each algorithm with
LaplacesDemon for numerous iterations, with the goal of
achieving stationarity for all parameters early in the
iterations. Each update should begin with the same model specification
function, vector of initial values, and data. Each output object of
class demonoid should be renamed. An example follows.
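The first step above, simulating a smaller data set, could look like
the following base R sketch for a linear regression. The sizes, seed,
and the exact components of the data list are illustrative assumptions;
the list structure a model specification function expects depends on
how the user writes that function.

```r
# Illustrative simulation of a small data set for a linear regression.
set.seed(666)
N <- 50                  # few records, so candidate updates run quickly
J <- 3                   # reduced number of predictors (incl. intercept)
X <- cbind(1, matrix(rnorm(N * (J - 1)), N, J - 1))
beta <- c(1, -0.5, 2)    # "true" coefficients for the simulation
y <- as.vector(X %*% beta + rnorm(N, 0, 0.1))

# A data list in the style expected by a model specification function;
# the component names here are assumptions for this sketch.
MyData <- list(J = J, N = N, X = X, y = y,
               mon.names = "LP",
               parm.names = c(paste0("beta", 1:J), "sigma"))
```

Each candidate algorithm would then be updated with this same data
list, model specification function, and vector of initial values.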
Suppose a user considers three candidate algorithms for their model:
AMWG, NUTS, and twalk. The user updates each model, saving the model
that used the AMWG algorithm as, say, Fit1, the NUTS model as
Fit2, and the twalk model as Fit3.
Next, the output model objects are put in a list and passed to the
Juxtapose function. See the example below.
The Juxtapose function uses an internal version of IAT, which is
a slightly modified version of that found in the SamplerCompare
package. The Juxtapose function returns an object of class
juxtapose. It is a matrix in which each row is a result and each
column is an algorithm.
The rows are:
iter.min: This is the iterations per minute.
t.iter.min: This is the thinned iterations per minute.
prop.stat: This is the proportion of iterations that were stationary.
IAT.025: This is the 2.5% quantile of the integrated autocorrelation
time of the worst parameter, estimated only on samples when all
parameters are estimated to be globally stationary.
IAT.500: This is the median integrated autocorrelation time of the
worst parameter, estimated only on samples when all parameters are
estimated to be globally stationary.
IAT.975: This is the 97.5% quantile of the integrated autocorrelation
time of the worst parameter, estimated only on samples when all
parameters are estimated to be globally stationary.
ISM.025: This is the 2.5% quantile of the number of independent
samples per minute.
ISM.500: This is the median number of independent samples per
minute. The least inefficient MCMC algorithm has the highest ISM.500.
ISM.975: This is the 97.5% quantile of the number of independent
samples per minute.
As for calculating \(ISM\), let \(TIM\) be the observed number of thinned iterations per minute, \(PS\) be the proportion of iterations in which all parameters were estimated to be globally stationary, and \(IAT_q\) be a quantile from a simulated distribution of the integrated autocorrelation time among the parameters.
$$ISM = \frac{PS \times TIM}{IAT_q}$$
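As a hypothetical helper, the calculation above may be expressed in a
few lines of R. The function name and arguments are assumptions for
this sketch; Juxtapose performs this calculation internally.

```r
# ISM = (PS * TIM) / IAT_q, where TIM is thinned iterations per minute,
# PS the proportion (in [0,1]) of iterations estimated to be globally
# stationary, and IAT.q a quantile of the integrated autocorrelation time.
ISM <- function(TIM, PS, IAT.q) (PS * TIM) / IAT.q

# Example: 1200 thinned iterations/minute, 80% stationary, median IAT of 8
ISM(TIM = 1200, PS = 0.8, IAT.q = 8)  # 120 independent samples per minute
```

Holding TIM and PS fixed, a larger IAT (a stickier chain) lowers ISM,
which is why the worst parameter's IAT is the one that matters.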
There are various ways to measure the inefficiency of MCMC
samplers. IAT is perhaps the most often used. As with the
SamplerCompare package, Laplace's Demon uses the worst parameter, in
terms of IAT. Often, the number of evaluations or number of
parameters is considered. The Juxtapose function instead considers
the final criterion of MCMC efficiency, in an applied context, to be
ISM, or the number of Independent (thinned) Samples per Minute. The
algorithm with the highest ISM.500 is the best, or least inefficient,
algorithm with respect to its worst IAT, the proportion of iterations
required to seem to have global stationarity, and the number of
(thinned) iterations per minute.
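To make IAT concrete, here is a simplified stand-in that sums sample
autocorrelations until the first non-positive lag. This is NOT the
internal version used by Juxtapose (or the SamplerCompare version it
modifies); it is only meant to show why a sticky chain has a large IAT.

```r
# Simplified integrated autocorrelation time estimate (illustrative only).
iat <- function(x) {
  rho <- acf(x, lag.max = length(x) - 1, plot = FALSE)$acf[-1]
  cutoff <- which(rho <= 0)[1]              # stop at first non-positive lag
  if (is.na(cutoff)) cutoff <- length(rho) + 1
  1 + 2 * sum(rho[seq_len(cutoff - 1)])
}

set.seed(1)
x.iid <- rnorm(1000)                                      # independent draws
x.ar  <- as.numeric(arima.sim(list(ar = 0.9), n = 1000))  # sticky AR(1) chain
iat(x.iid)  # near 1: nearly every draw is an independent sample
iat(x.ar)   # well above 1: many draws are needed per independent sample
```

A chain with IAT near 1 mixes well; an IAT of, say, 20 means roughly
20 (thinned) iterations are needed per effectively independent sample.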
A disadvantage of using time is that it will differ by computer, and is less likely to be reported in a journal. The advantage, though, is that it is more meaningful to a user. Increases in the number of evaluations, parameters, and time should all correlate well, but time may enlighten a user as to expected run-time given the model just studied, even though the real data set will most likely be larger than the simulated data used initially. NUTS is an example of a sampler in which the number of evaluations varies per iteration. For an alternative approach, see Thompson (2010).
The Juxtapose function also adjusts ISM by prop.stat, the
proportion of the iterations in which all chains were estimated to be
stationary. This adjustment is weighted by burn-in iterations,
penalizing an algorithm that took longer to achieve global
stationarity. The goal, again, is to assist the user in selecting the
least inefficient MCMC algorithm in an applied setting.
The Juxtapose function has many potential uses other than those
described above. One additional use of the Juxtapose function is
to compare inefficiencies within a single algorithm in which
algorithmic specifications varied across model updates. Another use is
to investigate parallel chains in an object of class demonoid.hpc,
as returned from the LaplacesDemon.hpc function. Yet another use is
to compare the effects of small changes to a model specification
function, such as with priors, or due to an increase in the amount of
simulated data.
An object of class juxtapose may be plotted with the
plot.juxtapose function, which displays ISM by default, or
optionally IAT. For more information, see the plot.juxtapose
function.
Independent samples per minute, calculated as ESS divided by minutes
of run-time, are also available by parameter in the PosteriorChecks
function.
Thompson, M. (2010). "Graphical Comparison of MCMC Performance". ArXiv e-prints, eprint 1011.4458.
IAT, is.juxtapose, LaplacesDemon, LaplacesDemon.hpc,
plot.juxtapose, and PosteriorChecks.
# NOT RUN {
### Update three demonoid objects, each from different MCMC algorithms.
### Suppose Fit1 was updated with AFSS, Fit2 with AMWG, and
### Fit3 with NUTS. Then, compare the inefficiencies:
#Juxt <- Juxtapose(list(Fit1=Fit1, Fit2=Fit2, Fit3=Fit3)); Juxt
#plot(Juxt, Style="ISM")
# }