This function estimates the random-access memory (RAM) required to update a given model and data with PMC.
Warning: Unwise use of this function may crash a computer, so please read the details below.
PMC.RAM(Model, Data, Iterations, Thinning, M, N)
Model: This is a model specification function. For more information, see PMC.
Data: This is a list of data. For more information, see PMC.
Iterations: This is the number of iterations for which PMC would update.
Thinning: This is the amount of thinning applied to the samples in PMC.
M: This is the number of mixture components in PMC.
N: This is the number of samples in PMC. An illustrative call is sketched below.
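For example, a call might look like the following sketch. The linear-regression model and data list are hypothetical and only illustrate the usual LaplacesDemon Model and Data conventions; the numbers of iterations, mixture components, and samples are arbitrary.

library(LaplacesDemon)

## Hypothetical data for a simple linear regression (illustrative only)
set.seed(1)
n <- 100
X <- cbind(1, rnorm(n))
y <- as.vector(X %*% c(1, 2) + rnorm(n))
MyData <- list(J=2, N=n, X=X, y=y, mon.names="LP",
               parm.names=c("beta[1]", "beta[2]"))

## Model specification function in the usual LaplacesDemon form
Model <- function(parm, Data) {
  beta <- parm
  mu <- as.vector(Data$X %*% beta)
  LL <- sum(dnorm(Data$y, mu, 1, log=TRUE))         # log-likelihood
  LP <- LL + sum(dnorm(beta, 0, 100, log=TRUE))     # log-posterior
  list(LP=LP, Dev=-2*LL, Monitor=LP,
       yhat=rnorm(length(mu), mu, 1), parm=parm)
}

## Estimate the RAM needed for a small PMC run: 10 iterations, thinning
## of 1, 2 mixture components, and 1000 samples
est <- PMC.RAM(Model=Model, Data=MyData, Iterations=10,
               Thinning=1, M=2, N=1000)
str(est)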
PMC.RAM returns a list with several components. Each component is an estimate in MB for an object. The list has the following components:
This is the estimated size in MB of RAM required for the matrix of mixture probabilities by iteration.
This is the estimated size in MB of RAM required for the covariance matrix or matrices.
This is the estimated size in MB of RAM required for the list of data.
This is the estimated size in MB of RAM required for the deviance vector before thinning.
This is the estimated size in MB of RAM required for the matrix or vector of initial values.
This is the estimated size in MB of RAM required for the \(N \times T \times M\) array LH, where \(N\) is the number of samples, \(T\) is the number of iterations, and \(M\) is the number of mixture components. The LH array is not returned by PMC. A rough size calculation for arrays of this form is sketched after this list.
This is the estimated size in MB of RAM required for the \(N \times T \times M\) array LP, where \(N\) is the number of samples, \(T\) is the number of iterations, and \(M\) is the number of mixture components. The LP array is not returned by PMC.
This is the estimated size in MB of RAM required for the model specification function.
This is the estimated size in MB of RAM required for the \(N \times J\) matrix Monitor, where \(N\) is the number of unthinned samples and \(J\) is the number of monitored variables. Although it is thinned later in the algorithm, the full matrix is created.
This is the estimated size in MB of RAM required for the \(N \times J \times T \times M\) array Posterior1, where \(N\) is the number of samples, \(J\) is the number of parameters, \(T\) is the number of iterations, and \(M\) is the number of mixture components.
This is the estimated size in MB of RAM required for the \(N \times J\) matrix Posterior2, where \(N\) is the number of samples and \(J\) is the number of initial values or parameters. Although this matrix is thinned later, at one point it is unthinned.
This is the estimated size in MB of RAM required for the summary table.
This is the estimated size in MB of RAM required for the matrix of importance weights.
This is the estimated size in MB of RAM required in total to update with PMC for a given model and data, and for the specified numbers of iterations, thinning, mixture components, and samples.
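As a rough guide to where the largest of these estimates come from, a double-precision value occupies 8 bytes, so an \(N \times T \times M\) array such as LH or LP requires roughly \(N \times T \times M \times 8 / 2^{20}\) MB. The sketch below is only a back-of-the-envelope approximation that ignores R object overhead; PMC.RAM itself measures objects with object.size.

## Approximate array sizes in MB, assuming 8-byte doubles and ignoring
## R object overhead (PMC.RAM itself uses object.size)
array.MB <- function(...) prod(...) * 8 / 2^20
N <- 1000      # samples
Iters <- 10    # iterations (T in the notation above)
M <- 2         # mixture components
J <- 5         # parameters
array.MB(N, Iters, M)       # approximate MB for the LH or LP array
array.MB(N, J, Iters, M)    # approximate MB for the Posterior1 array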
The PMC.RAM function uses the object.size function to estimate the size in MB of RAM required to update a given model and data with PMC for a number of iterations and a specified thinning. When RAM is exceeded, the computer will crash. This function can be useful when trying to estimate how many samples and iterations may be used to update a model without crashing the computer. However, when estimating the required RAM, PMC.RAM actually creates several large objects, such as post (see below). If too many iterations are given as an argument to PMC.RAM, for example, then it will crash the computer while trying to estimate the required RAM.
The best way to use this function is as follows. First, prepare the model specification and list of data. Second, observe how much RAM the computer is currently using, as well as the maximum available RAM. Most of the difference between these two is the amount of RAM the computer may dedicate to updating the model. Next, use this function with a small number of iterations and note the estimated RAM. Increase the number of iterations and note the estimate again. Continue to increase the number of iterations until the estimate reaches, say, 90% of the difference noted above.
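The following sketch illustrates this workflow, continuing the hypothetical Model and MyData objects from the earlier example. The free-RAM figure must be observed outside of R (how to do so is platform-specific), and it is assumed here that the overall estimate is returned as the list component Total, per the last item in the list above.

## Illustrative sketch only: free.MB is a hypothetical figure, and Total
## is assumed to be the name of the overall estimate in the returned list
free.MB <- 4000   # e.g., maximum available RAM minus RAM currently in use
for (iter in c(10, 100, 1000, 10000)) {
  est <- PMC.RAM(Model=Model, Data=MyData, Iterations=iter,
                 Thinning=1, M=2, N=1000)
  cat("Iterations:", iter, "- estimated total MB:", est$Total, "\n")
  if (est$Total > 0.9 * free.MB) break   # stay within ~90% of free RAM
}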
The computer operating system uses RAM, as does any other software running at the moment. R is currently using RAM, and other functions in the LaplacesDemon package, as well as any other package that is currently loaded, are using RAM. Numerous small objects that use RAM, such as the perplexity vector, are not included in the returned list. A potentially large object that is not included is a matrix used for estimating LML.
See also BigData, LML, object.size, and PMC.