TMB (version 1.9.15)

checkConsistency: Check consistency and Laplace accuracy

Description

Check consistency of various parts of a TMB implementation. Requires that the user has implemented simulation code for the data and, optionally, the random effects. (Beta version; may change without notice)

Usage

checkConsistency(
  obj,
  par = NULL,
  hessian = FALSE,
  estimate = FALSE,
  n = 100,
  observation.name = NULL
)

Value

List with gradient simulations (joint and marginal)

Arguments

obj

Object from MakeADFun

par

Parameter vector (\(\theta\)) for simulation. If unspecified, the best encountered parameter of the object is used.

hessian

Calculate the Hessian matrix for each replicate?

estimate

Estimate parameters for each replicate?

n

Number of simulations

observation.name

Optional; name of the simulated observation
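
A minimal sketch of a typical call is given below. It assumes a compiled TMB template, here hypothetically named "mymodel", whose C++ code implements the joint negative log-likelihood together with SIMULATE blocks for the observations (and optionally the random effects); the data, parameter, and random-effect names are placeholders, not part of the package.

library(TMB)

# Hypothetical template 'mymodel.cpp' with simulation code for the data
compile("mymodel.cpp")
dyn.load(dynlib("mymodel"))

# 'data_list', 'par_list' and the random effect name "u" are placeholders
obj <- MakeADFun(data = data_list,
                 parameters = par_list,
                 random = "u",
                 DLL = "mymodel")

# Fit once so a 'best encountered parameter' is available
opt <- nlminb(obj$par, obj$fn, obj$gr)

# Simulate 100 data sets at the fitted parameter and check the gradients
chk <- checkConsistency(obj, n = 100)
summary(chk)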

Simulation/re-estimation

A full simulation/re-estimation study is performed when estimate=TRUE. By default, nlminb is used to perform the minimization, and the output is stored in a separate list component 'estimate' for each replicate. If a custom optimizer is needed, it can be passed as a user function via the same argument (estimate). The function (estimate) is called for each simulation as estimate(obj), where obj is the simulated model object. The current default corresponds to estimate = function(obj) nlminb(obj$par,obj$fn,obj$gr); see the sketch below.
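
For example, the default can be written out explicitly, or a different optimizer can be swapped in through the same argument; the BFGS settings below are illustrative only, not defaults of checkConsistency.

# Equivalent to the default re-estimation (estimate = TRUE)
chk <- checkConsistency(obj,
                        estimate = function(obj) nlminb(obj$par, obj$fn, obj$gr))

# Custom optimizer: re-estimate each replicate with optim/BFGS instead
chk <- checkConsistency(obj,
                        estimate = function(obj) optim(obj$par, obj$fn, obj$gr,
                                                       method = "BFGS"))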

Details

This function checks that the simulation code for the random effects and data is consistent with the implemented negative log-likelihood function. It also checks whether the approximate marginal score function is central, indicating whether the Laplace approximation is suitable for parameter estimation.

Denote by \(u\) the random effects, by \(\theta\) the parameters, and by \(x\) the data. The main assumption is that the user has implemented the joint negative log-likelihood \(f_{\theta}(u,x)\) satisfying $$\int \int \exp( -f_{\theta}(u,x) ) \:du\:dx = 1$$ It follows that the joint and marginal score functions are central:

  1. \(E_{u,x}\left[\nabla_{\theta}f_{\theta}(u,x)\right]=0\)

  2. \(E_{x}\left[\nabla_{\theta}\left(-\log \int \exp(-f_{\theta}(u,x))\:du \right)\right]=0\)

For each replicate of \(u\) and \(x\), joint and marginal gradients are calculated. Appropriate centrality tests are carried out by summary.checkConsistency. An asymptotic \(\chi^2\) test is used to verify the first identity; the power of this test increases with the number of simulations n. The second identity holds only approximately when the marginal likelihood is replaced by its Laplace approximation, so a formal test would eventually fail for large n. Instead, the gradient bias is transformed to the parameter scale (using the estimated information matrix) to provide an estimate of the parameter bias caused by the Laplace approximation.
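
To illustrate the first centrality test, the sketch below applies a standard asymptotic \(\chi^2\) (Hotelling-type) statistic to a matrix G of simulated joint score vectors, with one row per replicate and one column per element of \(\theta\). This mirrors the kind of test performed by summary.checkConsistency, but the helper function and the matrix G are purely hypothetical.

# G: hypothetical n-by-p matrix of joint score vectors, one row per replicate
score_test <- function(G) {
  n    <- nrow(G)
  gbar <- colMeans(G)   # mean score; should be close to zero if simulation is consistent
  S    <- cov(G)        # sampling covariance of the score
  stat <- n * drop(t(gbar) %*% solve(S, gbar))
  p.value <- pchisq(stat, df = ncol(G), lower.tail = FALSE)
  c(statistic = stat, p.value = p.value)
}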

See Also

summary.checkConsistency, print.checkConsistency