Hamiltonian Monte Carlo (HMC), also called Hybrid Monte Carlo, is a sampling algorithm that uses Hamiltonian dynamics to approximate a posterior distribution. Unlike MH and MC3, HMC uses not only the current position but also a sense of momentum to draw future samples. An introduction to HMC can be found in Betancourt (2018).
sampler_hmc(
start,
distr_name = NULL,
distr_params = NULL,
epsilon = 0.5,
L = 10,
iterations = 1024,
weights = NULL,
custom_density = NULL
)
A named list containing
Samples: the history of visited places (an n x d matrix, n = iterations; d = dimensions)
Momentums: the history of momentum values (an n x d matrix, n = iterations; d = dimensions). Nothing is proposed in the first iteration (the first iteration is the start value), so the first row is NA.
Acceptance Ratio: The proportion of proposals that were accepted.
Vector. Starting position of the sampler.
Name of the distribution from which to sample.
Distribution parameters.
Size of the leapfrog step.
Number of leapfrog steps per iteration.
Number of iterations of the sampler.
If using a mixture distribution, the weights given to each constituent distribution. If none are given, it defaults to equal weights for all distributions.
Instead of providing names, params and weights, the user may prefer to provide a custom density function.
This implementation assumes that the momentum is drawn from a normal distribution with mean 0 and identity covariance matrix (p ~ N(0, I)). Hamiltonian Monte Carlo does not support discrete distributions.
This algorithm has been used to model human data in Aitchison et al. (2016), Castillo et al. (2024) and Zhu et al. (2022), among others.
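For intuition, one HMC transition of the kind described above can be sketched in plain R. This is an illustrative leapfrog integrator with p ~ N(0, I), not the samplr implementation; the function names `log_d` and `grad_log_d` (a log-density and its gradient) are hypothetical stand-ins supplied by the user.

```r
# One HMC transition: draw a momentum, simulate Hamiltonian dynamics with
# L leapfrog steps of size epsilon, then accept or reject the endpoint.
hmc_step <- function(x, log_d, grad_log_d, epsilon, L) {
  p <- rnorm(length(x))                            # momentum ~ N(0, I)
  x_new <- x
  p_new <- p + (epsilon / 2) * grad_log_d(x_new)   # half step for momentum
  for (i in seq_len(L)) {
    x_new <- x_new + epsilon * p_new               # full step for position
    if (i < L) p_new <- p_new + epsilon * grad_log_d(x_new)
  }
  p_new <- p_new + (epsilon / 2) * grad_log_d(x_new)  # final half step
  # Metropolis correction on the joint (position, momentum) density
  log_accept <- log_d(x_new) - log_d(x) - sum(p_new^2) / 2 + sum(p^2) / 2
  if (log(runif(1)) < log_accept) x_new else x
}
```

Because the momenta enter the acceptance ratio through the kinetic-energy terms sum(p^2)/2, the chain targets the intended density even though each proposal moves far from the current position.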
result <- sampler_hmc(
distr_name = "norm", distr_params = c(0,1),
start = 1, epsilon = .01, L = 100
)
cold_chain <- result$Samples
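The other documented return fields can be inspected the same way; note that the acceptance-ratio element's name contains a space, so it needs backticks. (The summary calls below are illustrative, not part of the samplr API.)

```r
# Inspect the sampler output using the documented field names
mean(cold_chain)             # sample mean; the target here is N(0, 1)
head(result$Momentums)       # first row is NA: nothing proposed at the start
result$`Acceptance Ratio`    # proportion of proposals that were accepted
```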