
LaplacesDemon (version 16.1.0)

dist.Normal.Precision: Normal Distribution: Precision Parameterization

Description

These functions provide the density, distribution function, quantile function, and random generation for the univariate normal distribution with mean \(\mu\) and precision \(\tau\).

Usage

dnormp(x, mean=0, prec=1, log=FALSE)
pnormp(q, mean=0, prec=1, lower.tail=TRUE, log.p=FALSE)
qnormp(p, mean=0, prec=1, lower.tail=TRUE, log.p=FALSE)
rnormp(n, mean=0, prec=1)

Arguments

x, q

These are each a vector of quantiles.

p

This is a vector of probabilities.

n

This is the number of observations, which must be a positive integer of length one.

mean

This is the mean parameter \(\mu\).

prec

This is the precision parameter \(\tau\), which must be positive.

log, log.p

Logical. If TRUE, then probabilities \(p\) are given as \(\log(p)\) (and dnormp returns the density on the log scale).

lower.tail

Logical. If TRUE (default), then probabilities are \(Pr[X \le x]\), otherwise, \(Pr[X > x]\).

Value

dnormp gives the density, pnormp gives the distribution function, qnormp gives the quantile function, and rnormp generates random deviates.
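
As a quick consistency check (a small sketch; the values below are arbitrary), pnormp and qnormp act as inverses of each other:

library(LaplacesDemon)
p <- pnormp(1, mean=0, prec=4)   # distribution function at q = 1
qnormp(p, mean=0, prec=4)        # recovers 1, up to numerical precision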

Details

  • Application: Continuous Univariate

  • Density: \(p(\theta) = \sqrt{\frac{\tau}{2\pi}} \exp(-\frac{\tau}{2} (\theta-\mu)^2)\)

  • Inventor: Carl Friedrich Gauss or Abraham de Moivre

  • Notation 1: \(\theta \sim \mathcal{N}(\mu, \tau^{-1})\)

  • Notation 2: \(p(\theta) = \mathcal{N}(\theta | \mu, \tau^{-1})\)

  • Parameter 1: mean parameter \(\mu\)

  • Parameter 2: precision parameter \(\tau > 0\)

  • Mean: \(E(\theta) = \mu\)

  • Variance: \(var(\theta) = \tau^{-1}\)

  • Mode: \(mode(\theta) = \mu\)

The normal distribution, also called the Gaussian distribution and the Second Law of Laplace, is usually parameterized with mean and variance, or in Bayesian inference, with mean and precision, where precision is the inverse of the variance. In contrast, Base R parameterizes the normal distribution with the mean and standard deviation. These functions provide the precision parameterization for convenience and familiarity.
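
Because the precision is the inverse of the variance, the relation to base R's standard-deviation parameterization is sd = 1/sqrt(prec). A small sketch checking this correspondence (the values of x and tau are illustrative):

library(LaplacesDemon)
x   <- seq(from=-3, to=3, by=0.5)
tau <- 4
all.equal(dnormp(x, mean=0, prec=tau),
          dnorm(x, mean=0, sd=1/sqrt(tau)))   # expected TRUE
set.seed(1)
var(rnormp(1e5, mean=0, prec=tau))            # approximately 1/tau = 0.25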

Some authors attribute credit for the normal distribution to Abraham de Moivre in 1738. In 1809, Carl Friedrich Gauss published his monograph "Theoria motus corporum coelestium in sectionibus conicis solem ambientium", in which he introduced the method of least squares, the method of maximum likelihood, and the normal distribution, among many other innovations.

Gauss himself characterized this distribution according to mean and precision, though his definition of precision differed from the modern one. The modern Bayesian use of precision \(\tau\) developed because it was more straightforward to estimate \(\tau\) with a gamma distribution as a conjugate prior than to estimate \(\sigma^2\) with an inverse-gamma distribution as a conjugate prior.
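
A minimal sketch of that conjugate update, assuming the mean \(\mu\) is known and a Gamma(a, b) prior (shape a, rate b) on \(\tau\); the hyperparameter values below are illustrative:

library(LaplacesDemon)
set.seed(1)
mu <- 0
y  <- rnormp(50, mean=mu, prec=2)   # simulated data with true tau = 2
a <- 1; b <- 1                       # illustrative prior hyperparameters
a.post <- a + length(y)/2            # posterior shape
b.post <- b + sum((y - mu)^2)/2      # posterior rate
a.post / b.post                      # posterior mean of tau, near 2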

Although the normal distribution is very common, it often does not fit data as well as more robust alternatives with fatter tails, such as the Laplace or Student t distribution.

A flat distribution is obtained in the limit as \(\tau \rightarrow 0\).
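
A minimal illustration of this limiting behavior (the precision value is arbitrary): with a very small \(\tau\), the density is nearly constant over any fixed range.

library(LaplacesDemon)
x <- c(-100, 0, 100)
dnormp(x, mean=0, prec=1e-8)   # approximately 4e-05 at all three points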

For models where the dependent variable, y, is specified to be normally distributed given the model, the Jarque-Bera test (see plot.demonoid.ppc or plot.laplace.ppc) may be used to test the residuals for normality.

These functions are similar to those in base R.

See Also

dlaplace, dnorm, dnormv, prec2var, dst, dt, plot.demonoid.ppc, and plot.laplace.ppc.

Examples

library(LaplacesDemon)
x <- dnormp(1,0,1)
x <- pnormp(1,0,1)
x <- qnormp(0.5,0,1)
x <- rnormp(100,0,1)

#Plot Probability Functions
x <- seq(from=-5, to=5, by=0.1)
plot(x, dnormp(x,0,0.5), ylim=c(0,1), type="l", main="Probability Function",
     ylab="density", col="red")
lines(x, dnormp(x,0,1), type="l", col="green")
lines(x, dnormp(x,0,5), type="l", col="blue")
legend(2, 0.9, expression(paste(mu==0, ", ", tau==0.5),
     paste(mu==0, ", ", tau==1), paste(mu==0, ", ", tau==5)),
     lty=c(1,1,1), col=c("red","green","blue"))