
ks (version 1.3.2)

kde: Kernel density estimate for multivariate data

Description

Kernel density estimate for 2- to 6-dimensional data

Usage

kde(x, H, gridsize, supp=3.7, eval.points, eval.levels)

Arguments

x
matrix of data values
H
bandwidth matrix
gridsize
vector of number of grid points
supp
effective support for the standard normal is [-supp, supp]
eval.points
points at which the density estimate is evaluated (required for dimensions > 3)
eval.levels
levels at which to draw the level surfaces for 3-dimensional data

Value

The kernel density estimate is an object of class kde, which is a list with 4 fields:
  • x - data points, same as the input
  • eval.points - points at which the density estimate is evaluated
  • estimate - density estimate at eval.points
  • H - bandwidth matrix
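
A minimal sketch of reading these fields, assuming the bivariate unicef setup from the Examples below:

library(ks)
data(unicef)
fhat <- kde(unicef, Hpi(unicef, nstage=1))
names(fhat)   # the four fields listed above: x, eval.points, estimate, H
fhat$H        # bandwidth matrix used for the estimate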

Details

The kernel density estimate is computed exactly, i.e. binning is not used. If gridsize is not set, it defaults to 50 grid points in each co-ordinate direction, i.e. rep(50, d). It need not be set if eval.points is specified.

If eval.points is not specified, then the density estimate is automatically computed over a grid whose resolution is controlled by gridsize (a grid is required for plotting).
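
As a minimal sketch of the two evaluation modes (grid resolution set via gridsize, or user-supplied eval.points), using the unicef data from the Examples below:

library(ks)
data(unicef)
H <- Hpi(unicef, nstage=1)
## grid evaluation: 100 x 100 grid instead of the default rep(50, 2)
fhat.grid <- kde(unicef, H, gridsize=c(100,100))
## point evaluation: density estimated at the data points themselves
fhat.pts <- kde(unicef, H, eval.points=unicef)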

References

Wand, M.P. & Jones, M.C. (1995) Kernel Smoothing. Chapman & Hall, London.

See Also

plot.kde

Examples

### bivariate example
data(unicef)
H.pi <- Hpi(unicef, nstage=1)
fhat <- kde(unicef, H.pi)
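## the bivariate estimate can be displayed with plot.kde (see See Also);
## a minimal sketch using the default display settings
plot(fhat)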

### trivariate example
mus <- rbind(c(0,0,0), c(2,2,2))
Sigma <- matrix(c(1, 0.7, 0.7, 0.7, 1, 0.7, 0.7, 0.7, 1), nrow=3, ncol=3)
Sigmas <- rbind(Sigma, Sigma)
props <- c(1/2, 1/2)
x <- rmvnorm.mixt(n=100, mus=mus, Sigmas=Sigmas, props=props)
H.pi <- Hpi(x)
fhat <- kde(x, H.pi, eval.levels=seq(-3, 3, length=9))

### 4-variate example
library(MASS)
data(iris)
ir <- iris[,1:4][iris[,5]=="setosa",]
H.scv <- Hscv(ir)
fhat <- kde(ir, H.scv, eval.points=ir)
