## S3 method for class 'ppp':
density(x, sigma, \dots,
        weights, edge=TRUE, varcov=NULL,
        at="pixels", leaveoneout=TRUE,
        adjust=1, diggle=FALSE)

Arguments:

x: Point pattern (object of class "ppp").

sigma: Standard deviation of the isotropic Gaussian smoothing kernel.

...: Arguments passed to as.mask to determine the pixel resolution.

weights: Optional numeric vector of weights attached to the points; weights may be negative or zero.

edge: Logical flag: if TRUE, apply edge correction.

varcov: Variance-covariance matrix of an anisotropic Gaussian kernel. Incompatible with sigma.

at: String specifying whether to compute the intensity values at a grid of pixel locations (at="pixels") or only at the points of x (at="points").

leaveoneout: Logical value indicating whether to compute a leave-one-out estimator. Applicable only when at="points".

adjust: Optional. Adjustment factor for the smoothing bandwidth.

diggle: Logical. If TRUE, use Diggle's edge correction, which is more accurate but slower to compute than the correction described under Details.

Value:

By default, the result is a pixel image (object of class "im").
Pixel values are estimated intensity values, expressed in "points per unit area".

If at="points", the result is a numeric vector
of length equal to the number of points in x
.
Values are estimated intensity values at the points of x
.
In either case, the return value has attributes
"sigma"
and "varcov"
which report the smoothing
bandwidth that was used.
Details:

This is a method for the generic function density.
It computes a fixed-bandwidth kernel estimate
(Diggle, 1985) of the intensity function of the point process
that generated the point pattern x.
By default it computes the convolution of the
isotropic Gaussian kernel of standard deviation sigma
with point masses at each of the data points in x
.
Anisotropic Gaussian kernels are also supported.
Each point has unit weight, unless the argument weights
is
given (it should be a numeric vector; weights can be negative or zero).
If edge=TRUE, the intensity estimate is corrected for
edge effect bias in one of two ways:

If diggle=FALSE (the default) the intensity estimate is
corrected by dividing it by the convolution of the
Gaussian kernel with the window of observation.
Thus the intensity value at a point $u$ is
$$\hat\lambda(u) = e(u) \sum_i k(x_i - u) w_i$$
where $k$ is the Gaussian smoothing kernel, $e(u)$ is an edge correction factor,
and $w_i$ are the weights.

If diggle=TRUE then the method of Diggle (1985)
is followed exactly.
The intensity value at a point $u$ is
$$\hat\lambda(u) = \sum_i k(x_i - u) w_i e(x_i)$$
where again $k$ is the Gaussian smoothing kernel, $e(x_i)$ is an edge correction factor,
and $w_i$ are the weights.
This computation is slightly slower but more accurate.

The smoothing kernel is determined by the arguments
sigma, varcov and adjust.
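The two edge corrections above can be sketched in base R for unit-weight points in the unit square. The helper names (gauss2, kernel_mass, lambda_uniform, lambda_diggle) are illustrative, not spatstat internals, and the edge factor $e(u)$ is taken here as the reciprocal of the kernel mass falling inside the window:

```r
# Isotropic Gaussian kernel evaluated at offset (dx, dy)
gauss2 <- function(dx, dy, sigma) {
  exp(-(dx^2 + dy^2) / (2 * sigma^2)) / (2 * pi * sigma^2)
}

# 1/e(u): mass of the kernel centred at (ux, uy) inside [0,1]^2,
# computed from the 1-D Gaussian CDF (the kernel is separable)
kernel_mass <- function(ux, uy, sigma) {
  (pnorm(1, ux, sigma) - pnorm(0, ux, sigma)) *
  (pnorm(1, uy, sigma) - pnorm(0, uy, sigma))
}

# diggle=FALSE: divide the raw estimate at u by the kernel mass at u
lambda_uniform <- function(ux, uy, px, py, w, sigma) {
  sum(w * gauss2(px - ux, py - uy, sigma)) / kernel_mass(ux, uy, sigma)
}

# diggle=TRUE: weight each data point by its own edge factor e(x_i)
lambda_diggle <- function(ux, uy, px, py, w, sigma) {
  sum(w * gauss2(px - ux, py - uy, sigma) / kernel_mass(px, py, sigma))
}

set.seed(1)
px <- runif(50); py <- runif(50); w <- rep(1, 50)
lambda_uniform(0.5, 0.5, px, py, w, sigma = 0.1)
lambda_diggle(0.5, 0.5, px, py, w, sigma = 0.1)
```

Far from the window boundary the two corrections nearly coincide, since the edge factors of the contributing points are close to 1; they diverge near the boundary.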
If sigma is a single numerical value,
this is taken as the standard deviation of the isotropic Gaussian
kernel.

Alternatively sigma may be a function that computes
an appropriate bandwidth for the isotropic Gaussian kernel
from the data point pattern by calling sigma(x).
To perform automatic bandwidth selection using cross-validation,
it is recommended to use the function bw.diggle.

The smoothing kernel may be made anisotropic by giving its
variance-covariance matrix as the argument varcov.
The arguments sigma and varcov are incompatible.

Alternatively sigma may be a vector of length 2 giving the
standard deviations of two independent Gaussian coordinates,
thus equivalent to varcov = diag(sigma^2).

If neither sigma nor varcov is specified,
an isotropic Gaussian kernel will be used,
with a default value of sigma calculated by a simple rule of thumb
that depends only on the size of the window.

The argument adjust makes it easy for the user to change the
bandwidth specified by any of the rules above.
The value of sigma will be multiplied by
the factor adjust. The matrix varcov will be
multiplied by adjust^2. To double the smoothing bandwidth, set adjust=2.

If at="pixels" (the default), intensity values are computed
at a grid of pixel locations, with the pixel resolution controlled
by the arguments ... passed to as.mask.

If at="points", the intensity values are computed
to high accuracy at the points of x only. Computation is
performed by directly evaluating and summing the Gaussian kernel
contributions without discretising the data. The result is a numeric
vector giving the density values.
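The equivalence noted above between a length-2 sigma and a diagonal variance-covariance matrix can be checked directly in base R; dgauss2 is an illustrative helper, not part of the spatstat API:

```r
# Density of a zero-mean bivariate Gaussian kernel at offset (dx, dy)
# with variance-covariance matrix V (illustrative helper only)
dgauss2 <- function(dx, dy, V) {
  q <- solve(V, c(dx, dy))                    # V^{-1} %*% (dx, dy)
  exp(-0.5 * sum(c(dx, dy) * q)) / (2 * pi * sqrt(det(V)))
}

sigma <- c(0.05, 0.07)   # standard deviations of the two coordinates
V <- diag(sigma^2)       # the equivalent diagonal varcov matrix

# for a diagonal varcov the kernel factorises into two 1-D Gaussians
dgauss2(0.02, -0.03, V)
dnorm(0.02, sd = 0.05) * dnorm(-0.03, sd = 0.07)
```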
The intensity value at a point $x_i$ is (if diggle=FALSE)
$$\hat\lambda(x_i) = e(x_i) \sum_j k(x_j - x_i) w_j$$
or (if diggle=TRUE)
$$\hat\lambda(x_i) = \sum_j k(x_j - x_i) w_j e(x_j)$$
If leaveoneout=TRUE
(the default), then the sum in the equation
is taken over all $j$ not equal to $i$,
so that the intensity value at a
data point is the sum of kernel contributions from
all other data points.
If leaveoneout=FALSE
then the sum is taken over all $j$,
so that the intensity value at a data point includes a contribution
from the same point.
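The leave-one-out toggle can be sketched in base R using the uncorrected, unit-weight estimator; density_at_points is an illustrative helper, not the spatstat function:

```r
# Isotropic Gaussian kernel evaluated at offset (dx, dy)
gauss2 <- function(dx, dy, sigma) {
  exp(-(dx^2 + dy^2) / (2 * sigma^2)) / (2 * pi * sigma^2)
}

# Intensity at each data point, summing kernel contributions directly
density_at_points <- function(px, py, sigma, leaveoneout = TRUE) {
  n <- length(px)
  # n x n matrix of contributions K[i, j] = k(x_j - x_i)
  K <- outer(seq_len(n), seq_len(n), function(i, j)
    gauss2(px[j] - px[i], py[j] - py[i], sigma))
  if (leaveoneout) diag(K) <- 0  # drop each point's own contribution
  rowSums(K)
}

set.seed(1)
px <- runif(30); py <- runif(30)
loo  <- density_at_points(px, py, 0.1, leaveoneout = TRUE)
full <- density_at_points(px, py, 0.1, leaveoneout = FALSE)
# each value differs by exactly the kernel's value at the origin
all.equal(full - loo, rep(gauss2(0, 0, 0.1), 30))
```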
To select the bandwidth sigma
automatically by
cross-validation, use bw.diggle
.
To perform spatial interpolation of values that were observed
at the points of a point pattern, use smooth.ppp
.
For adaptive nonparametric estimation, see
adaptive.density
.
For data sharpening, see sharpen.ppp
.
To compute a relative risk surface or probability map for
two (or more) types of points, use relrisk
.
References:

Diggle, P.J. (1985) A kernel method for smoothing point process data. Applied Statistics (Journal of the Royal Statistical Society, Series C) 34, 138--147.

Diggle, P.J. (2003) Statistical Analysis of Spatial Point Patterns, Second edition. Arnold.
See Also:

bw.diggle, smooth.ppp, sharpen.ppp, adaptive.density, relrisk, ppp.object, im.object
Examples:

data(cells)
if(interactive()) {
opa <- par(mfrow=c(1,2))
plot(density(cells, 0.05))
plot(density(cells, 0.05, diggle=TRUE))
par(opa)
v <- diag(c(0.05, 0.07)^2)
plot(density(cells, varcov=v))
}
# automatic bandwidth selection
plot(density(cells, sigma=bw.diggle(cells)))
# equivalent:
plot(density(cells, bw.diggle))
# evaluate intensity at points
density(cells, 0.05, at="points")