Cross Validation by Leave-One-Out for a gp Object
# S3 method for gp
influence(model, type = "UK", trend.reestim = TRUE, ...)
A list composed of the following elements, where n is the total number of observations.
Vector of length n. The \(i\)-th element is the kriging mean (including the trend) at the \(i\)-th observation when this observation is removed from the learning set.
Vector of length n. The \(i\)-th element is the kriging standard deviation at the \(i\)-th observation when this observation is removed from the learning set.
An object of class "gp".
Character string corresponding to the GP "kriging" family, to be chosen between simple kriging ("SK") and universal kriging ("UK").
Should the trend be re-estimated when removing an observation? Defaults to TRUE.
Not used.
Only the trend parameters are re-estimated when removing an observation; the covariance parameters are kept fixed. When the number \(n\) of observations is small, the re-estimated values can differ markedly from those obtained with the entire learning set.
O. Roustant, D. Ginsbourger.
Leave-one-out (LOO) consists in computing the prediction at a design point when the corresponding observation is removed from the learning set (and this, for all design points). A fast version of LOO based on Dubrule's formula is also implemented; it is limited to two cases: (type == "SK") & !trend.reestim and (type == "UK") & trend.reestim.
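As an illustration (a sketch, not this package's implementation), the fast LOO in the simple-kriging case with zero trend can be written with Dubrule's formula: with \(K\) the covariance matrix of the observations and \(Q = K^{-1}\), the LOO mean and variance at the \(i\)-th design point are \(y_i - (Qy)_i / Q_{ii}\) and \(1 / Q_{ii}\). The Python sketch below (function names are illustrative) computes these quantities and a naive refit-by-deletion reference; both return the vectors of LOO means and standard deviations described in the Value section.

```python
import numpy as np

def loo_dubrule(K, y):
    """Fast leave-one-out for simple kriging (zero trend) via Dubrule's formula.

    K : (n, n) covariance matrix of the observations (including any nugget).
    y : (n,) vector of observations.
    Returns (mean, sd): LOO kriging means and standard deviations.
    """
    Q = np.linalg.inv(K)       # precision matrix Q = K^{-1}
    d = np.diag(Q)
    mean = y - (Q @ y) / d     # LOO prediction at each left-out point
    sd = np.sqrt(1.0 / d)      # LOO kriging standard deviation
    return mean, sd

def loo_naive(K, y):
    """Reference: naive LOO, deleting one row/column of K at a time."""
    n = len(y)
    mean = np.empty(n)
    sd = np.empty(n)
    for i in range(n):
        idx = np.delete(np.arange(n), i)
        K_mi = K[np.ix_(idx, idx)]          # covariance without point i
        k_i = K[idx, i]                     # cross-covariances with point i
        w = np.linalg.solve(K_mi, k_i)      # simple-kriging weights
        mean[i] = w @ y[idx]
        sd[i] = np.sqrt(K[i, i] - k_i @ w)
    return mean, sd
```

The two functions agree to numerical precision, which is the point of Dubrule's formula: one \(n \times n\) inversion replaces \(n\) inversions of size \(n-1\).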
F. Bachoc (2013), "Cross Validation and Maximum Likelihood estimations of hyper-parameters of Gaussian processes with model misspecification". Computational Statistics and Data Analysis, 66, 55-69.
N.A.C. Cressie (1993), Statistics for spatial data. Wiley series in probability and mathematical statistics.
O. Dubrule (1983), "Cross validation of Kriging in a unique neighborhood". Mathematical Geology, 15, 687-699.
J.D. Martin and T.W. Simpson (2005), "Use of kriging models to approximate deterministic computer models". AIAA Journal, 43 no. 4, 853-863.
M. Schonlau (1997), Computer experiments and global optimization. Ph.D. thesis, University of Waterloo.
predict.gp, plot.gp