arima(x, order = c(0L, 0L, 0L),
seasonal = list(order = c(0L, 0L, 0L), period = NA),
xreg = NULL, include.mean = TRUE,
transform.pars = TRUE,
fixed = NULL, init = NULL,
method = c("CSS-ML", "ML", "CSS"), n.cond,
SSinit = c("Gardner1980", "Rossignol2011"),
optim.method = "BFGS",
optim.control = list(), kappa = 1e6)
Arguments:

seasonal: a specification of the seasonal part of the ARIMA model, plus the period (which defaults to frequency(x)). This should be a list with components order and period, but a specification of just a numeric vector of length 3 will be turned into a suitable list with the specification as the order.

xreg: optionally, a vector or matrix of external regressors, which must have the same number of rows as x.

include.mean: should the ARMA model include a mean/intercept term? The default is TRUE for undifferenced series, and it is ignored for ARIMA models with differencing.

transform.pars: logical; if true, the AR parameters are transformed to ensure that they remain in the region of stationarity. Not used for method = "CSS". For method = "ML", it has been advantageous to set transform.pars = FALSE in some cases, see also fixed.

fixed: optional numeric vector of the same length as the total number of parameters; only NA entries in fixed will be varied. transform.pars = TRUE will be overridden (with a warning) if any AR parameters are fixed. It may be wise to set transform.pars = FALSE when fixing MA parameters, especially near non-invertibility.

init: optional numeric vector of initial parameter values; values already specified in fixed will be ignored.

SSinit: a string specifying the algorithm to compute the state-space initialization of the likelihood; see KalmanLike for details. Can be abbreviated.

optim.method: the value passed as the method argument to optim.

optim.control: list of control parameters for optim.
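Two quick sketches of the behaviour described above, using data sets from the examples below (the particular orders and fixed values are illustrative only):

## 'seasonal' given as a plain length-3 vector is expanded to a list whose
## period defaults to frequency(x) (12 for the monthly USAccDeaths series),
## so these two calls should give the same fit:
f1 <- arima(USAccDeaths, order = c(0, 1, 1),
            seasonal = list(order = c(0, 1, 1), period = 12))
f2 <- arima(USAccDeaths, order = c(0, 1, 1), seasonal = c(0, 1, 1))
all.equal(coef(f1), coef(f2))

## 'fixed': NA entries stay free, the others are held at the given values.
## Here the MA(1) coefficient is pinned at 0 (coefficient order: ar1, ma1,
## intercept); transform.pars = FALSE as suggested when fixing MA terms.
arima(lh, order = c(1, 0, 1), fixed = c(NA, 0, NA), transform.pars = FALSE)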
."Arima"
with components:coef
method.coef
, which can be extracted by the vcov
method.method = "ML"
fits.x
.optim
.nobs()
and is used by
BIC
.KalmanLike
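A short sketch (not part of the original examples) of extracting these components with the usual accessor functions, using the lh series from the examples below:

fit <- arima(lh, order = c(1, 0, 0))
coef(fit)   # the 'coef' component: AR, MA and regression coefficients
vcov(fit)   # the 'var.coef' component: variance matrix of the coefficients
AIC(fit)    # the reported aic; the default CSS-ML method ends with an ML fit
nobs(fit)   # the 'nobs' component: observations actually used in the fitting
BIC(fit)    # uses nobs() together with the log-likelihood
fit$code    # convergence value returned by optim (0 indicates convergence)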
Details:

The exact likelihood is computed via a state-space representation of the ARIMA process, and the innovations and their variance found by a Kalman filter. The initialization of the differenced ARMA process uses stationarity and is based on Gardner et al. (1980). For a differenced process the non-stationary components are given a diffuse prior (controlled by kappa). Observations which are still controlled by the diffuse prior (determined by having a Kalman gain of at least 1e4) are excluded from the likelihood calculations.
(This gives comparable results to arima0
in the absence
of missing values, when the observations excluded are precisely those
dropped by the differencing.)

Missing values are allowed, and are handled exactly in method "ML".
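As an illustration of the parenthetical remark above (a sketch, not part of the original examples): with no missing values the observations excluded by the diffuse prior are exactly those lost to differencing, so arima() and arima0() give very similar results.

f  <- arima (USAccDeaths, order = c(0, 1, 1), seasonal = list(order = c(0, 1, 1)))
f0 <- arima0(USAccDeaths, order = c(0, 1, 1), seasonal = list(order = c(0, 1, 1)))
cbind(arima = coef(f), arima0 = f0$coef)   # coefficients agree closely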
If transform.pars
is true, the optimization is done using an
alternative parametrization which is a variation on that suggested by
Jones (1980) and ensures that the model is stationary. For an AR(p)
model the parametrization is via the inverse tanh of the partial
autocorrelations: the same procedure is applied (separately) to the
AR and seasonal AR terms. The MA terms are not constrained to be
invertible during optimization, but they will be converted to
invertible form after optimization if transform.pars
is true.
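The mapping itself is not exported, but a minimal R sketch of the idea (an illustration assuming the usual Levinson-Durbin recursion, not the internal C routine) looks like:

## Unconstrained parameters -> partial autocorrelations in (-1, 1) via tanh,
## then -> AR coefficients of a stationary model via the Levinson-Durbin
## recursion.  Any real-valued input therefore yields a stationary AR(p).
unconstrained_to_ar <- function(theta) {
  pac <- tanh(theta)                 # partial autocorrelations
  phi <- pac
  p <- length(pac)
  if (p > 1)
    for (k in 2:p)
      phi[1:(k - 1)] <- phi[1:(k - 1)] - pac[k] * rev(phi[1:(k - 1)])
  phi
}
unconstrained_to_ar(c(5, -3, 10))    # stationary AR(3) coefficients
## check: all roots of 1 - phi_1 z - ... - phi_p z^p lie outside the unit circle
abs(polyroot(c(1, -unconstrained_to_ar(c(5, -3, 10)))))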
Conditional sum-of-squares is provided mainly for expositional
purposes. This computes the sum of squares of the fitted innovations
from observation n.cond on (where n.cond is at least the maximum lag of an AR term), treating all earlier innovations as zero. Argument n.cond can be used to allow comparability between different fits. The 'part log-likelihood' is the first term, half the log of the estimated mean square. Missing values are allowed, but will cause many of the innovations to be missing.
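For example (a sketch using the lh series from the examples below; the value n.cond = 3 is illustrative, chosen to be at least the maximum AR lag of both fits), fixing n.cond makes conditional sums of squares comparable across AR orders:

## Both fits condition on the same first 3 observations.
css1 <- arima(lh, order = c(1, 0, 0), method = "CSS", n.cond = 3)
css3 <- arima(lh, order = c(3, 0, 0), method = "CSS", n.cond = 3)
c(css1$sigma2, css3$sigma2)   # innovation variances over the same observations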
When regressors are specified, they are orthogonalized prior to fitting unless any of the coefficients is fixed. It can be helpful to roughly scale the regressors to zero mean and unit variance.
Different definitions of ARMA models have different signs for the AR and/or MA coefficients. The definition used here has
$$X_t = a_1X_{t-1} + \cdots + a_pX_{t-p} + e_t + b_1e_{t-1} + \cdots + b_qe_{t-q}$$
and so the MA coefficients differ in sign from those of S-PLUS.
Further, if include.mean
is true (the default for an ARMA
model), this formula applies to $X - m$ rather than $X$. For
ARIMA models with differencing, the differenced series follows a
zero-mean ARMA model. If an xreg
term is included, a linear
regression (with a constant term if include.mean
is true and
there is no differencing) is fitted with an ARMA model for the error
term.
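A quick sketch of the sign convention (arima.sim uses the same parametrization, so the fitted ma1 should come back with the same sign as the simulated value):

set.seed(42)
y <- arima.sim(model = list(ma = 0.7), n = 500)
coef(arima(y, order = c(0, 0, 1)))   # ma1 estimated near +0.7, not -0.7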
The variance matrix of the estimates is found from the Hessian of the log-likelihood, and so may only be a rough guide.
Optimization is done by optim
. It will work
best if the columns in xreg
are roughly scaled to zero mean
and unit variance, but does attempt to estimate suitable scalings.
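For instance (a sketch; the LakeHuron example below subtracts 1920 from the regressor for the same reason), a regressor can be centred and scaled by hand before being passed as xreg:

tt <- scale(time(LakeHuron))                   # zero mean, unit variance
arima(LakeHuron, order = c(2, 0, 0), xreg = tt)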
References:

Durbin, J. and Koopman, S. J. (2001) Time Series Analysis by State Space Methods. Oxford University Press.

Gardner, G., Harvey, A. C. and Phillips, G. D. A. (1980) Algorithm AS 154. An algorithm for exact maximum likelihood estimation of autoregressive-moving average models by means of Kalman filtering. Applied Statistics, 29, 311--322.

Harvey, A. C. (1993) Time Series Models, 2nd Edition. Harvester Wheatsheaf. Sections 3.3 and 4.4.

Jones, R. H. (1980) Maximum likelihood fitting of ARMA models to time series with missing observations. Technometrics, 22, 389--395.

Ripley, B. D. (2002) Time series in R 1.5.0. R News, 2/2, 2--7.
See Also:

predict.Arima, arima.sim for simulating from an ARIMA model, tsdiag, arima0, ar.

Examples:
arima(lh, order = c(1,0,0))
arima(lh, order = c(3,0,0))
arima(lh, order = c(1,0,1))
arima(lh, order = c(3,0,0), method = "CSS")
arima(USAccDeaths, order = c(0,1,1), seasonal = list(order = c(0,1,1)))
arima(USAccDeaths, order = c(0,1,1), seasonal = list(order = c(0,1,1)),
method = "CSS") # drops first 13 observations.
# for a model with as few years as this, we want full ML
arima(LakeHuron, order = c(2,0,0), xreg = time(LakeHuron) - 1920)
## presidents contains NAs
## graphs in example(acf) suggest order 1 or 3
require(graphics)
(fit1 <- arima(presidents, c(1, 0, 0)))
nobs(fit1)
tsdiag(fit1)
(fit3 <- arima(presidents, c(3, 0, 0))) # smaller AIC
tsdiag(fit3)
BIC(fit1, fit3)
## compare a whole set of models; BIC() would choose the smallest
AIC(fit1, arima(presidents, c(2,0,0)),
arima(presidents, c(2,0,1)), # <- chosen (barely) by AIC
fit3, arima(presidents, c(3,0,1)))
## An example of ARIMA forecasting:
predict(fit3, 3)