The lssvm function is an implementation of the Least Squares SVM. lssvm includes a
reduced version of the Least Squares SVM which uses a decomposition of the
kernel matrix calculated by the csi function.
"lssvm"(x, data=NULL, ..., subset, na.action = na.omit, scaled = TRUE)
"lssvm"(x, ...)
"lssvm"(x, y, scaled = TRUE, kernel = "rbfdot", kpar = "automatic", type = NULL, tau = 0.01, reduced = TRUE, tol = 0.0001, rank = floor(dim(x)[1]/3), delta = 40, cross = 0, fit = TRUE, ..., subset, na.action = na.omit)
"lssvm"(x, y, type = NULL, tau = 0.01, tol = 0.0001, rank = floor(dim(x)[1]/3), delta = 40, cross = 0, fit = TRUE, ...)
"lssvm"(x, y, scaled = TRUE, kernel = "stringdot", kpar = list(length=4, lambda = 0.5), type = NULL, tau = 0.01, reduced = TRUE, tol = 0.0001, rank = floor(dim(x)[1]/3), delta = 40, cross = 0, fit = TRUE, ..., subset)kernelMatrix or a list of character vectors.x. Can be either
a factor (for classification tasks) or a numeric vector (for
classification or regression - currently nor supported -).scaled is of length 1, the value is recycled as
many times as needed and all non-binary variables are scaled.
Per default, data are scaled internally to zero mean and unit
variance. The center and scale values are returned and used for later predictions.y is a factor or not, the default
setting for type is "classification" or "regression" respectively,
but can be overwritten by setting an explicit value. (regression is
currently not supported)rbfdot Radial Basis kernel "Gaussian"
polydot Polynomial kernel
vanilladot Linear kernel
tanhdot Hyperbolic tangent kernel
laplacedot Laplacian kernel
besseldot Bessel kernel
anovadot ANOVA RBF kernel
splinedot Spline kernel
stringdot String kernel
Setting the kernel parameter to "matrix" treats x as a kernel
matrix calling the kernelMatrix interface.
The kernel parameter can also be set to a user defined function of
class kernel by passing the function name as an argument.
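For illustration, a minimal sketch of both options (the kernel choices and hyper-parameter values below are arbitrary and assume kernlab is attached):

data(iris)
## a built-in polynomial kernel with explicit hyper-parameters
lpoly <- lssvm(Species ~ ., data = iris, kernel = "polydot",
               kpar = list(degree = 2, scale = 1, offset = 1))
lpoly
## a user defined kernel object of class "kernel" passed directly
lapk <- laplacedot(sigma = 0.05)
llap <- lssvm(Species ~ ., data = iris, kernel = lapk)
llap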
kpar: the list of hyper-parameters (kernel parameters). This is a list which contains the parameters to be used with the kernel function. Valid parameters for existing kernels are:
sigma inverse kernel width for the Radial Basis
kernel function "rbfdot" and the Laplacian kernel "laplacedot".
degree, scale, offset for the Polynomial kernel "polydot"
scale, offset for the Hyperbolic tangent kernel
function "tanhdot"
sigma, order, degree for the Bessel kernel "besseldot".
sigma, degree for the ANOVA kernel "anovadot".
length, lambda, normalized for the "stringdot" kernel
where length is the length of the strings considered, lambda the
decay factor and normalized a logical parameter determining if the
kernel evaluations should be normalized.
Hyper-parameters for user defined kernels can be passed through the kpar parameter as well.
kpar can also be set to the string "automatic" which uses the heuristics in
sigest to calculate a good sigma value for the
Gaussian RBF or Laplace kernel, from the data. (default = "automatic").
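For example, an illustrative sketch of both settings (the explicit sigma value below is arbitrary):

data(iris)
## sigma chosen by the sigest heuristic (the default behaviour)
lauto <- lssvm(Species ~ ., data = iris, kernel = "rbfdot", kpar = "automatic")
## the same model with an explicitly fixed inverse kernel width
lfix <- lssvm(Species ~ ., data = iris, kernel = "rbfdot",
              kpar = list(sigma = 0.2))
lauto
lfix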
reduced: if set to FALSE the full linear problem of the lssvm is solved; when TRUE a reduced method using csi is used.

rank: the maximal rank of the decomposed kernel matrix, see csi.

delta: number of columns of the decomposition computed in advance, see csi (default 40).

tol: tolerance of termination criterion for the csi function; lower tolerance leads to a more precise approximation but may increase the training time and the decomposed matrix size (default: 0.0001).

na.action: a function to specify the action to be taken if NAs are
found. The default action is na.omit, which leads to rejection of cases
with missing values on any required variable. An alternative
is na.fail, which causes an error if NA cases
are found. (NOTE: If given, this argument must be named.)

Value:

An S4 object of class "lssvm" containing the fitted model.
Accessor functions can be used to access the slots of the object (see
examples) which include:
"lssvm"csi function, thus the solution is an approximation
to the exact solution of the lssvm optimization problem. The quality
of the solution depends on the approximation and can be influenced by
the "rank" , "delta", and "tol" parameters.
See Also:

ksvm, gausspr, csi

Examples:

## simple example
data(iris)
lir <- lssvm(Species~.,data=iris)
lir
lirr <- lssvm(Species~.,data= iris, reduced = FALSE)
lirr
## Using the kernelMatrix interface
iris <- unique(iris)
rbf <- rbfdot(0.5)
k <- kernelMatrix(rbf, as.matrix(iris[,-5]))
klir <- lssvm(k, iris[, 5])
klir
pre <- predict(klir, k)
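## A further illustrative sketch: compare predictions with the true labels
## (predictions from the kernelMatrix based model computed above)
table(pre, iris[, 5])
## predictions from the formula interface model fitted above
table(predict(lir, iris), iris[, 5])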