Fuses a base learner with a multi-class method. Creates a learner object, which can be used like any other learner object. This way, learners that can only handle binary classification become able to handle multi-class problems as well.
We use a multiclass-to-binary reduction principle, in which multiple binary problems are created from the multi-class task. How these binary problems are generated is defined by an error-correcting output code (ECOC) codebook. This also covers the simple and well-known one-vs-one and one-vs-rest approaches. Decoding is currently done via Hamming decoding; see, e.g., Escalera et al. (2010): http://jmlr.org/papers/volume11/escalera10a/escalera10a.pdf.
Currently, the approach always operates on the discrete predicted labels of the binary base models (instead of their probabilities), and the created wrapper cannot predict posterior probabilities.
makeMulticlassWrapper(learner, mcw.method = "onevsrest")
learner
(Learner | character(1))
The learner. If you pass a string, the learner will be created via makeLearner.
mcw.method
(character(1) | function)
“onevsone” or “onevsrest”. You can also pass a function with signature function(task) that returns an ECOC codematrix with entries +1, -1, 0. Columns define new binary problems, rows correspond to classes (rows must be named). A 0 means the class is not included in the binary problem. Default is “onevsrest”.
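For illustration, here is a minimal sketch of such a codematrix function. It rebuilds the one-vs-one scheme by hand (so it should behave like mcw.method = “onevsone”) and assumes the class levels can be read from the task via getTaskClassLevels; the name ovo.codematrix is only an example.

library(mlr)

# Minimal sketch of a codematrix function (equivalent to mcw.method = "onevsone").
# Any function(task) returning a codematrix with entries +1, -1, 0 and named rows works.
ovo.codematrix = function(task) {
  classes = getTaskClassLevels(task)
  pairs = combn(classes, 2)                       # one column per pair of classes
  cm = matrix(0, nrow = length(classes), ncol = ncol(pairs),
    dimnames = list(classes, NULL))               # rows must be named after the classes
  for (j in seq_len(ncol(pairs))) {
    cm[pairs[1, j], j] = 1                        # positive class of binary problem j
    cm[pairs[2, j], j] = -1                       # negative class of binary problem j
  }
  cm                                              # remaining entries stay 0: class excluded
}

lrn = makeMulticlassWrapper("classif.logreg", mcw.method = ovo.codematrix)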
Other wrapper: makeBaggingWrapper, makeClassificationViaRegressionWrapper, makeConstantClassWrapper, makeCostSensClassifWrapper, makeCostSensRegrWrapper, makeDownsampleWrapper, makeDummyFeaturesWrapper, makeExtractFDAFeatsWrapper, makeFeatSelWrapper, makeFilterWrapper, makeImputeWrapper, makeMultilabelBinaryRelevanceWrapper, makeMultilabelClassifierChainsWrapper, makeMultilabelDBRWrapper, makeMultilabelNestedStackingWrapper, makeMultilabelStackingWrapper, makeOverBaggingWrapper, makePreprocWrapperCaret, makePreprocWrapper, makeRemoveConstantFeaturesWrapper, makeSMOTEWrapper, makeTuneWrapper, makeUndersampleWrapper, makeWeightedClassesWrapper
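As an illustration of the basic workflow described above, the following minimal sketch wraps classif.logreg (a binary-only learner) so it can handle the three classes of the bundled iris.task with the one-vs-rest default; the wrapped learner is then trained and used for prediction like any other learner.

library(mlr)

# Minimal usage sketch: classif.logreg handles only two classes, so we wrap it
# to solve the three-class iris.task via one-vs-rest reduction.
lrn = makeMulticlassWrapper("classif.logreg", mcw.method = "onevsrest")
mod = train(lrn, iris.task)
pred = predict(mod, task = iris.task)
performance(pred, measures = mmce)   # mean misclassification error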