Plot a decision tree obtained by CART.
cartplot(
  model,
  margin = 0.2,
  branch = 0.3,
  uniform = TRUE,
  fancy = TRUE,
  pretty = TRUE,
  fwidth = 0,
  fheight = 0,
  ...
)
model: the decision tree to plot, as returned by CART.
margin: an extra fraction of white space to leave around the borders of the tree (long labels sometimes get cut off by the default computation).
branch: controls the shape of the branches from parent to child node. Any number from 0 to 1 is allowed: a value of 1 gives square-shouldered branches, a value of 0 gives V-shaped branches, and intermediate values give intermediate shapes.
uniform: if TRUE, uniform vertical spacing of the nodes is used; this may be less cluttered when fitting a large plot onto a page. The default is to use non-uniform spacing proportional to the error in the fit.
fancy: logical. If TRUE, nodes are represented by ellipses (interior nodes) and rectangles (leaves) and are labeled by yval. The edges connecting the nodes are labeled by the left and right splits (see the second example below).
pretty: an alternative to the minlength argument; see labels.rpart.
fwidth: relates to option fancy and the width of the ellipses and rectangles. If fwidth < 1, it is a scaling factor (default = 0.8). If fwidth > 1, it represents the number of character widths (for the current graphical device) to use.
fheight: relates to option fancy and the height of the ellipses and rectangles. If fheight < 1, it is a scaling factor (default = 0.8). If fheight > 1, it represents the number of character heights (for the current graphical device) to use.
...: other parameters.
See also: CART, cartdepth, cartinfo, cartleafs, cartnodes.
require(datasets)   # iris ships with the base datasets package
data(iris)
model <- CART(iris[, -5], iris[, 5])   # features in columns 1-4, species as the response
cartplot(model)
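The sketch below reuses the fitted model to illustrate some of the layout arguments; the specific values (branch = 0, margin = 0.3, fwidth = 0.9, fheight = 0.9) are arbitrary illustrations, not recommended settings.
# V-shaped branches, uniform vertical spacing, and extra margin so long labels fit
cartplot(model, branch = 0, uniform = TRUE, margin = 0.3)
# Fancy nodes (ellipses and rectangles) with slightly larger boxes;
# fwidth/fheight values below 1 act as scaling factors
cartplot(model, fancy = TRUE, fwidth = 0.9, fheight = 0.9)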