Demonstrate the process of k-Nearest Neighbour classification on the 2D plane.
knn.ani(
train,
test,
cl,
k = 10,
interact = FALSE,
tt.col = c("blue", "red"),
cl.pch = seq_along(unique(cl)),
dist.lty = 2,
dist.col = "gray",
knn.col = "green",
...
)
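A minimal usage sketch, assuming the animation package is installed; the synthetic data, class labels, and the choice k = 5 below are made up for illustration:

```r
## Synthetic two-class training data on the plane (2 columns, as required)
set.seed(1)
train <- rbind(matrix(rnorm(40, mean = 0), ncol = 2),
               matrix(rnorm(40, mean = 2), ncol = 2))
cl <- factor(rep(c("A", "B"), each = 20))
## A small test set, also with exactly 2 columns
test <- matrix(rnorm(10, mean = 1), ncol = 2)
if (requireNamespace("animation", quietly = TRUE)) {
  animation::ani.options(nmax = nrow(test))
  animation::knn.ani(train, test, cl, k = 5)
}
```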
train: matrix or data frame of training set cases; it must contain exactly 2 columns.
test: matrix or data frame of test set cases; a vector is interpreted as a row vector for a single case. It must also contain exactly 2 columns. This data set is ignored if interact = TRUE; see interact below.
cl: factor of true classifications of the training set.
k: number of neighbours considered.
interact: logical. If TRUE, the user chooses a test set interactively by clicking on the screen; otherwise the kNN classification is computed from the argument test.
tt.col: a vector of length 2 specifying the colors for the training data and test data.
cl.pch: a vector of plotting symbols, one for each class.
dist.lty, dist.col: the line type and color used to annotate the distances.
knn.col: the color used to annotate the k-nearest-neighbour points with a polygon.
...: additional arguments used to create the empty frame for the animation (passed to plot.default).
A vector of class labels for the test set.
For each row of the test set, the \(k\) nearest (in Euclidean distance) training set vectors are found, and the classification is decided by majority vote, with ties broken at random. For a single test point, the basic steps are:

1. locate the test point;
2. compute the distances between the test point and all points in the training set;
3. find the \(k\) shortest distances and the corresponding training set points;
4. vote for the result (find the maximum in the table of true classifications).
As each iteration consists of four steps, the total number of animation frames is 4 * min(nrow(test), ani.options('nmax')) in the end.
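The four steps can be sketched in base R for a single test point. This is a hedged illustration of the classification logic the animation walks through, not the package's internal code; the data and the test point x are made up:

```r
set.seed(42)
train <- rbind(matrix(rnorm(20, mean = 0), ncol = 2),
               matrix(rnorm(20, mean = 2), ncol = 2))
cl <- factor(rep(c("A", "B"), each = 10))
k <- 5
x <- c(1.8, 2.1)                       # step 1: locate the test point
## step 2: Euclidean distances from x to every training point
d <- sqrt(rowSums(sweep(train, 2, x)^2))
## step 3: the k shortest distances and the corresponding points
nn <- order(d)[1:k]
## step 4: majority vote in the table of true classifications,
## with ties broken at random
votes <- table(cl[nn])
winners <- names(votes)[votes == max(votes)]
pred <- sample(winners, 1)
```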
Examples at https://yihui.org/animation/example/knn-ani/
Venables, W. N. and Ripley, B. D. (2002) Modern Applied Statistics with S. Fourth edition. Springer.