objects
Matrix containing objects to be classified; each row is one d-dimensional object.
ddalpha
DDα-classifier (obtained by ddalpha.train).
outsider.method
Character string, the name of the outsider treatment to be used; it must be one of those trained by ddalpha.train. If several treatments were specified via the argument outsider.methods of ddalpha.train, give the name of the desired method here.
use.convex
Logical variable indicating how outsiders are determined: if TRUE, outsiders are the points not contained in any of the convex hulls of the classes from the training sample; if FALSE, they are the points having zero depth w.r.t. each class from the training sample. For depth = "zonoid" both values give the same result. If NULL, the value specified when training the DDα-classifier (in ddalpha.train) is used.
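The two arguments above are selected per classification call. A minimal sketch of overriding the outsider definition at classification time, assuming a classifier ddalpha already obtained by ddalpha.train and a hypothetical numeric matrix newObjects with one d-dimensional object per row:

```r
# Sketch: the same trained classifier, two outsider definitions.
classesHull  <- ddalpha.classify(newObjects, ddalpha,
                                 use.convex = TRUE)   # outsiders: outside all class convex hulls
classesDepth <- ddalpha.classify(newObjects, ddalpha,
                                 use.convex = FALSE)  # outsiders: zero depth w.r.t. each class
# For depth = "zonoid" both calls flag the same points as outsiders.
```
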
Details
Only one outsider treatment can be specified.
See Lange, Mosler and Mozharovskyi (2014) for details and additional information.
Value
List containing the class labels; entries for outsiders are the character string "Ignored" if "Ignore" was chosen as the outsider treatment method.
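When the "Ignore" treatment is used, the returned list mixes class labels with the string "Ignored", so one may want to separate the two before computing error rates. A sketch, with classes as returned by ddalpha.classify (the call assumes "Ignore" was among the treatments trained by ddalpha.train):

```r
classes    <- ddalpha.classify(newObjects, ddalpha, outsider.method = "Ignore")
labels     <- unlist(classes)
isOutsider <- (labels == "Ignored")     # entries left unclassified
table(labels[!isOutsider])              # class counts among the classified points
```
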
References
Mozharovskyi, P. (2015), Contributions to Depth-based Classification and Computation of the Tukey Depth. Verlag Dr. Kovac (Hamburg).
Dyckerhoff, R., Koshevoy, G. and Mosler, K. (1996), Zonoid data depth: theory and computation. In: Prat A. (ed), COMPSTAT 1996. Proceedings in computational statistics, Physica-Verlag (Heidelberg), 235–240.
Lange, T., Mosler, K. and Mozharovskyi, P. (2014), Fast nonparametric classification based on data depth, Statistical Papers, 55, 49–69.
See Also
ddalpha.train to train the DDα-classifier.
Examples
# Generate a bivariate normal location-shift classification task
# containing 200 training objects and 200 to test with
library(MASS)  # for mvrnorm
class1 <- mvrnorm(200, c(0,0),
                  matrix(c(1,1,1,4), nrow = 2, ncol = 2, byrow = TRUE))
class2 <- mvrnorm(200, c(2,2),
                  matrix(c(1,1,1,4), nrow = 2, ncol = 2, byrow = TRUE))
trainIndices <- 1:100
testIndices <- 101:200
propertyVars <- 1:2
classVar <- 3
trainData <- rbind(cbind(class1[trainIndices,], rep(1, 100)),
                   cbind(class2[trainIndices,], rep(2, 100)))
testData <- rbind(cbind(class1[testIndices,], rep(1, 100)),
                  cbind(class2[testIndices,], rep(2, 100)))
data <- list(train = trainData, test = testData)
# Train the DDalpha-Classifier (zonoid depth, maximum Mahalanobis depth
# classifier with defaults as outsider treatment)
ddalpha <- ddalpha.train(data$train,
                         depth = "zonoid",
                         outsider.methods = "depth.Mahalanobis")
# Get the classification error rate
classes <- ddalpha.classify(data$test[,propertyVars], ddalpha,
                            outsider.method = "depth.Mahalanobis")
cat("Classification error rate: ",
    sum(unlist(classes) != data$test[,classVar])/200, ".\n", sep = "")