Last data update: 2014.03.03
Cross-Validation for Multi-class Boosting
Description
Cross-validated estimation of the empirical multi-class loss
for boosting parameter selection.
Usage
cv.mbst(x, y, balance=FALSE, K = 10, cost = NULL,
family = c("hinge","hinge2","thingeDC"),
learner = c("tree", "ls", "sm"), ctrl = bst_control(),
type = c("loss","error"), plot.it = TRUE, se = TRUE, n.cores=2, ...)
Arguments
x: a data frame containing the variables in the model.
y: vector of responses; y must take integer values from 1 to C for a C-class problem.
balance: logical value. If TRUE, the K folds are roughly balanced, ensuring that the classes are distributed proportionally among the K folds.
K: number of folds for K-fold cross-validation.
cost: price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1-cost.
family: family = "hinge" for hinge loss; "hinge2" is a different version of hinge loss.
learner: a character string specifying the component-wise base learner to be used: "ls" (linear models), "sm" (smoothing splines), or "tree" (regression trees).
ctrl: an object of class bst_control.
type: cross-validation criterion. type="loss" is the empirical loss of the chosen family (the hinge risk for family="hinge", and the corresponding loss for family="thingeDC"); type="error" is the misclassification error.
plot.it: a logical value; if TRUE, plot the estimated risks.
se: a logical value; if TRUE, plot with standard errors.
n.cores: the number of CPU cores to use. The cross-validation loop will attempt to send different CV folds off to different cores.
...: additional arguments.
Value
An object (a list) with the following components:
residmat: empirical risks for each cross-validation fold at each boosting iteration.
fraction: abscissa values at which the CV curve should be computed.
cv: the CV curve at each value of fraction.
cv.error: the standard error of the CV curve.
...
See Also
mbst
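Examples
A minimal sketch of calling cv.mbst on synthetic data; the data, seed, and settings below are illustrative (not from the package documentation), and only arguments listed in Usage are used.

```r
## Assumes the bst package is installed.
library(bst)

set.seed(1)
## Synthetic 3-class problem: 100 observations, 5 predictors.
x <- data.frame(matrix(rnorm(100 * 5), ncol = 5))
y <- sample(1:3, 100, replace = TRUE)   # responses must be integers 1..C

## 5-fold cross-validated hinge risk with tree base learners.
cvm <- cv.mbst(x, y, K = 5, family = "hinge", learner = "tree",
               type = "loss", plot.it = FALSE, n.cores = 1)

cvm$cv        # CV curve across boosting iterations
cvm$cv.error  # its standard errors
```

The minimizer of cvm$cv suggests the number of boosting iterations to use in a subsequent call to mbst.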