Improved boosting algorithms using confidence-rated predictions

RE Schapire, Y Singer - Proceedings of the eleventh annual conference on Computational learning theory, 1998 - dl.acm.org
Abstract
We describe several improvements to Freund and Schapire’s AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns out to be identical to one proposed by Kearns and Mansour.
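The refined criterion mentioned above can be illustrated with a minimal NumPy sketch of confidence-rated boosting: each weak hypothesis is real-valued, and both the per-block confidences and the choice of weak hypothesis are driven by minimizing the round's normalizer Z. The threshold-stump weak learner, the candidate list, and the smoothing constant `eps` here are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def confidence_rated_stump(X, y, D, feature, threshold, eps=1e-6):
    """A real-valued weak hypothesis: a threshold split on one feature.
    Each block j predicts c_j = 1/2 ln(W_+^j / W_-^j), the confidence
    that minimizes the normalizer Z (smoothed by eps to avoid log(0))."""
    mask = X[:, feature] <= threshold
    preds = np.empty(len(y))
    for block in (mask, ~mask):
        w_pos = D[block & (y == 1)].sum()   # weight of positives in block
        w_neg = D[block & (y == -1)].sum()  # weight of negatives in block
        preds[block] = 0.5 * np.log((w_pos + eps) / (w_neg + eps))
    return preds

def boost(X, y, rounds, candidates):
    """Confidence-rated boosting sketch: each round, pick the candidate
    (feature, threshold) whose stump minimizes Z = sum_i D(i) exp(-y_i h(x_i)),
    then reweight examples by exp(-y h) / Z."""
    n = len(y)
    D = np.full(n, 1.0 / n)
    F = np.zeros(n)                       # combined prediction sum_t h_t(x)
    for _ in range(rounds):
        best = min(candidates, key=lambda ft:
                   (D * np.exp(-y * confidence_rated_stump(X, y, D, *ft))).sum())
        h = confidence_rated_stump(X, y, D, *best)
        Z = (D * np.exp(-y * h)).sum()
        D = D * np.exp(-y * h) / Z        # new distribution over examples
        F += h
    return np.sign(F)                     # final classifier sign(sum_t h_t)
```

Note that the weak hypothesis's confidence is folded into its output, so no separate alpha weight is needed: minimizing Z already yields the optimal per-block values.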
We focus next on how to apply the new boosting algorithms to multiclass classification problems, particularly to the multi-label case in which each example may belong to more than one class. We give two boosting methods for this problem. One of these leads to a new method for handling the single-label case which is simpler but as effective as techniques suggested by Freund and Schapire. Finally, we give some experimental results comparing a few of the algorithms discussed in this paper.
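The multi-label setting can be reduced to a single binary problem by treating each (example, class) pair as one binary example, with sign +1 when the class is among the example's labels and -1 otherwise; the boosting distribution then lives over these pairs. The helper below is a hypothetical sketch of that expansion only, not the paper's full method.

```python
import numpy as np

def expand_multilabel(n_examples, labels, n_classes):
    """Hypothetical helper: expand a multi-label problem into one binary
    problem over (example, class) pairs. labels[i] is the set of classes
    example i belongs to; the returned vector holds one +/-1 target per pair."""
    Y = -np.ones((n_examples, n_classes))
    for i, label_set in enumerate(labels):
        for l in label_set:
            Y[i, l] = 1.0
    # Flatten row-major: pair (i, l) maps to index i * n_classes + l.
    return Y.reshape(-1)
```

A single-label problem is the special case where each `labels[i]` is a singleton, so exactly one entry per row is +1.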