17. Learning: Boosting
“Wisdom of a weighted crowd of experts”
Classifiers are tests that produce binary decisions about samples. A classifier is strong if its error rate is close to 0, and weak if its error rate is only slightly better than chance, i.e. just below 0.5.
Boosting combines multiple weak classifiers into a weighted vote: each classifier's decision is scaled by a weight reflecting its reliability, and the ensemble takes the sign of the weighted sum. This "crowd of experts" can classify samples far more accurately than any individual member.
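As a minimal sketch of the idea, a strong classifier can be built as a weighted vote of weak ones, each returning +1 or -1. The function and classifier names below are illustrative, not from the lecture:

```python
def weighted_vote(classifiers, alphas, x):
    """Return +1 or -1 from the weighted majority of weak classifiers."""
    total = sum(a * h(x) for h, a in zip(classifiers, alphas))
    return 1 if total >= 0 else -1

# Three toy weak classifiers on a single number (purely illustrative).
h1 = lambda x: 1 if x > 0 else -1
h2 = lambda x: 1 if x > 2 else -1
h3 = lambda x: 1 if x < 5 else -1

print(weighted_vote([h1, h2, h3], [0.5, 0.3, 0.2], 1))  # h1 and h3 outvote h2 -> 1
```

Here h2 votes -1 on the sample x = 1, but the combined weight of h1 and h3 (0.7) overrides it.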
Decision tree stumps
A decision tree stump is a single-test classifier: one straight cut that divides the 2-dimensional sample space into a positive and a negative region. To emphasize some samples over others, each sample is assigned a weight; the weights must always sum to 1 so that they form a proper probability distribution over the samples.
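A stump can be scored by the weighted error rate it incurs on the samples. A hedged sketch, with illustrative names (`stump_error`, the threshold-and-sign encoding of a stump) that are assumptions rather than lecture notation:

```python
def stump_error(samples, labels, weights, dim, threshold, sign):
    """Weighted error of the stump: predict `sign` when x[dim] > threshold."""
    error = 0.0
    for x, y, w in zip(samples, labels, weights):
        pred = sign if x[dim] > threshold else -sign
        if pred != y:
            error += w  # each mistake costs that sample's weight
    return error

samples = [(1, 1), (2, 1), (3, 3), (4, 3)]
labels  = [1, 1, -1, -1]
weights = [0.25, 0.25, 0.25, 0.25]  # uniform distribution summing to 1

# A vertical cut at x = 2.5, predicting -1 on the right, classifies everything.
print(stump_error(samples, labels, weights, 0, 2.5, -1))  # -> 0.0
```

With uniform weights this is just the ordinary error rate; once some samples carry more weight, mistakes on them cost proportionally more.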
Dividing the space
At each step, the algorithm picks the stump whose weighted error rate is lowest, then increases the weights of the misclassified samples so that the next stump concentrates on them. The accumulated cuts progressively divide the space so that positive and negative examples are separated.
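The lecture describes this loop informally; the sketch below is one standard way to realize it (AdaBoost-style, with an exhaustive search over candidate thresholds). All names and the candidate-threshold list are assumptions for illustration:

```python
import math

def best_stump(samples, labels, weights, thresholds):
    """Exhaustively pick the stump (dim, threshold, sign) with lowest weighted error."""
    best = None
    for dim in range(len(samples[0])):
        for t in thresholds:
            for sign in (1, -1):
                err = sum(w for x, y, w in zip(samples, labels, weights)
                          if (sign if x[dim] > t else -sign) != y)
                if best is None or err < best[0]:
                    best = (err, dim, t, sign)
    return best

def adaboost(samples, labels, rounds, thresholds):
    n = len(samples)
    weights = [1.0 / n] * n  # start from a uniform distribution summing to 1
    ensemble = []
    for _ in range(rounds):
        err, dim, t, sign = best_stump(samples, labels, weights, thresholds)
        err = min(max(err, 1e-10), 1 - 1e-10)  # guard the logarithm below
        alpha = 0.5 * math.log((1 - err) / err)  # low error -> large vote
        ensemble.append((alpha, dim, t, sign))
        # Re-weight: misclassified samples gain weight, then renormalize to 1.
        new_w = [w * math.exp(-alpha * y * (sign if x[dim] > t else -sign))
                 for x, y, w in zip(samples, labels, weights)]
        total = sum(new_w)
        weights = [w / total for w in new_w]
    return ensemble

def predict(ensemble, x):
    s = sum(a * (sign if x[dim] > t else -sign) for a, dim, t, sign in ensemble)
    return 1 if s >= 0 else -1

# A 1-D pattern no single stump can classify: +, -, + along the axis.
samples = [(1,), (2,), (3,)]
labels  = [1, -1, 1]
ensemble = adaboost(samples, labels, rounds=3, thresholds=[0.5, 1.5, 2.5, 3.5])
print([predict(ensemble, x) for x in samples])  # -> [1, -1, 1]
```

Three rounds suffice here: each round's stump corrects the samples the previous weighting emphasized, and the weighted vote of the three cuts classifies a pattern that no individual stump could.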
No overfitting
Boosting algorithms do not seem to overfit in practice: the decision tree stumps tend to draw their boundaries very tightly around outlying samples, excluding them from a region without distorting the rest of the space.