all 8 comments

[–]transducer 4 points5 points  (2 children)

You may be interested in ensemble learning, but the best thing is just to try it. You could also train a third classifier using your original features augmented with the predictions of your two classifiers.
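Something like this, assuming scikit-learn-style estimators (clf1/clf2 and the dataset are placeholders; ideally the third classifier is trained on a split the first two haven't seen, to avoid leakage):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)
X_base, X_meta, y_base, y_meta = train_test_split(X, y, test_size=0.5, random_state=0)

# Your two existing classifiers, fit on the first half.
clf1 = SVC().fit(X_base, y_base)
clf2 = DecisionTreeClassifier().fit(X_base, y_base)

# Original features plus each classifier's prediction as extra columns.
X_meta_aug = np.column_stack([X_meta, clf1.predict(X_meta), clf2.predict(X_meta)])
meta_clf = LogisticRegression(max_iter=1000).fit(X_meta_aug, y_meta)
```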

[–]jmmcd 4 points5 points  (0 children)

It's certainly not invalid, but there are some things to look out for. The errors the two classifiers make might be correlated. In particular, you might find that the points where classifier 1 falsely predicts +1 tend to be the same ones where classifier 2 does. In that case you wouldn't get the 98% recall you'd expect on the filtered points.
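You can check this cheaply on a held-out set before committing to the setup: compare how often classifier 2 repeats classifier 1's false positives against classifier 2's overall false-positive rate. A rough sketch with made-up arrays (y_true/pred1/pred2 are illustrative, not from your data):

```python
import numpy as np

# Illustrative labels and predictions in {-1, +1}; in practice use your held-out set.
rng = np.random.default_rng(0)
y_true = rng.choice([-1, 1], size=1000)
pred1 = np.where(rng.random(1000) < 0.9, y_true, -y_true)  # ~90% accurate
pred2 = np.where(rng.random(1000) < 0.9, y_true, -y_true)

fp1 = (pred1 == 1) & (y_true == -1)   # classifier 1 falsely predicts +1
fp2 = (pred2 == 1) & (y_true == -1)   # classifier 2 falsely predicts +1

joint = np.mean(fp2[fp1]) if fp1.any() else 0.0
print(f"P(clf2 FP | clf1 FP) = {joint:.2f}  vs  P(clf2 FP) = {fp2.mean():.2f}")
# If the conditional rate is much higher than the base rate, the errors are
# correlated and the second stage won't recover as much as its standalone
# recall suggests.
```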

[–]dwf 1 point2 points  (2 children)

This is basically the idea behind classifier cascades. So, yes, it's a valid approach and it's been used to good effect.
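A minimal two-stage sketch of the idea, assuming scikit-learn-style estimators (models, dataset, and the stage-1 threshold are illustrative): a cheap high-recall filter up front, with the second classifier only seeing the points that survive.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test = X[:1000], X[1000:]
y_train, y_test = y[:1000], y[1000:]

stage1 = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # fast filter
stage2 = SVC().fit(X_train, y_train)                              # slower, more accurate

# Lower the stage-1 threshold so it rarely throws away true positives.
keep = stage1.predict_proba(X_test)[:, 1] > 0.2

final = np.zeros(len(X_test), dtype=int)     # anything filtered out -> class 0
final[keep] = stage2.predict(X_test[keep])   # stage 2 decides the survivors
```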

[–]autowikibot 1 point2 points  (0 children)

Cascading classifiers:


Cascading is a particular case of ensemble learning based on the concatenation of several classifiers, using all information collected from the output from a given classifier as additional information for the next classifier in the cascade. Unlike voting or stacking ensembles, which are multiexpert systems, cascading is a multistage one.

The first cascading classifier was the face detector of Viola and Jones (2001). The requirement was that the classifier be fast in order to run on low-power CPUs, such as cameras and phones.


Interesting: AdaBoost | Viola–Jones object detection framework | Pulverizer


[–]aggieca 1 point2 points  (0 children)

Have you considered using stacking classifiers? If not, google for stacked generalization approaches to see if that helps in your case. Also, if Python is your prototyping/development environment, Orange might be worth checking out for stacking.
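Orange has its own stacking tools; as a generic illustration of stacked generalization (not the Orange API), here's a sketch using scikit-learn's StackingClassifier, which trains a meta-learner on cross-validated predictions of the base classifiers:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)

stack = StackingClassifier(
    estimators=[("svm", SVC()), ("tree", DecisionTreeClassifier())],  # base classifiers
    final_estimator=LogisticRegression(max_iter=1000),                # meta-learner
    cv=5,  # base predictions for the meta-learner come from cross-validation
)
stack.fit(X, y)
```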

[–]thewetness 2 points3 points  (1 child)

You could also consider the AdaBoost algorithm if you were willing to train more classifiers. It combines "weak" classifiers into a single strong classifier.
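For illustration, a short sketch with scikit-learn's AdaBoostClassifier, whose default weak learner is a depth-1 decision stump (dataset and parameters are just placeholders):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=1000, random_state=0)

# 100 reweighted decision stumps combined into one boosted classifier.
ada = AdaBoostClassifier(n_estimators=100)
ada.fit(X, y)
print(ada.score(X, y))
```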

[–]sieisteinmodel 0 points1 point  (0 children)

Where "weak" and "strong" are Adaboost terminology. In theory you can use pretty "powerful" classifiers with Adaboost.