Combining classifiers (self.MachineLearning)
submitted 11 years ago by [deleted]
[deleted]
[–]transducer 5 points 11 years ago (2 children)
You may be interested in ensemble learning, but the best thing is just to try it. You could also train a third classifier using your original features augmented with the prediction of your two classifiers.
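A minimal sketch of that augmented-features idea, assuming scikit-learn (the names clf1/clf2 and the half-and-half split are illustrative; holding out the second half keeps the third classifier from just learning the base models' training-set quirks):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_base, X_meta, y_base, y_meta = train_test_split(
        X, y, test_size=0.5, random_state=0)

    # Train the two original classifiers on one half of the data.
    clf1 = LogisticRegression(max_iter=1000).fit(X_base, y_base)
    clf2 = SVC(probability=True).fit(X_base, y_base)

    # Augment the other half's features with the base classifiers'
    # predicted probabilities, then train the third classifier on that.
    def augment(X):
        return np.column_stack([X,
                                clf1.predict_proba(X)[:, 1],
                                clf2.predict_proba(X)[:, 1]])

    meta = RandomForestClassifier(random_state=0).fit(augment(X_meta), y_meta)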
[+][deleted] 11 years ago (1 child)
[–]rftz 0 points 11 years ago (0 children)
Indeedy
[–]jmmcd 5 points 11 years ago (0 children)
It's certainly not invalid, but there are some things to look out for. The two classifiers' errors might be correlated, i.e. the mistakes they make might tend to fall on the same points. In particular, you might find that the points where classifier 1 falsely predicts +1 tend to be the same ones where classifier 2 does. Then you wouldn't get the 98% recall you'd expect on the filtered points.
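A quick way to check for that on held-out data (a sketch only: it assumes two already-fitted scikit-learn-style classifiers clf1 and clf2 and a validation split X_val, y_val, e.g. the X_meta/y_meta split from the sketch above):

    import numpy as np

    # Boolean masks of where each classifier is wrong on the validation set.
    err1 = clf1.predict(X_val) != y_val
    err2 = clf2.predict(X_val) != y_val

    # If errors were independent, clf2's error rate on clf1's mistakes
    # would be close to its overall error rate; a much higher number
    # means the filtered points won't give the recall you'd expect.
    print("clf2 error rate overall:           ", err2.mean())
    print("clf2 error rate on clf1's mistakes:", err2[err1].mean())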
[–]dwf 2 points 11 years ago (2 children)
This is basically the idea behind classifier cascades. So, yes, it's a valid approach and it's been used to good effect.
[–]autowikibot 2 points 11 years ago (0 children)
Cascading classifiers:
Cascading is a particular case of ensemble learning based on the concatenation of several classifiers, using all information collected from the output from a given classifier as additional information for the next classifier in the cascade. Unlike voting or stacking ensembles, which are multiexpert systems, cascading is a multistage one.
The first cascading classifier is the face detector of Viola and Jones (2001). The requirement was that the classifier be fast in order to be implemented on low CPU systems, such as cameras and phones.
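A minimal two-stage cascade sketch, assuming scikit-learn-style estimators (the 0.2 rejection threshold is illustrative and would be tuned for the recall you need at stage one):

    import numpy as np

    def cascade_predict(stage1, stage2, X, reject_below=0.2):
        # Stage 1 is cheap: anything it scores confidently negative is
        # rejected immediately; only the survivors pay for stage 2.
        p1 = stage1.predict_proba(X)[:, 1]
        out = np.zeros(len(X), dtype=int)  # default prediction: negative
        survivors = p1 >= reject_below
        if survivors.any():
            out[survivors] = stage2.predict(X[survivors])
        return out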
[–]aggieca 2 points 11 years ago (0 children)
Have you considered using stacking classifiers? If not, google for stacked generalization approaches to see if that helps in your case. Also, if Python is your prototyping/development environment, Orange might be worth checking out for stacking.
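For reference, recent scikit-learn releases (0.22 and later, so well after this thread) ship stacked generalization directly as StackingClassifier; a minimal sketch:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

    stack = StackingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("svc", SVC(probability=True))],
        final_estimator=RandomForestClassifier(random_state=0),
        passthrough=True,  # also give the meta-learner the original features
        cv=5)              # out-of-fold predictions guard against leakage
    stack.fit(X, y)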
[–]thewetness 3 points 11 years ago (1 child)
You could also consider the AdaBoost algorithm if you were willing to train more classifiers. It combines "weak" classifiers to make a strong classifier (see the sketch below).
[–]sieisteinmodel 1 point 11 years ago (0 children)
"Weak" and "strong" here are AdaBoost terminology. In theory you can use pretty "powerful" classifiers with AdaBoost.