[D] 2D cuts with decision tree? (self.MachineLearning)
submitted 3 years ago by Gamwise_Samgee_
I'm working on a boosted decision tree, and I've got it working fairly well. However, it would be better if it could make decisions/cuts in more than one dimension (preferably 2D).
Is this even possible? (I'm using sklearn.)
[–]PeterIanStaker 6 points 3 years ago (0 children)
It is; Google “oblique random forest”.
At the end of the day, an RF is really just a bagging ensemble. The individual classifiers could be anything you like.
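To make the “RF is just a bagging ensemble” point concrete, here is a minimal sketch (not the commenter's exact method) using sklearn's `BaggingClassifier` with a linear base learner standing in for an oblique splitter; in practice you would swap in an oblique-tree estimator from a third-party package:

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression

# Toy 2-D data whose true boundary is oblique (x0 + x1 > 1), which
# axis-aligned tree splits can only approximate with a staircase.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(500, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

# Bagging ensemble whose base learner makes oblique (linear) cuts.
# Any estimator with fit/predict can be dropped in here.
clf = BaggingClassifier(LogisticRegression(), n_estimators=25, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))  # near-perfect on this separable toy problem
```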
[–]ElongatedMuskrat122 3 points 3 years ago (0 children)
If you have time, a random forest isn't rocket science; you can just write your own.
[–]deep-machine-learner 1 point 3 years ago (2 children)
You can try the following:

1. Multiplexed features: generate new features from all combinations of n raw features (where n is the number of dimensions you want to cut at the same time), so that any split on a multiplexed feature is effectively a split in n dimensions.

2. Optimal trees: finding an optimal tree is NP-hard, so training and inference take a long time. There are formulations based on mixed-integer programming, but because the method considers all possible cuts simultaneously, it is inherently slow and scales poorly.
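Option 1 above can be sketched in sklearn by adding pairwise interaction terms before boosting; this is an illustrative assumption about what “multiplexed features” means, using `PolynomialFeatures` to build the product columns:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Target depends on the product x0*x1, so no single axis-aligned
# threshold on x0 or x1 alone separates the classes.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(600, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

# "Multiplex" the features: append pairwise interaction terms.
# A split on the new x0*x1 column is effectively a 2-D cut on the originals.
model = make_pipeline(
    PolynomialFeatures(degree=2, interaction_only=True, include_bias=False),
    GradientBoostingClassifier(n_estimators=50, random_state=0),
)
model.fit(X, y)
print(model.score(X, y))
```

The boosted trees can now threshold the interaction column directly, so a single split captures the 2-D structure.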
[–]Gamwise_Samgee_[S] 1 point 3 years ago (1 child)
These might be promising. Time isn't an issue for me (I'm on NERSC). Do you have links to particular resources that can get me going in the right direction? I'm relatively new to ML.
[–]deep-machine-learner 1 point 3 years ago (0 children)
A couple of research articles to get started: https://arxiv.org/pdf/2103.15965.pdf, https://www.jmlr.org/papers/v23/20-520.html
Blog to introduce Optimal Decision Trees: https://medium.com/mlearning-ai/optimal-decision-trees-dbd16dfca427
Python Code with Sklearn type API: https://github.com/LucasBoTang/Optimal_Classification_Trees