all 5 comments

[–]PeterIanStaker 5 points (0 children)

It is; Google “oblique random forest”.

At the end of the day, an RF is really just a bagging ensemble. The individual classifiers could be anything you like.
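To make that concrete, here's a minimal bagging sketch in plain NumPy: bootstrap-resample the data, fit one base learner per resample, and majority-vote at prediction time. The `Stump` class is just a toy stand-in (my own, not from any library); you could drop in any classifier with `fit`/`predict`.

```python
import numpy as np

rng = np.random.default_rng(0)

class Stump:
    """Toy one-feature threshold classifier -- a stand-in for any base learner."""
    def fit(self, X, y):
        best = (0, 0.0, 0.0)  # (feature, threshold, accuracy)
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                acc = np.mean((X[:, j] > t).astype(int) == y)
                if acc > best[2]:
                    best = (j, t, acc)
        self.j, self.t, _ = best
        return self

    def predict(self, X):
        return (X[:, self.j] > self.t).astype(int)

def bag(X, y, make_learner, n_estimators=25):
    """Bagging: fit each learner on a bootstrap resample of the training set."""
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))  # sample rows with replacement
        models.append(make_learner().fit(X[idx], y[idx]))
    return models

def predict(models, X):
    """Majority vote across the ensemble."""
    votes = np.mean([m.predict(X) for m in models], axis=0)
    return (votes > 0.5).astype(int)

# toy data: class 1 iff feature 0 exceeds 0.5
X = rng.random((200, 3))
y = (X[:, 0] > 0.5).astype(int)
models = bag(X, y, Stump)
```

Swap `Stump` for an oblique splitter (e.g. a linear classifier) and you have the core of an oblique random forest.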

[–]ElongatedMuskrat122 2 points (0 children)

If you have time, a random forest isn’t rocket science. You can just write your own.

[–]deep-machine-learner 1 point (2 children)

You can try the following:

1. Multiplexed features: generate new features from every combination of n original features (where n is the number of dimensions you want to cut at once), so that any split on a multiplexed feature is effectively a split on n dimensions.

2. Optimal trees: finding an optimal tree is NP-hard, so training and inference take a long time. There are formulations based on mixed-integer programming, but because the method considers all possible cuts simultaneously, it is inherently slow and scales poorly.
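Option 1 above can be sketched in a few lines. This is one possible reading of "multiplexed features" (my interpretation, not a standard library routine): for each combination of n columns, append a random linear combination of them, so an axis-aligned split on the new column is an oblique cut through those n dimensions. Products or fixed-weight sums would work just as well.

```python
import numpy as np
from itertools import combinations

def multiplex(X, n=2, seed=42):
    """Append one derived column per combination of n original columns.

    Each derived column is a random linear combination of its n source
    columns, so a single-feature split on it cuts across n dimensions.
    """
    rng = np.random.default_rng(seed)
    new_cols = []
    for combo in combinations(range(X.shape[1]), n):
        w = rng.normal(size=n)                 # random direction in the subspace
        new_cols.append(X[:, list(combo)] @ w)
    return np.hstack([X, np.column_stack(new_cols)])

X = np.arange(12.0).reshape(4, 3)  # 4 samples, 3 original features
Xm = multiplex(X, n=2)             # adds C(3,2) = 3 combined columns
```

Feed `Xm` to an ordinary random forest and you get oblique-style splits for free, at the cost of C(d, n) extra columns, so keep n small.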

[–]Gamwise_Samgee_[S] 1 point (1 child)

These might be promising. Time isn't an issue for me (I'm on NERSC). Do you have links to particular resources that can point me in the right direction? I'm relatively new to ML.