- very close: [[bernard_DynamicRandomForests_2012]] but naive, ad-hoc, no theoretical foundation
- update weights
- [[baumann_SequentialBoostingLearning_2015]] adapt number of training samples
- [[bernard_DynamicRandomForests_2012]], [[xu_ImplementationPerformanceOptimization_2017]] adapt weights based on performance of previous stage, based on number of trees (mis)classifying
- [[adnan_ImprovingRandomForest_2016]] increase the weights of hard-to-classify records in the training data set, then build the Random Forest from the weighted training set
- [[akash_IntroducingConfidenceWeight_2019]] infer confidence value (based on split node impurities) from tree and use that confidence score for weighted majority vote
- this is interesting since it sort of judges the greedy-optimisation path that the tree construction took -- higher confidence when the splits achieved better impurity reduction during tree growing?
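A minimal sketch of the confidence-weighted majority vote idea from [[akash_IntroducingConfidenceWeight_2019]]. The confidence formula used here -- each tree's accumulated weighted impurity decrease over its splits -- is my own stand-in for the paper's exact score, chosen because it reflects the quality of the greedy split path:

```python
# Confidence-weighted majority vote over a sklearn forest.
# The per-tree confidence (mean impurity decrease) is an assumed proxy,
# not the exact score from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, random_state=0)
rf = RandomForestClassifier(n_estimators=25, random_state=0).fit(X, y)

def tree_confidence(est):
    # total weighted impurity decrease over all split nodes, normalised by root size
    t = est.tree_
    is_split = t.children_left != -1
    n = t.weighted_n_node_samples
    dec = (
        n[is_split] * t.impurity[is_split]
        - n[t.children_left[is_split]] * t.impurity[t.children_left[is_split]]
        - n[t.children_right[is_split]] * t.impurity[t.children_right[is_split]]
    )
    return dec.sum() / n[0]

w = np.array([tree_confidence(est) for est in rf.estimators_])
votes = np.stack([est.predict(X) for est in rf.estimators_])  # (n_trees, n_samples)
# weighted vote, binary case: compare class-1 vs class-0 weight mass
score1 = (w[:, None] * (votes == 1)).sum(axis=0)
score0 = (w[:, None] * (votes == 0)).sum(axis=0)
pred = (score1 > score0).astype(int)
```

With uniform `w` this degenerates to the plain majority vote, which makes the weighting easy to ablate.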
- train trees on data subsets / select attributes adaptively / random subspace
- [[kulkarni_EfficientLearningRandom_2013]] train trees on disjoint partitions etc
- [[adnan_ComplementRandomForest_2015]] random subspacing, build pairs of trees from mutually exclusive subset of (extended) attributes
- [[adnan_EffectsDynamicSubspacing_2017]] the number of attributes in $\boldsymbol{f}$ is dynamically increased as the number of records in the current data segment decreases
- [[panhalkar_NovelApproachBuild_2022]] and references
- [[akhand_DecisionTreeEnsemble_2014]] combines artificially generated patterns with the random subspace method
- [[melville_CreatingDiversityEnsembles_2005|DECORATE]]
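The common core of the subspacing approaches above ([[kulkarni_EfficientLearningRandom_2013]], [[adnan_ComplementRandomForest_2015]]) can be sketched as: each tree only ever sees its own attribute block. The round-robin partitioning below is an assumed placeholder for the papers' specific partitioning schemes:

```python
# Trees trained on disjoint attribute subsets, combined by majority vote.
# The plain random 4-way partition is illustrative, not any paper's exact scheme.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=12, n_informative=8, random_state=0)
rng = np.random.default_rng(0)
blocks = np.array_split(rng.permutation(X.shape[1]), 4)  # 4 disjoint attribute subsets

# one tree per block, fitted only on that block's columns
trees = [(b, DecisionTreeClassifier(random_state=0).fit(X[:, b], y)) for b in blocks]

def predict(Xq):
    votes = np.stack([t.predict(Xq[:, b]) for b, t in trees])
    return (votes.mean(axis=0) > 0.5).astype(int)

pred = predict(X)
```

Disjointness forces diversity by construction, in contrast to the i.i.d. feature sampling of a standard Random Forest.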
- select trees
- [[zouggar_SimplifyingRandomForests_2019]] SFS and friends with different ad-hoc criteria
- [[adnan_OptimizingNumberTrees_2016]] genetic algorithm, uses different diversity measure though
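The generic SFS scheme that [[zouggar_SimplifyingRandomForests_2019]] instantiates with its various criteria looks roughly like this; plain validation accuracy is used below as a stand-in criterion:

```python
# Sequential forward selection of trees: greedily add the tree that most
# improves the ensemble criterion, stop when nothing improves.
# Validation accuracy is an assumed stand-in for the paper's criteria.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, random_state=1)
Xtr, Xval, ytr, yval = train_test_split(X, y, random_state=1)
rf = RandomForestClassifier(n_estimators=20, random_state=1).fit(Xtr, ytr)

def vote_acc(trees):
    votes = np.stack([t.predict(Xval) for t in trees])
    return float(((votes.mean(axis=0) > 0.5).astype(int) == yval).mean())

selected, remaining, best = [], list(rf.estimators_), 0.0
while remaining:
    accs = [vote_acc(selected + [t]) for t in remaining]
    i = int(np.argmax(accs))
    if selected and accs[i] <= best:
        break  # no candidate improves the criterion: stop
    best = max(best, accs[i])
    selected.append(remaining.pop(i))
```

Swapping `vote_acc` for a diversity measure gives the genetic-algorithm variant's fitness a greedy counterpart.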
- new split criteria
- [[kulkarni_WeightedHybridDecision_2016]] each decision tree in the Random Forest is generated using a different split measure (also theoretical analysis and comparison of split criteria)
- split not only in region [[panhalkar_NovelApproachBuild_2022]]
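The core move in [[kulkarni_WeightedHybridDecision_2016]] -- the same best-split search parameterised by the impurity function -- reduces to a few lines. Only a single-feature, binary-class search is sketched here:

```python
# Best-split search with a pluggable impurity measure (Gini vs. entropy).
# Single feature, binary labels -- a toy illustration, not a full tree builder.
import numpy as np

def gini(y):
    p = np.bincount(y, minlength=2) / len(y)
    return 1.0 - (p ** 2).sum()

def entropy(y):
    p = np.bincount(y, minlength=2) / len(y)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def best_split(x, y, impurity):
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_thr, best_gain = None, 0.0
    for i in range(1, len(y)):
        if x[i] == x[i - 1]:
            continue  # no threshold between equal values
        left, right = y[:i], y[i:]
        gain = impurity(y) - (len(left) * impurity(left) + len(right) * impurity(right)) / len(y)
        if gain > best_gain:
            best_gain, best_thr = gain, (x[i] + x[i - 1]) / 2
    return best_thr, best_gain

x = np.array([0.1, 0.2, 0.3, 0.8, 0.9, 1.0])
y = np.array([0, 0, 0, 1, 1, 1])
thr_g, _ = best_split(x, y, gini)
thr_e, _ = best_split(x, y, entropy)
```

On this clean toy data both measures pick the same threshold (0.55); the measures only diverge on noisier class distributions.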
- different combiner (not really related)
- [[baumann_ThresholdingRandomForest_2014]] each leaf node has an individual weight; the final decision is not determined by majority voting but by a linear combination of the individual leaf weights, giving a better and more robust decision -- also ad-hoc
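A sketch of the per-leaf-weight combiner: each leaf's weight here is its training-set class-1 fraction -- an assumed stand-in for the paper's learned weights -- and the decision thresholds a linear combination of the weights of the leaves a sample lands in:

```python
# Replace majority voting with a thresholded linear combination of leaf weights.
# Leaf weight = class-1 fraction of training samples reaching that leaf
# (an assumption; the paper learns these weights).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, random_state=2)
rf = RandomForestClassifier(n_estimators=15, random_state=2).fit(X, y)

leaf_w = []
for est in rf.estimators_:
    leaves = est.apply(X)  # leaf index reached by each training sample
    w = np.zeros(est.tree_.node_count)
    for leaf in np.unique(leaves):
        w[leaf] = y[leaves == leaf].mean()
    leaf_w.append(w)

def decide(Xq, thr=0.5):
    # average the weights of the leaves each query sample falls into
    scores = np.stack([w[est.apply(Xq)] for est, w in zip(rf.estimators_, leaf_w)])
    return (scores.mean(axis=0) > thr).astype(int)

pred = decide(X)
```

Making `thr` tunable is exactly what gives the "thresholding" its robustness lever compared to a fixed majority vote.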