Towards convergence rate analysis of random forests for classification
References
Amaratunga, 2008, Enriched random forests, Bioinformatics, 24, 2010, 10.1093/bioinformatics/btn356
Amit, 1997, Shape quantization and recognition with randomized trees, Neural Comput., 9, 1545, 10.1162/neco.1997.9.7.1545
Arlot
Athey, 2019, Generalized random forests, Ann. Stat., 47, 1148, 10.1214/18-AOS1709
Audibert, 2007, Fast learning rates for plug-in classifiers, Ann. Stat., 35, 608, 10.1214/009053606000001217
Basu, 2018, Iterative random forests to discover predictive and stable high-order interactions, Proc. Natl. Acad. Sci., 115, 1943, 10.1073/pnas.1711236115
Belgiu, 2016, Random forest in remote sensing: a review of applications and future directions, ISPRS J. Photogramm. Remote Sens., 114, 24, 10.1016/j.isprsjprs.2016.01.011
Biau, 2012, Analysis of a random forests model, J. Mach. Learn. Res., 13, 1063
Biau, 2008, Consistency of random forests and other averaging classifiers, J. Mach. Learn. Res., 9, 2015
Biau, 2016, A random forest guided tour, Test, 25, 197, 10.1007/s11749-016-0481-7
Breiman, 2000
Breiman, 2001, Random forests, Mach. Learn., 45, 5, 10.1023/A:1010933404324
Breiman, 2004
Clémençon, 2013, Ranking forests, J. Mach. Learn. Res., 14, 39
Cover, 1967, Nearest neighbor pattern classification, IEEE Trans. Inf. Theory, 13, 21, 10.1109/TIT.1967.1053964
Criminisi, 2013
Criminisi, 2012, Decision forests: a unified framework for classification, regression, density estimation, manifold learning and semi-supervised learning, Found. Trends Comput. Graph. Vis., 7, 81
Cutler, 2007, Random forests for classification in ecology, Ecology, 88, 2783, 10.1890/07-0539.1
Denil, 2013, Consistency of online random forests, 1256
Denil, 2014, Narrowing the gap: random forests in theory and in practice, 665
Devroye, 1986, A note on the height of binary search trees, J. ACM, 33, 489, 10.1145/5925.5930
Devroye, 1996
Dietterich, 2000, An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization, Mach. Learn., 40, 139, 10.1023/A:1007607513941
Dinh, 2015, Learning from non-iid data: fast rates for the one-vs-all multiclass plug-in classifiers, 375
Fernández-Delgado, 2014, Do we need hundreds of classifiers to solve real world classification problems?, J. Mach. Learn. Res., 15, 3133
Gao, 2020, Towards convergence rate analysis of random forests for classification, 9300
Genuer, 2012, Variance reduction in purely random forests, J. Nonparametr. Stat., 24, 543, 10.1080/10485252.2012.677843
Genuer
Genuer, 2017, Random forests for big data, Big Data Res., 9, 28, 10.1016/j.bdr.2017.07.003
Geurts, 2006, Extremely randomized trees, Mach. Learn., 63, 3, 10.1007/s10994-006-6226-1
Goetz, 2018, Active learning for non-parametric regression using purely random trees, 2537
Györfi
Ho, 1998, The random subspace method for constructing decision forests, IEEE Trans. Pattern Anal. Mach. Intell., 20, 832, 10.1109/34.709601
Hoeffding, 1963, Probability inequalities for sums of bounded random variables, J. Am. Stat. Assoc., 58, 13, 10.1080/01621459.1963.10500830
Kazemitabar, 2017, Variable importance using decision trees, 426
Klusowski
Kontorovich, 2014, Maximum margin multiclass nearest neighbors, 892
Kwok, 1988, Multiple decision trees, 327
Lakshminarayanan, 2014, Mondrian forests: efficient online random forests, 3140
Li, 2019, A debiased MDI feature importance measure for random forests, 8047
Lin, 2006, Random forests and adaptive nearest neighbors, J. Am. Stat. Assoc., 101, 578, 10.1198/016214505000001230
Louppe, 2013, Understanding variable importances in forests of randomized trees, 431
Meinshausen, 2006, Quantile regression forests, J. Mach. Learn. Res., 7, 983
Menze, 2011, On oblique random forests, 453
Mitzenmacher, 2005
Mourtada, 2017, Universal consistency and minimax rates for online Mondrian forests, 3758
Puchkin, 2020, An adaptive multiclass nearest neighbor classifier, ESAIM Probab. Stat., 24, 69, 10.1051/ps/2019021
Qi, 2012, Random forest for bioinformatics, 307
Reed, 2003, The height of a random binary search tree, J. ACM, 50, 306, 10.1145/765568.765571
Robnik-Šikonja, 2004, Improving random forests, 359
Rodríguez, 2006, Rotation forest: a new classifier ensemble method, IEEE Trans. Pattern Anal. Mach. Intell., 28, 1619, 10.1109/TPAMI.2006.211
Scornet, 2016, On the asymptotics of random forests, J. Multivar. Anal., 146, 72, 10.1016/j.jmva.2015.06.009
Scornet, 2015, Consistency of random forests, Ann. Stat., 43, 1716, 10.1214/15-AOS1321
Shalev-Shwartz, 2014
Shotton, 2013, Real-time human pose recognition in parts from single depth images, Commun. ACM, 56, 116, 10.1145/2398356.2398381
Svetnik, 2003, Random forest: a classification and regression tool for compound classification and QSAR modeling, J. Chem. Inf. Comput. Sci., 43, 1947, 10.1021/ci034160g
Taddy, 2011, Dynamic trees for learning and design, J. Am. Stat. Assoc., 106, 109, 10.1198/jasa.2011.ap09769
Tang, 2018, When do random forests fail?, 2983
Wager, 2018, Estimation and inference of heterogeneous treatment effects using random forests, J. Am. Stat. Assoc., 113, 1228, 10.1080/01621459.2017.1319839
Wager, 2014, Confidence intervals for random forests: the jackknife and the infinitesimal jackknife, J. Mach. Learn. Res., 15, 1625
Wang, 2016, Bernoulli random forests: closing the gap between theoretical consistency and empirical soundness, 2167
Wang, 2017, A novel consistent random forest framework: Bernoulli random forests, IEEE Trans. Neural Netw. Learn. Syst., 29, 3510
Yang, 2019, On the robust splitting criterion of random forest, 1420
Yang, 1999, Minimax nonparametric classification - part I: rates of convergence, IEEE Trans. Inf. Theory, 45, 2271, 10.1109/18.796368
Zhou, 2017, Deep forest: towards an alternative to deep neural networks, 3553
Zhou, 2019, Deep forest, Natl. Sci. Rev., 6, 74, 10.1093/nsr/nwy108