Integrating complementary techniques for promoting diversity in classifier ensembles: A systematic study
References
Kuncheva, 2004, Combining Pattern Classifiers: Methods and Algorithms, Wiley
Polikar, 2006, Ensemble based systems in decision making, IEEE Circuits Syst. Mag., 6, 21, 10.1109/MCAS.2006.1688199
G. Brown, L.I. Kuncheva, “Good” and “bad” diversity in majority vote ensembles, in: N. Gayar, J. Kittler, F. Roli (Eds.), Multiple Classifier Systems—MCS 2010, Lecture Notes in Computer Science, vol. 5997, Springer, Berlin, 2010, pp. 124–133.
Kuncheva, 2003, Measures of diversity in classifier ensembles, Mach. Learn., 51, 181, 10.1023/A:1022859003006
Schapire, 2012, Boosting: Foundations and Algorithms, MIT Press
M. Re, G. Valentini, Ensemble methods: a review, in: Advances in Machine Learning and Data Mining for Astronomy, 2012, pp. 563–582.
Ho, 1998, The random subspace method for constructing decision forests, IEEE Trans. Pattern Anal. Mach. Intell., 20, 832, 10.1109/34.709601
Achlioptas, 2003, Database-friendly random projections, J. Comput. Syst. Sci., 66, 671, 10.1016/S0022-0000(03)00025-4
A. Bertoni, G. Valentini, Discovering significant structures in clustered bio-molecular data through the Bernstein inequality, in: Proceedings of the 11th International Conference, KES 2007 and XVII Italian Workshop on Neural Networks Conference on Knowledge-Based Intelligent Information and Engineering Systems: Part III, Springer-Verlag, Berlin, Heidelberg, 2007, pp. 886–891.
Q.L. Zhao, Y.J. Huang, M. Xu, Incremental learning by heterogeneous bagging ensemble, in: Proceedings of the 6th International Conference on Advanced Data Mining and Applications, ADMA 10, vol. II, Springer-Verlag, Berlin, Heidelberg, 2010, pp. 1–12.
Q.-L. Zhao, Y.-H. Jiang, M. Xu, Incremental learning based on ensemble pruning, in: 8th International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), vol. 1, 2011, pp. 377–381.
Brown, 2005, Diversity creation methods, Inf. Fusion, 6, 5, 10.1016/j.inffus.2004.04.004
Breiman, 2001, Random forests, Mach. Learn., 45, 5, 10.1023/A:1010933404324
Dietterich, 2000, An experimental comparison of three methods for constructing ensembles of decision trees, Mach. Learn., 40, 139, 10.1023/A:1007607513941
T.G. Dietterich, Ensemble methods in machine learning, in: Proceedings of the 1st International Workshop on Multiple Classifier Systems, Springer, London, UK, 2000, pp. 1–15.
Canuto, 2007, Investigating the influence of the choice of the ensemble members in accuracy and diversity of selection-based and fusion-based methods for ensembles, Pattern Recognit. Lett., 28, 472, 10.1016/j.patrec.2006.09.001
L.E.A. Santana, L. Silva, A.M.P. Canuto, F. Pintro, K.O. Vale, A comparative analysis of genetic algorithm and ant colony optimization to select attributes for an heterogeneous ensemble of classifiers, in: IEEE Congress on Evolutionary Computation (CEC), 2010, pp. 1–8.
Coelho, 2010, On the evolutionary design of heterogeneous bagging models, Neurocomputing, 73, 3319, 10.1016/j.neucom.2010.07.008
Nascimento, 2009, Ensembling heterogeneous learning models with boosting, Lecture Notes in Computer Science, vol. 5863, 512
Breiman, 1996, Bagging predictors, Mach. Learn., 24, 123, 10.1007/BF00058655
D.S.C. Nascimento, A.M.P. Canuto, L.M.M. Silva, A.L.V. Coelho, Combining different ways to generate diversity in bagging models: an evolutionary approach, in: International Joint Conference on Neural Networks (IJCNN), 2011, pp. 2235–2242.
Guyon, 2006, Feature Extraction: Foundations and Applications, Springer
Eiben, 2007, Introduction to Evolutionary Computing, Springer
Chen, 2002
Witten, 2005, Data Mining: Practical Machine Learning Tools and Techniques, Morgan Kaufmann
Geman, 1992, Neural networks and the bias/variance dilemma, Neural Comput., 4, 1, 10.1162/neco.1992.4.1.1
Opitz, 1999, Popular ensemble methods, J. Artif. Intell. Res., 11, 169, 10.1613/jair.614
A. Asunción, D.J. Newman, UCI Machine Learning Repository, University of California at Irvine, 〈http://ics.uci.edu/~mlearn/MLRepository.html〉, 2007.
Kotsiantis, 2004, Combining bagging and boosting, Int. J. Comput. Intell., 1, 324
Demšar, 2006, Statistical comparisons of classifiers over multiple data sets, J. Mach. Learn. Res., 7, 1
Webb, 2000, MultiBoosting, Mach. Learn., 40, 159, 10.1023/A:1007659514849
Brazdil, 2009, Metalearning: Applications to Data Mining, Springer