Ensemble feature selection with the simple Bayesian classification
References
Asker, 1997, Ensembles as a sequence of classifiers, 860
Bauer, 1999, An empirical comparison of voting classification algorithms: bagging, boosting, and variants, Machine Learning, 36, 105, 10.1023/A:1007515423169
C.L. Blake, C.J. Merz, UCI repository of machine learning databases. Available from <http://www.ics.uci.edu/~mlearn/MLRepository.html>, University of California, Irvine CA, 1998
C. Brodley, T. Lane, Creating and exploiting coverage and diversity, in: Proc. AAAI-96 Workshop on Integrating Multiple Learned Models, 1996, pp. 8–14
Cordella, 1999, Reliability parameters to improve combination strategies in multi-expert systems, Pattern Analysis and Applications, 2, 205, 10.1007/s100440050029
Cost, 1993, A weighted nearest neighbor algorithm for learning with symbolic features, Machine Learning, 10, 57, 10.1007/BF00993481
Cunningham, 2000, Diversity versus quality in classification ensembles based on feature selection, vol. 1810, 109
Dietterich, 2001, Ensemble learning methods
Dietterich, 1997, Machine learning research: four current directions, AI Magazine, 18, 97
Domingos, 1996, Beyond independence: conditions for the optimality of the simple Bayesian classifier, 105
Domingos, 1997, On the optimality of the simple Bayesian classifier under zero-one loss, Machine Learning, 29, 103, 10.1023/A:1007413511361
Duda, 1973, Pattern Classification and Scene Analysis
C. Elkan, Boosting and naïve Bayesian learning, Tech. Report CS97-557, Department of CS and Engineering, University of California, San Diego, USA, 1997
Fayyad, 1997
Friedman, 1997, On bias, variance, 0/1-loss, and the curse of dimensionality, Data Mining and Knowledge Discovery, 1, 55, 10.1023/A:1009778005914
Giacinto, 2001, Dynamic classifier selection based on multiple classifier behaviour, Pattern Recognition, 34, 1879, 10.1016/S0031-3203(00)00150-3
Giacinto, 1999, Methods for dynamic classifier selection, 659
Hansen, 1990, Neural network ensembles, IEEE Transactions on Pattern Analysis and Machine Intelligence, 12, 993, 10.1109/34.58871
Hastie, 1996, Discriminant adaptive nearest-neighbor classification, IEEE Transactions on Pattern Analysis and Machine Intelligence, 18, 607, 10.1109/34.506411
Ho, 1998, The random subspace method for constructing decision forests, IEEE Transactions on Pattern Analysis and Machine Intelligence, 20, 832, 10.1109/34.709601
Kleinberg, 2000, On the algorithmic implementation of stochastic discrimination, IEEE Transactions on Pattern Analysis and Machine Intelligence, 22, 473, 10.1109/34.857004
Kohavi, 1996, Data mining using MLC++: a machine learning library in C++
Kohavi, 1995, A study of cross-validation and bootstrap for accuracy estimation and model selection
M. Koppel, S.P. Engelson, Integrating multiple classifiers by finding their areas of expertise, in: Proc. AAAI-96 Workshop on Integrating Multiple Learned Models, 1996, pp. 53–58
Krogh, 1995, Neural network ensembles, cross validation, and active learning, vol. 7, 231
Merz, 1996, Dynamical selection of learning algorithms
Minsky, 1969, Perceptrons
Opitz, 1999, Popular ensemble methods: an empirical study, Journal of Artificial Intelligence Research, 11, 169
Opitz, 1999, Feature selection for ensembles, 379
T. Pedersen, A simple approach to building ensembles of naive Bayesian classifiers for word sense disambiguation, in: Proc. 1st Annual Meeting of the North American Chapter of the Association for Computational Linguistics, Seattle, WA, 2000, pp. 63–69
Puuronen, 1999, A dynamic integration algorithm for an ensemble of classifiers, vol. 1609, 592
Puuronen, 2001, Local feature selection with dynamic integration of classifiers, Fundamenta Informaticae, Special Issue on Intelligent Information Systems, 47, 91
G. Ridgeway, D. Madigan, T. Richardson, J. O’Kane, Interpretable boosted naive Bayes classification, in: Proc. 4th Int. Conf. On Knowledge Discovery and Data Mining KDD’98, 1998, pp. 101–104
M. Skurichina, R.P.W. Duin, Bagging and the random subspace method for redundant feature spaces, in: J. Kittler, F. Roli (Eds.), Proc. 2nd Int. Workshop on Multiple Classifier Systems MCS 2001, Cambridge, UK, 2001, pp. 1–10
Todorovski, 2000, Combining multiple models with meta decision trees, vol. 1910, 54
A. Tsymbal, S. Puuronen, I. Skrypnyk, Ensemble feature selection with dynamic integration of classifiers, in: Proc. Int. ICSC Congress on Computational Intelligence Methods and Applications CIMA’2001, Bangor, Wales, UK, 2001, pp. 558–564
Tumer, 1996, Error correlation and error reduction in ensemble classifiers, Connection Science, Special Issue on Combining Artificial Neural Networks: Ensemble Approaches, 8, 385
Witten, 1999, Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations
Wolpert, 1992, Stacked generalization, Neural Networks, 5, 241, 10.1016/S0893-6080(05)80023-1
Woods, 1997, Combination of multiple classifiers using local accuracy estimates, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19, 405, 10.1109/34.588027
Zenobi, 2001, Using diversity in preparing ensembles of classifiers based on different feature subsets to minimize generalization error, vol. 2167, 576
Zheng, 1998, Naïve Bayesian classifier committees, vol. 1398, 196