Consistency measures for feature selection

Antonio Araúzo-Azofra1, Jose M. Benítez2, Juan Luis Castro2
1Department of Rural Engineering, University of Cordoba, Cordoba, Spain
2Department of Computer Science and Artificial Intelligence, University of Granada, Granada, Spain

Abstract

Keywords


References

Almuallim, H., & Dietterich, T. G. (1991). Learning with many irrelevant features. In Proceedings of the ninth national conference on artificial intelligence (AAAI-91), Anaheim, CA, vol. 2 (pp. 547–552). Menlo Park, CA: AAAI Press.

Almuallim, H., & Dietterich, T. G. (1994). Learning boolean concepts in the presence of many irrelevant features. Artificial Intelligence, 69(1–2), 279–305.

Araúzo-Azofra, A., Benítez, J. M., & Castro, J. L. (2003a). C-FOCUS: A continuous extension of FOCUS. In Proceedings of the 7th online world conference on soft computing in industrial applications (pp. 225–232).

Araúzo-Azofra, A., Benítez-Sánchez, J. M., & Castro-Peña, J. L. (2003b). A feature selection algorithm with fuzzy information. In Proceedings of the 10th IFSA world congress (pp. 220–223).

Blum, A. L., & Langley, P. (1997). Selection of relevant features and examples in machine learning. Artificial Intelligence, 97, 245–271.

Boros, E., Hammer, P. L., Ibaraki, T., Kogan, A., Mayoraz, E., & Muchnik, I. (2000). An implementation of logical analysis of data. IEEE Transactions on Knowledge and Data Engineering, 12(2), 292–306.

Brill, F. Z., Brown, D. E., & Martin, W. N. (1992). Fast genetic selection of features for neural network classifiers. IEEE Transactions on Neural Networks, 3(2), 324–328.

Chmielewski, M. R., & Grzymala-Busse, J. W. (1996). Global discretization of continuous attributes as preprocessing for machine learning. International Journal of Approximate Reasoning, 15(4), 319–331.

Dash, M. (1997). Feature selection via set cover. In Proceedings of the IEEE knowledge and data engineering exchange workshop.

Dash, M., & Liu, H. (1997). Feature selection for classification. Intelligent Data Analysis, 1(1–4), 131–156.

Dash, M., & Liu, H. (2003). Consistency-based search in feature selection. Artificial Intelligence, 151(1–2), 155–176.

Demsar, J., & Zupan, B. (2004). Orange: From experimental machine learning to interactive data mining. (White paper) http://www.ailab.si/orange.

Hettich, S., & Bay, S. D. (1999). The UCI KDD archive. http://kdd.ics.uci.edu/.

Jain, A., & Zongker, D. (1997). Feature selection: Evaluation, application, and small sample performance. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(2), 153–158.

John, G. H., Kohavi, R., & Pfleger, K. (1994). Irrelevant features and the subset selection problem. In International conference on machine learning (pp. 121–129). Journal version in AIJ, available at http://citeseer.nj.nec.com/13663.html.

Kira, K., & Rendell, L. A. (1992). A practical approach to feature selection. In Proceedings of the ninth international workshop on machine learning (pp. 249–256). San Mateo, CA: Morgan Kaufmann.

Kohavi, R. (1994). Feature subset selection as search with probabilistic estimates. In AAAI fall symposium on relevance (pp. 122–126).

Kohavi, R., & John, G. H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97(1–2), 273–324.

Komorowski, J., Pawlak, Z., Polkowski, L., & Skowron, A. (1998). Rough sets: A tutorial. In S. K. Pal, & A. Skowron (Eds.), Rough-fuzzy hybridization: A new trend in decision-making (pp. 3–98). Singapore: Springer.

Kudo, M., & Sklansky, J. (2000). Comparison of algorithms that select features for pattern classifiers. Pattern Recognition, 33(1), 25–41.

Langley, P. (1994). Selection of relevant features in machine learning. In Proceedings of the AAAI fall symposium on relevance, New Orleans, LA. Menlo Park, CA: AAAI Press.

Liu, H., Hussain, F., Tan, C. L., & Dash, M. (2002). Discretization: An enabling technique. Data Mining and Knowledge Discovery, 6, 393–423.

Liu, H., Motoda, H., & Dash, M. (1998). A monotonic measure for optimal feature selection. In European conference on machine learning (pp. 101–106).

Liu, H., & Setiono, R. (1997). Feature selection via discretization. IEEE Transactions on Knowledge and Data Engineering, 9(4), 642–645.

Modrzejewski, M. (1993). Feature selection using rough sets theory. In Proceedings of the European conference on machine learning (pp. 213–216).

Oliveira, A., & Sangiovanni-Vincentelli, A. (1992). Constructive induction using a non-greedy strategy for feature selection. In Proceedings of the ninth international conference on machine learning, Aberdeen, Scotland (pp. 355–360). San Mateo, CA: Morgan Kaufmann.

Pawlak, Z. (1991). Rough sets, theoretical aspects of reasoning about data. Boston, MA: Kluwer.

Polkowski, L., & Skowron, A., (Eds.) (1998). Rough sets in knowledge discovery. Heidelberg: Physica Verlag.

Schlimmer, J. (1993). Efficiently inducing determinations: A complete and systematic search algorithm that uses optimal pruning. In Proceedings of the tenth international conference on machine learning (pp. 289–290).

Somol, P., & Pudil, P. (2004). Fast branch & bound algorithms for optimal feature selection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(7), 900–912.

Tay, F. E. H., & Shen, L. (2002). A modified chi2 algorithm for discretization. IEEE Transactions on Knowledge and Data Engineering, 14(3), 666–670.

Wettschereck, D., Aha, D. W., & Mohri, T. (1997). A review and empirical evaluation of feature weighting methods for a class of lazy learning algorithms. Artificial Intelligence Review, 11(1–5), 273–314.

Zhong, N., Dong, J., & Ohsuga, S. (2001). Using rough sets with heuristics for feature selection. Journal of Intelligent Information Systems, 16(3), 199–214.