Classification trees with soft splits optimized for ranking
Abstract
We consider softening of splits in classification trees generated from multivariate numerical data. This methodology improves the quality of the ranking of the test cases as measured by the AUC. Several ways to determine the softening parameters are introduced and compared, including the softening algorithm present in the standard methods C4.5 and C5.0. In the first part of the paper, a few settings of softening determined only from the ranges of the training data in the tree branches are explored. The trees softened with these settings are used to study the effect of combining the Laplace correction with soft splits. In the second part we introduce methods that maximize the classifier's performance on the training set over the domain of the softening parameters. The non-linear optimization algorithm of Nelder and Mead is used, and various target functions are considered. A target function evaluating the AUC on the training set is compared with functions that sum, over the training cases, some transformation of the error of the score. Several data sets from the UCI repository are used in the experiments.
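To make the methodology concrete, the following minimal sketch illustrates the two ingredients of the abstract on a single soft split ("stump"): Laplace-corrected leaf probabilities combined through soft branch weights, and a softening parameter tuned by the Nelder–Mead method against the training AUC. It is a sketch under our own assumptions, not the authors' implementation: the logistic form of the split, the names SoftStump and fit_width, and the use of scipy and scikit-learn are illustrative choices.

```python
# A minimal sketch, assuming a single-split "stump", a logistic soft split,
# and the scipy/scikit-learn stack; all names here are illustrative.
import numpy as np
from scipy.optimize import minimize
from sklearn.metrics import roc_auc_score


class SoftStump:
    """One soft split: each case is sent to BOTH branches with weights."""

    def __init__(self, feature, threshold, n_pos_left, n_left, n_pos_right, n_right):
        self.feature = feature
        self.threshold = threshold
        # Laplace correction for a two-class leaf: p = (n_pos + 1) / (n + 2).
        self.p_left = (n_pos_left + 1.0) / (n_left + 2.0)
        self.p_right = (n_pos_right + 1.0) / (n_right + 2.0)

    def score(self, X, width):
        """Positive-class score; width controls how soft the split is.
        As width -> 0 this degenerates to the usual crisp split."""
        z = (X[:, self.feature] - self.threshold) / max(width, 1e-12)
        w_right = 1.0 / (1.0 + np.exp(-np.clip(z, -500.0, 500.0)))
        return (1.0 - w_right) * self.p_left + w_right * self.p_right


def fit_width(stump, X, y):
    """Tune the softening parameter by maximizing the training AUC with
    the Nelder-Mead simplex method (scipy's implementation)."""
    def neg_auc(params):
        return -roc_auc_score(y, stump.score(X, abs(params[0])))

    res = minimize(neg_auc, x0=np.array([1.0]), method="Nelder-Mead")
    return abs(res.x[0])


# An alternative target of the kind the abstract mentions: a sum over the
# training cases of a transformation of the score error, here the square.
def squared_score_error(stump, X, y, width):
    return float(np.sum((y - stump.score(X, width)) ** 2))
```

In a full tree, the score of a case would be the weighted sum of the Laplace-corrected leaf probabilities over all root-to-leaf paths, the weight of each path being the product of the branch weights along it, and the optimization would then run over one softening parameter per internal node.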
References
Breiman L, Friedman J, Olshen R, Stone C (1984) Classification and regression trees. Wadsworth and Brooks, Monterey
Carter C, Catlett J (1987) Assessing credit card applications using machine learning. IEEE Expert 2(3):71–79
Chen M, Ludwig SA (2013) Fuzzy decision tree using soft discretization and a genetic algorithm based feature selection method. In: 2013 World congress on nature and biologically inspired computing (NaBIC). IEEE, pp 238–244
Clémençon S, Depecker M, Vayatis N (2013) Ranking forests. J Mach Learn Res 14(1):39–73
Fawcett T (2006) An introduction to ROC analysis. Pattern Recognit Lett 27(8):861–874
Hanley JA, McNeil BJ (1982) The meaning and use of the area under a receiver operating characteristic (ROC) curve. Radiology 143(1):29–36
Hüllermeier E, Vanderlooy S (2009) Why fuzzy decision trees are good rankers. IEEE Trans Fuzzy Syst 17(6):1233–1244
Janikow CZ, Kawa K (2005) Fuzzy decision tree FID. In: Proceedings of NAFIPS, pp 379–384
Jordan MI, Jacobs RA (1994) Hierarchical mixtures of experts and the EM algorithm. Neural Comput 6(2):181–214
Kumar GK, Viswanath P, Rao AA (2016) Ensemble of randomized soft decision trees for robust classification. Sādhanā 41(3):273–282
Leisch F, Dimitriadou E (2009) mlbench: Machine Learning Benchmark Problems. R package version 1.1-6
Liaw A, Wiener M (2002) Classification and regression by randomForest. R News 2(3):18–22
Lichman M (2013) UCI machine learning repository. University of California, School of Information and Computer Sciences, Irvine. http://archive.ics.uci.edu/ml. Accessed 3 Feb 2016
Nelder JA, Mead R (1965) A simplex method for function minimization. Comput J 7(4):308–313
Norouzi M, Collins MD, Johnson M, Fleet DJ, Kohli P (2015) Efficient non-greedy optimization of decision trees. In: Cortes C, Lawrence ND, Lee DD, Sugiyama M, Garnett R (eds) Advances in neural information processing systems. MIT Press, Cambridge, pp 1729–1737
Olaru C, Wehenkel L (2003) A complete fuzzy decision tree technique. Fuzzy Sets Syst 138(2):221–254
Otero FE, Freitas AA, Johnson CG (2012) Inducing decision trees with an ant colony optimization algorithm. Appl Soft Comput 12(11):3615–3626
Quinlan JR (1993) C4.5: programs for machine learning. Morgan Kaufmann Publishers Inc., San Francisco
Sofeikov KI, Tyukin IY, Gorban AN, Mirkes EM, Prokhorov DV, Romanenko IV (2014) Learning optimization for decision tree classification of non-categorical data with information gain impurity criterion. In: 2014 International joint conference on neural networks (IJCNN). IEEE, pp 3548–3555
Suárez A, Lutsko JF (1999) Globally optimal fuzzy decision trees for classification and regression. IEEE Trans Pattern Anal Mach Intell 21(12):1297–1311
Yıldız OT, İrsoy O, Alpaydın E (2016) Bagging soft decision trees. In: Holzinger A (ed) Machine learning for health informatics: state-of-the-art and future challenges. Springer, Cham, pp 25–36