Extreme learning machine: Theory and applications

Neurocomputing - Volume 70 - Pages 489-501 - 2006
Guang-Bin Huang1, Qin-Yu Zhu1, Chee-Kheong Siew1
1School of Electrical and Electronic Engineering, Nanyang Technological University, Nanyang Avenue, Singapore 639798, Singapore

References

Bartlett, 1998, The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network, IEEE Trans. Inf. Theory, 44, 525, doi:10.1109/18.661502

C. Blake, C. Merz, UCI repository of machine learning databases, http://www.ics.uci.edu/~mlearn/MLRepository.html, Department of Information and Computer Sciences, University of California, Irvine, USA, 1998.

Collobert, 2002, A parallel mixture of SVMs for very large scale problems, Neural Comput., 14, 1105, doi:10.1162/089976602753633402

Ferrari, 2005, Smooth function approximation using neural networks, IEEE Trans. Neural Networks, 16, 24, doi:10.1109/TNN.2004.836233

Y. Freund, R.E. Schapire, Experiments with a new boosting algorithm, in: International Conference on Machine Learning, 1996, pp. 148–156.

Haykin, 1999

Hornik, 1991, Approximation capabilities of multilayer feedforward networks, Neural Networks, 4, 251, doi:10.1016/0893-6080(91)90009-T

Hsu, 2002, A comparison of methods for multiclass support vector machines, IEEE Trans. Neural Networks, 13, 415, doi:10.1109/72.991427

G.-B. Huang, Learning capability of neural networks, Ph.D. Thesis, Nanyang Technological University, Singapore, 1998.

Huang, 2003, Learning capability and storage capacity of two-hidden-layer feedforward networks, IEEE Trans. Neural Networks, 14, 274, doi:10.1109/TNN.2003.809401

Huang, 1998, Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions, IEEE Trans. Neural Networks, 9, 224, doi:10.1109/72.655045

Huang, 2000, Classification ability of single hidden layer feedforward neural networks, IEEE Trans. Neural Networks, 11, 799, doi:10.1109/72.846750

Huang, 2004, Extreme learning machine

Huang, 2005, Extreme learning machine with randomly assigned RBF kernels, Int. J. Inf. Technol., 11

G.-B. Huang, Q.-Y. Zhu, K.Z. Mao, C.-K. Siew, P. Saratchandran, N. Sundararajan, Can threshold networks be trained directly?, IEEE Trans. Circuits Syst. II 53 (3) (2006) 187–191.

G.-B. Huang, Q.-Y. Zhu, C.-K. Siew, Real-time learning capability of neural networks, Technical Report ICIS/45/2003, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, April 2003.

Leshno, 1993, Multilayer feedforward networks with a nonpolynomial activation function can approximate any function, Neural Networks, 6, 861, doi:10.1016/S0893-6080(05)80131-5

Ortega, 1987

Rao, 1971

G. Rätsch, T. Onoda, K.R. Müller, An improvement of AdaBoost to avoid overfitting, in: Proceedings of the Fifth International Conference on Neural Information Processing (ICONIP’1998), 1998.

E. Romero, A new incremental method for function approximation using feed-forward neural networks, in: Proceedings of the INNS-IEEE International Joint Conference on Neural Networks (IJCNN’2002), 2002, pp. 1968–1973.

Serre, 2002

Tamura, 1997, Capabilities of a four-layered feedforward neural network: four layers versus three, IEEE Trans. Neural Networks, 8, 251, doi:10.1109/72.557662

Wang, 2005, Protein sequence classification using extreme learning machine

D.R. Wilson, T.R. Martinez, Heterogeneous radial basis function networks, in: Proceedings of the International Conference on Neural Networks (ICNN’96), June 1996, pp. 1263–1267.

G.-B. Huang, L. Chen, C.-K. Siew, Universal approximation using incremental networks with random hidden computation nodes, IEEE Trans. Neural Networks 17 (4) (2006).