Modeling deterministic echo state network with loop reservoir
Abstract
The echo state network (ESN) has been proposed as a special form of recurrent neural network that efficiently models nonlinear dynamic systems. However, most proposed ESNs rely on complex reservoir structures, leading to excessive computational cost. Recently, minimum complexity ESNs were proposed and shown to exhibit high performance at low computational cost. In this paper, we propose a simple deterministic ESN with a loop reservoir, i.e., an ESN with an adjacent-feedback loop reservoir. The novel reservoir is constructed by introducing regular adjacent feedback into the simplest loop reservoir. Only a single free parameter needs to be tuned, which considerably simplifies the construction of the ESN. The combination of a simplified reservoir and fewer free parameters yields superior prediction performance. On benchmark datasets and real-world tasks, our scheme achieves higher prediction accuracy with relatively low complexity than both the classic ESN and the minimum complexity ESN. Furthermore, we prove that all linear ESNs with the simplest loop reservoir possess the same memory capacity, which converges arbitrarily close to the optimal value.
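To make the architecture described above concrete, the following is a minimal Python/NumPy sketch of one plausible reading of an adjacent-feedback loop reservoir: a single directed cycle of units augmented with a feedback edge from each unit back to its predecessor, all connections sharing one weight r (the single tuned free parameter). The function names, the exact placement of the feedback weights, the input-weight scheme, and the ridge-regression readout are illustrative assumptions for this sketch, not the authors' exact construction.

import numpy as np

def adjacent_feedback_loop_reservoir(n, r):
    # Hypothetical reservoir matrix: forward loop i -> i+1 plus an
    # adjacent feedback edge i+1 -> i, all with the same weight r.
    W = np.zeros((n, n))
    for i in range(n):
        W[(i + 1) % n, i] = r   # forward loop connection
        W[i, (i + 1) % n] = r   # adjacent feedback connection
    return W

def esn_states(W, W_in, u, leak=1.0):
    # Drive the reservoir with an input sequence u (T x d) and collect states.
    n = W.shape[0]
    x = np.zeros(n)
    states = []
    for t in range(len(u)):
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u[t])
        states.append(x.copy())
    return np.array(states)

# Toy one-step-ahead prediction with a ridge-regression readout.
rng = np.random.default_rng(0)
u = np.sin(np.arange(500) * 0.2)[:, None]           # toy scalar input series
W = adjacent_feedback_loop_reservoir(n=100, r=0.5)  # r is the single free parameter
# Fixed-magnitude input weights; signs are drawn randomly here for brevity,
# whereas a fully deterministic construction would also fix the sign pattern (assumption).
W_in = rng.choice([-0.1, 0.1], size=(100, 1))
X = esn_states(W, W_in, u[:-1])
y = u[1:, 0]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(100), X.T @ y)

Because the reservoir itself is deterministic, only r (and the readout regularization) has to be searched, which is the simplification the abstract emphasizes relative to randomly generated reservoirs.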