Vapnik-Chervonenkis dimension of recurrent neural networks
References
Baum (1989). What size net gives valid generalization? Neural Comput. 1, 151. doi:10.1162/neco.1989.1.1.151
Bengio (1996).
Blumer (1989). Learnability and the Vapnik-Chervonenkis dimension. J. ACM 36, 929. doi:10.1145/76359.76371
Cover (1968). Capacity problems for linear machines. p. 283.
Dasgupta (1996). Sample complexity for learning recurrent perceptron mappings. IEEE Trans. Inform. Theory 42, 1479. doi:10.1109/18.532888
(1996). p. 204.
Giles (1990). Higher order recurrent networks and grammatical inference.
Goldberg (1995). Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers. Machine Learning 18, 131. doi:10.1007/BF00993408
Grossberg (1987). 2 vols.
Hopfield (1982). Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA 79, 2554.
Karpinski (1997). Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks. J. Comput. System Sci. 54, 169. doi:10.1006/jcss.1997.1477
Koiran (1997). Neural networks with quadratic VC dimension. J. Comput. System Sci. 54, 190. doi:10.1006/jcss.1997.1479
Polycarpou (1992). Neural networks and on-line approximators for adaptive control. p. 793.
Siegelmann (1995). On the computational power of neural nets. J. Comput. System Sci. 50, 132. doi:10.1006/jcss.1995.1013
Siegelmann (1994). Analog computation, neural networks, and circuits. Theoret. Comput. Sci. 131, 331. doi:10.1016/0304-3975(94)90178-3
Sontag (1990).
Sontag (1992). Neural nets as systems models and controllers. p. 73.
Sontag (1992). Feedforward nets for interpolation and classification. J. Comput. System Sci. 45, 20. doi:10.1016/0022-0000(92)90039-L