Approximation of functions from Korobov spaces by deep convolutional neural networks
Abstract
Keywords