Self-gated rectified linear unit for performance improvement of deep neural networks
References
V. Nair, G.E. Hinton, Rectified linear units improve restricted Boltzmann machines, in: Proc. International Conference on Machine Learning, Haifa, Israel, 2010, pp. 807–814.
X. Glorot, A. Bordes, Y. Bengio, Deep sparse rectifier neural networks, in: Proc. International Conference on Artificial Intelligence and Statistics, Ft. Lauderdale, FL, USA, 2011, pp. 315–323.
A. Apicella, F. Donnarumma, F. Isgrò, R. Prevete, A survey on modern trainable activation functions, Neural Netw. 138 (2021) 14–32, https://doi.org/10.1016/j.neunet.2021.01.026.
D.-A. Clevert, T. Unterthiner, S. Hochreiter, Fast and accurate deep network learning by exponential linear units (ELUs), 2015, arXiv preprint arXiv:1511.07289.
G. Klambauer, T. Unterthiner, A. Mayr, S. Hochreiter, Self-normalizing neural networks, in: Proc. Advances in Neural Information Processing Systems, Long Beach, CA, USA, 2017.
P. Ramachandran, B. Zoph, Q.V. Le, Searching for activation functions, 2017, arXiv preprint arXiv:1710.05941.
L. Lu, Y. Shin, Y. Su, G.E. Karniadakis, Dying ReLU and initialization: Theory and numerical examples, Commun. Comput. Phys. 28 (2020) 1671–1706, https://doi.org/10.4208/cicp.OA-2020-0165.
S. Elfwing, E. Uchibe, K. Doya, Sigmoid-weighted linear units for neural network function approximation in reinforcement learning, Neural Netw. 107 (2018) 3–11, https://doi.org/10.1016/j.neunet.2017.12.012.
Sharma, 2017
Z. Qiumei, T. Dan, W. Fenghua, Improved convolutional neural network based on fast exponentially linear unit activation function, IEEE Access 7 (2019) 151359–151367, https://doi.org/10.1109/ACCESS.2019.2948112.
Hansen, 2019
T. Szandała, Review and comparison of commonly used activation functions for deep neural networks, in: Bio-inspired Neurocomputing, Springer, Singapore, 2021, pp. 203–224.
L. Deng, The MNIST database of handwritten digit images for machine learning research [Best of the Web], IEEE Signal Process. Mag. 29 (6) (2012) 141–142, https://doi.org/10.1109/MSP.2012.2211477.
H. Xiao, K. Rasul, R. Vollgraf, Fashion-MNIST: A novel image dataset for benchmarking machine learning algorithms, 2017, arXiv preprint arXiv:1708.07747.