Spectral regularization for combating mode collapse in GANs

Image and Vision Computing - Volume 104 - Page 104005 - 2020
Kanglin Liu (1,2,3), Guoping Qiu (1,2,3,4), Wenming Tang (1,2,3), Fei Zhou (1,2,3)
(1) Shenzhen University, Shenzhen, China
(2) Guangdong Key Laboratory of Intelligent Information Processing, Shenzhen, China
(3) Shenzhen Institute of Artificial Intelligence and Robotics for Society, Shenzhen, China
(4) University of Nottingham, Nottingham, United Kingdom

References

Goodfellow, 2014, Generative adversarial nets, Adv. Neural Inf. Proces. Syst., 2672
Karras, 2019, A style-based generator architecture for generative adversarial networks, Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 4401
Huang, 2017, Arbitrary style transfer in real-time with adaptive instance normalization, Proc. IEEE Int. Conf. Comput. Vis., 1501
Radford, 2015, Unsupervised representation learning with deep convolutional generative adversarial networks
Arjovsky, 2017, Wasserstein GAN
Wu, 2017, Energy-relaxed Wasserstein GANs (EnergyWGAN): Towards more stable and high resolution image generation
Isogawa, 2018, Which is the better inpainted image? Training data generation without any manual operations, Int. J. Comput. Vis., 1
Jamaludin, 2019, You said that?: Synthesising talking faces from audio, Int. J. Comput. Vis., 1
Berthelot, 2017, BEGAN: Boundary equilibrium generative adversarial networks
Mao, 2017, Least squares generative adversarial networks, 2813
Chen, 2016, InfoGAN: Interpretable representation learning by information maximizing generative adversarial nets, Adv. Neural Inf. Proces. Syst., 2172
Fedus, 2017, Many paths to equilibrium: GANs do not need to decrease a divergence at every step
Brock, 2018, Large scale GAN training for high fidelity natural image synthesis
Karras, 2017, Progressive growing of GANs for improved quality, stability, and variation
Kodali, 2017
Mescheder, 2018, Which training methods for GANs do actually converge?
Gulrajani, 2017, Improved training of Wasserstein GANs, Adv. Neural Inf. Proces. Syst., 5769
Qi, 2020, 128, 1118
Miyato, 2018
Salimans, 2016, Weight normalization: A simple reparameterization to accelerate training of deep neural networks, Adv. Neural Inf. Proces. Syst., 901
Liu, 2019, Spectral regularization for combating mode collapse in GANs, Proc. IEEE Int. Conf. Comput. Vis., 6382
Brock, 2016, Neural photo editing with introspective adversarial networks
Heinonen, 2005
Arpit, 2016, Normalization propagation: A parametric technique for removing internal covariate shift in deep networks
Torralba, 2008, 80 million tiny images: A large data set for non-parametric object and scene recognition, IEEE Trans. Pattern Anal. Mach. Intell., 30, 901, 10.1109/TPAMI.2008.128
Heusel, 2017, GANs trained by a two time-scale update rule converge to a Nash equilibrium, Adv. Neural Inf. Proces. Syst., 6626
Deng, 2009, ImageNet: A large-scale hierarchical image database, 248
Miyato, 2018, cGANs with projection discriminator
Kingma, 2014, Adam: A method for stochastic optimization
Ioffe, 2015, Batch normalization: Accelerating deep network training by reducing internal covariate shift
Salimans, 2016, Improved techniques for training GANs, Adv. Neural Inf. Proces. Syst., 2234
Dowson, 1982, The Fréchet distance between multivariate normal distributions, J. Multivar. Anal., 12, 450, 10.1016/0047-259X(82)90077-X
Zhang, 2019, Self-attention generative adversarial networks, 7354