Quantum-chemical insights from deep tensor neural networks
Abstract
Keywords
References
Kang, B. & Ceder, G. Battery materials for ultrafast charging and discharging. Nature 458, 190–193 (2009).
Nørskov, J. K., Bligaard, T., Rossmeisl, J. & Christensen, C. H. Towards the computational design of solid catalysts. Nat. Chem. 1, 37–46 (2009).
Hachmann, J. et al. The Harvard clean energy project: large-scale computational screening and design of organic photovoltaics on the world community grid. J. Phys. Chem. Lett. 2, 2241–2251 (2011).
Pyzer-Knapp, E. O., Suh, C., Gómez-Bombarelli, R., Aguilera-Iparraguirre, J. & Aspuru-Guzik, A. What is high-throughput virtual screening? A perspective from organic materials discovery. Annu. Rev. Mater. Res. 45, 195–216 (2015).
Curtarolo, S. et al. The high-throughput highway to computational materials design. Nat. Mater. 12, 191–201 (2013).
Snyder, J. C., Rupp, M., Hansen, K., Müller, K.-R. & Burke, K. Finding density functionals with machine learning. Phys. Rev. Lett. 108, 253002 (2012).
Rupp, M., Tkatchenko, A., Müller, K.-R. & von Lilienfeld, O. A. Fast and accurate modeling of molecular atomization energies with machine learning. Phys. Rev. Lett. 108, 058301 (2012).
Ramakrishnan, R., Dral, P. O., Rupp, M. & von Lilienfeld, O. A. Big data meets quantum chemistry approximations: the Δ-machine learning approach. J. Chem. Theory Comput. 11, 2087–2096 (2015).
Bishop, C. M. Pattern Recognition and Machine Learning (Springer, 2006).
Ghiringhelli, L. M., Vybiral, J., Levchenko, S. V., Draxl, C. & Scheffler, M. Big data of materials science: critical role of the descriptor. Phys. Rev. Lett. 114, 105503 (2015).
Schütt, K. et al. How to represent crystal structures for machine learning: towards fast prediction of electronic properties. Phys. Rev. B 89, 205118 (2014).
Montavon, G. et al. Machine learning of molecular electronic properties in chemical compound space. New J. Phys. 15, 095003 (2013).
Hansen, K. et al. Assessment and validation of machine learning methods for predicting molecular atomization energies. J. Chem. Theory Comput. 9, 3404–3419 (2013).
Hirn, M., Poilvert, N. & Mallat, S. Quantum energy regression using scattering transforms. Preprint at https://arxiv.org/abs/1502.02077 (2015).
Hansen, K. et al. Machine learning predictions of molecular properties: accurate many-body potentials and nonlocality in chemical space. J. Phys. Chem. Lett. 6, 2326 (2015).
Bartók, A. P., Kondor, R. & Csányi, G. On representing chemical environments. Phys. Rev. B 87, 184115 (2013).
Bartók, A. P., Payne, M. C., Kondor, R. & Csányi, G. Gaussian approximation potentials: the accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett. 104, 136403 (2010).
Behler, J. Atom-centered symmetry functions for constructing high-dimensional neural network potentials. J. Chem. Phys. 134, 074106 (2011).
Behler, J. Neural network potential-energy surfaces in chemistry: a tool for large-scale simulations. Phys. Chem. Chem. Phys. 13, 17930–17955 (2011).
Montavon, G., Braun, M. L. & Müller, K.-R. Kernel analysis of deep networks. J. Mach. Learn. Res. 12, 2563–2581 (2011).
Ciresan, D., Meier, U. & Schmidhuber, J. Multi-column deep neural networks for image classification. In Proc. Conference on Computer Vision and Pattern Recognition. 3642–3649 (2012).
Krizhevsky, A., Sutskever, I. & Hinton, G. E. ImageNet classification with deep convolutional neural networks. In Proc. Advances in Neural Information Processing Systems. 25, 1097–1105 (2012).
LeCun, Y. & Bengio, Y. in The Handbook of Brain Theory and Neural Networks (ed. Arbib, M. A.) 255–257 (The MIT Press, Cambridge, MA, USA, 1995).
Hinton, G. et al. Deep neural networks for acoustic modeling in speech recognition: the shared views of four research groups. IEEE Signal Process. Mag. 29, 82–97 (2012).
Sainath, T. N. et al. Deep convolutional neural networks for large-scale speech tasks. Neural Netw. 64, 39–48 (2015).
Collobert, R. & Weston, J. A unified architecture for natural language processing: deep neural networks with multitask learning. In Proc. 25th International Conference on Machine Learning. 160–167 (2008).
Scarselli, F., Gori, M., Tsoi, A. C., Hagenbuchner, M. & Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. 20, 61–80 (2009).
Duvenaud, D. K. et al. Convolutional networks on graphs for learning molecular fingerprints. In Proc. Advances in Neural Information Processing Systems. 28, 2224–2232 (2015).
Socher, R. et al. Recursive deep models for semantic compositionality over a sentiment treebank. In Proc. Conference on Empirical Methods in Natural Language Processing (EMNLP). 1631–1642 (2013).
Sutskever, I., Martens, J. & Hinton, G. E. Generating text with recurrent neural networks. In Proc. 28th International Conference on Machine Learning. 1017–1024 (2011).
Socher, R., Chen, D., Manning, C. D. & Ng, A. Reasoning with neural tensor networks for knowledge base completion. In Proc. Advances in Neural Information Processing Systems. 26, 926–934 (2013).
Taylor, G. W. & Hinton, G. E. Factored conditional restricted Boltzmann machines for modeling motion style. In Proc. 26th Annual International Conference on Machine Learning. 1025–1032 (2009).
Blum, L. C. & Reymond, J.-L. 970 million druglike small molecules for virtual screening in the chemical universe database GDB-13. J. Am. Chem. Soc. 131, 8732 (2009).
Ramakrishnan, R., Dral, P. O., Rupp, M. & von Lilienfeld, O. A. Quantum chemistry structures and properties of 134 kilo molecules. Sci. Data 1, 140022 (2014).
von Lilienfeld, O. A. First principles view on chemical compound space: gaining rigorous atomistic control of molecular properties. Int. J. Quantum Chem. 113, 1676–1689 (2013).
De, S., Bartók, A. P., Csányi, G. & Ceriotti, M. Comparing molecules and solids across structural and alchemical space. Phys. Chem. Chem. Phys. 18, 13754–13769 (2016).
Malshe, M. et al. Development of generalized potential-energy surfaces using many-body expansions, neural networks, and moiety energy approximations. J. Chem. Phys. 130, 184102 (2009).
Manzhos, S. & Carrington, T. Jr A random-sampling high dimensional model representation neural network for building potential energy surfaces. J. Chem. Phys. 125, 084109 (2006).
Manzhos, S. & Carrington, T. Jr Using neural networks, optimized coordinates, and high-dimensional model representations to obtain a vinyl bromide potential surface. J. Chem. Phys. 129, 224104 (2008).
Behler, J. & Parrinello, M. Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett. 98, 146401 (2007).
Perdew, J. P., Ernzerhof, M. & Burke, K. Rationale for mixing exact exchange with density functional approximations. J. Chem. Phys. 105, 9982–9985 (1996).
Becke, A. D. Density-functional exchange-energy approximation with correct asymptotic behavior. Phys. Rev. A 38, 3098–3100 (1988).
Lee, C., Yang, W. & Parr, R. G. Development of the Colle-Salvetti correlation-energy formula into a functional of the electron density. Phys. Rev. B 37, 785–789 (1988).
Vosko, S. H., Wilk, L. & Nusair, M. Accurate spin-dependent electron liquid correlation energies for local spin density calculations: a critical analysis. Can. J. Phys. 58, 1200–1211 (1980).
Stephens, P., Devlin, F., Chabalowski, C. & Frisch, M. J. Ab initio calculation of vibrational absorption and circular dichroism spectra using density functional force fields. J. Phys. Chem. 98, 11623–11627 (1994).
Becke, A. D. Becke's 3 parameter functional combined with the non-local correlation LYP. J. Chem. Phys. 98, 5648 (1993).
Perdew, J. P., Burke, K. & Ernzerhof, M. Generalized gradient approximation made simple. Phys. Rev. Lett. 77, 3865–3868 (1996).
Glorot, X. & Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proc. 13th International Conference on Artificial Intelligence and Statistics. 249–256 (2010).
LeCun, Y. A., Bottou, L., Orr, G. B. & Müller, K.-R. in Neural Networks: Tricks of the Trade 9–48 (Springer, 2012).