An eye-tracking attention based model for abstractive text headline

Cognitive Systems Research, Volume 58, Pages 253-264, 2019
Jiehang Xie, Xiaoming Wang, Xinyan Wang, Guangyao Pang, Xueyang Qin

References

Alsabahi (2018). A hierarchical structured self-attentive model for extractive document summarization (HSSAS). IEEE Access, 6, 24205. doi:10.1109/ACCESS.2018.2829199

Barrett, M., Bingel, J., Keller, F., & Søgaard, A. (2016). Weakly supervised part-of-speech tagging using eye-tracking data. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL), Berlin, Germany, Volume 2: Short Papers.

Bing, L., Li, P., Liao, Y., Lam, W., Guo, W., & Passonneau, R. J. (2015). Abstractive multi-document summarization via phrase selection and merging. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics (ACL), Beijing, China, Volume 1: Long Papers (pp. 1587-1597).

Cheng, J., & Lapata, M. (2016). Neural summarization by extracting sentences and words. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL), Berlin, Germany, Volume 1: Long Papers. doi:10.18653/v1/p16-1046

Chen (2018). An information distillation framework for extractive summarization. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 26, 161. doi:10.1109/TASLP.2017.2764545

Chopra, S., Auli, M., & Rush, A. M. (2016). Abstractive sentence summarization with attentive recurrent neural networks. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL), San Diego, California (pp. 93-98).

Cho (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation (p. 1724).

Cop (2016). Presenting GECO: An eyetracking corpus of monolingual and bilingual sentence reading. Behavior Research Methods, 49, 1.

Demberg (2008). Data from eye-tracking corpora as evidence for theories of syntactic processing complexity. Cognition, 109, 193. doi:10.1016/j.cognition.2008.07.008

Durrett, G., Berg-Kirkpatrick, T., & Klein, D. (2016). Learning-based single-document summarization with compression and anaphoricity constraints. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL), Berlin, Germany, Volume 1: Long Papers (pp. 1998-2008).

Elman (1990). Finding structure in time. Cognitive Science, 14, 179. doi:10.1207/s15516709cog1402_1

Garg (2018). Identifying influential segments from word co-occurrence networks using AHP. Cognitive Systems Research, 47, 28. doi:10.1016/j.cogsys.2017.07.003

Glorot (2010). Understanding the difficulty of training deep feedforward neural networks (p. 249).

Hochreiter (1997). Long short-term memory. Neural Computation, 9, 1735. doi:10.1162/neco.1997.9.8.1735

Joshi, A., Mishra, A., Senthamilselvan, N., & Bhattacharyya, P. (2014). Measuring sentiment annotation complexity of text. In Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (ACL), Baltimore, MD, USA, Volume 2: Short Papers (pp. 36-41).

Kennedy (2003).

Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.

Lin, C. Y. (2004). Looking for a few good metrics: ROUGE and its evaluation. In Proceedings of the Evaluation of Information Access Technologies (NTCIR), Tokyo, Japan.

Ling (2017). Coarse-to-fine attention models for document summarization (p. 33).

Long (2017). A cognition based attention model for sentiment analysis (p. 462).

Marujo (2016). Exploring events and distributed representations of text in multi-document summarization. Knowledge-Based Systems, 94, 33. doi:10.1016/j.knosys.2015.11.005

Mishra (2017). Scanpath complexity: Modeling reading effort using gaze information (p. 77).

Nallapati, R., Zhou, B., dos Santos, C. N., Gulcehre, C., & Xiang, B. (2016). Abstractive text summarization using sequence-to-sequence RNNs and beyond (pp. 280-290).

Napoles (2012). Annotated Gigaword (p. 95).

Raffel, C., Luong, M., Liu, P. J., Weiss, R. J., & Eck, D. (2017). Online and linear-time attention by enforcing monotonic alignments. In Proceedings of the 34th International Conference on Machine Learning (ICML), Sydney, NSW, Australia (pp. 2837-2846).

Reilly (2006). Some empirical tests of an interactive activation model of eye movement control in reading. Cognitive Systems Research, 7, 34. doi:10.1016/j.cogsys.2005.07.006

Rush (2015). A neural attention model for abstractive sentence summarization (p. 379).

Toutanova (2016). A dataset and evaluation metrics for abstractive compression of sentences and short paragraphs (p. 340).

Wang (2017). How far we can go with extractive text summarization? Heuristic methods to obtain near upper bounds. Expert Systems with Applications, 90, 439. doi:10.1016/j.eswa.2017.08.040

Wang (2015). Summarization based on task-oriented discourse parsing. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 23, 1358. doi:10.1109/TASLP.2015.2432573

Yao (2017). Recent advances in document summarization. Knowledge and Information Systems, 53, 297. doi:10.1007/s10115-017-1042-4

Zhou, Q., Yang, N., Wei, F., & Zhou, M. (2017). Selective encoding for abstractive sentence summarization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL), Vancouver, Canada, Volume 1: Long Papers (pp. 1095-1104).