[1] KANJIRANGAT V, MELLACE S, ANTONUCCI A.Temporal embeddings and transformer models for narrative text understanding[C]//Proceedings of the 3rd Workshop on Narrative Extraction from Texts Co-located with the 42nd European Conference on Information Retrieval.Lisbon, Portugal:CEUR-WS, 2020:71-77.
[2] TAKATSU H, YOKOYAMA K, MATSUYAMA Y, et al.Recognition of intentions of users' short responses for conversational news delivery system[C]//Proceedings of the 20th Annual Conference of the International Speech Communication Association.Graz, Austria:ISCA, 2019:1193-1197.
[3] PICKERING T, JORDANOUS A.Applying narrative theory to aid unexpectedness in a story generation system[C]//Proceedings of the 8th International Conference on Computational Creativity.Georgia, USA:ACC, 2017:213-220.
[4] CHAMBERS N, JURAFSKY D.Unsupervised learning of narrative event chains[C]//Proceedings of the 46th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:ACL, 2008:789-797.
[5] JANS B, BETHARD S, VULIC I, et al.Skip n-grams and ranking functions for predicting script events[C]//Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics.Stroudsburg, USA:ACL, 2012:336-344.
[6] GRANROTH-WILDING M, CLARK S.What happens next? Event prediction using a compositional neural network model[C]//Proceedings of the 30th AAAI Conference on Artificial Intelligence.Palo Alto, USA:AAAI, 2016:2727-2733.
[7] DING X, LIAO K, LIU T, et al.Event representation learning enhanced with external commonsense knowledge[C]//Proceedings of 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing.Stroudsburg, USA:ACL, 2019:4893-4902.
[8] VASWANI A, SHAZEER N, PARMAR N, et al.Attention is all you need[C]//Proceedings of NIPS'17.[S.l.]:NIPS, 2017:5998-6008.
[9] HOCHREITER S, SCHMIDHUBER J.Long short-term memory[J].Neural Computation, 1997, 9(8):1735-1780.
[10] SCARSELLI F, GORI M, TSOI A C, et al.The graph neural network model[J].IEEE Transactions on Neural Networks, 2009, 20(1):61-80.
[11] SCHANK R C, ABELSON R P.Scripts, plans, goals, and understanding[M].New York, USA:Psychology Press, 2013.
[12] BALASUBRAMANIAN N, SODERLAND S, MAUSAM, et al.Generating coherent event schemas at scale[C]//Proceedings of 2013 Conference on Empirical Methods in Natural Language Processing.Stroudsburg, USA:ACL, 2013:1721-1731.
[13] PICHOTTA K, MOONEY R J.Statistical script learning with multi-argument events[C]//Proceedings of the 14th Conference of the European Chapter of the Association for Computational Linguistics.Stroudsburg, USA:ACL, 2014:220-229.
[14] MIKOLOV T, SUTSKEVER I, CHEN K, et al.Distributed representations of words and phrases and their compositionality[C]//Proceedings of NIPS'13.[S.l.]:NIPS, 2013:3111-3119.
[15] WANG Z Q, ZHANG Y, CHANG C Y.Integrating order information and event relation for script event prediction[C]//Proceedings of 2017 Conference on Empirical Methods in Natural Language Processing.Stroudsburg, USA:ACL, 2017:57-67.
[16] LÜ S W, QIAN W H, HUANG L T, et al.SAM-Net:integrating event-level and chain-level attentions to predict what happens next[C]//Proceedings of the 33rd AAAI Conference on Artificial Intelligence.Palo Alto, USA:AAAI, 2019:6802-6809.
[17] LI Z Y, DING X, LIU T.Constructing narrative event evolutionary graph for script event prediction[C]//Proceedings of the 27th International Joint Conference on Artificial Intelligence.Stockholm, Sweden:IJCAI, 2018:4201-4207.
[18] PENNINGTON J, SOCHER R, MANNING C.GloVe:global vectors for word representation[C]//Proceedings of 2014 Conference on Empirical Methods in Natural Language Processing.Stroudsburg, USA:ACL, 2014:1532-1543.
[19] LI Y J, TARLOW D, BROCKSCHMIDT M, et al.Gated graph sequence neural networks[C]//Proceedings of the 4th International Conference on Learning Representations.San Juan, Puerto Rico:ICLR, 2016:1-20.
[20] CHEN Y B, YANG H, LIU K, et al.Collective event detection via a hierarchical and bias tagging networks with gated multi-level attention mechanisms[C]//Proceedings of 2018 Conference on Empirical Methods in Natural Language Processing.Stroudsburg, USA:ACL, 2018:1267-1276.
[21] HU L M, LI J Z, NIE L Q, et al.What happens next? Future subevent prediction using contextual hierarchical LSTM[C]//Proceedings of the 31st AAAI Conference on Artificial Intelligence.Palo Alto, USA:AAAI, 2017:3450-3456.
[22] HE H.HanLP:Han language processing[EB/OL].[2020-10-25].https://github.com/hankcs/HanLP.