[1] 刘挺, 吴岩, 王开铸.自动文摘综述[J].情报科学, 1998, 16(1):63-69. LIU T, WU Y, WANG K Z.Summary of automation abstract[J].Information Science, 1998, 16(1):63-69.(in Chinese) [2] ROMAN N T, PIWEK P, CARVALHO A M B R.Politeness and bias in dialogue summarization:two exploratory studies[M]//STEFANOV V, TRAIT J.The information retrieval series, Berlin, Germany:Springer, 2006:171-185. [3] 朱永清, 赵鹏, 赵菲菲, 等.基于深度学习的生成式文本摘要技术综述[J].计算机工程, 2021, 47(11):11-21, 28. ZHU Y Q, ZHAO P, ZHAO F F, et al.Survey on abstractive text summarization technologies based on deep learning[J].Computer Engineering, 2021, 47(11):11-21, 28.(in Chinese) [4] CHO K, VAN MERRIENBOER B, GULCEHRE C, et al.Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]//Proceedings of Conference on Empirical Methods in Natural Language Processing.Stroudsburg, USA:Association for Computational Linguistics, 2014:1-10. [5] 王红玲, 周国栋, 朱巧明.面向冗余度控制的中文多文档自动文摘[J].中文信息学报, 2012, 26(2):92-96. WANG H L, ZHOU G D, ZHU Q M.Chinese multi-document summarization based on redundancy control[J].Journal of Chinese Information Processing, 2012, 26(2):92-96.(in Chinese) [6] PAGE L, BRIN S, MOTWANI R, et al.The PageRank citation ranking:bringing order to the web[C]//Proceedings of Technical Report, Stanford Digital Library Technologies.Stroudsburg, USA:Association for Computational Linguistics, 1998:1-10. [7] MIHALCEA R, TARAU P.TextRank:bringing order into texts[EB/OL].[2022-03-01].https://www.researchgate.net/profile/Paul-Tarau/publication/200042361_TextRank_Bringing_Order_into_Text/links/0912f508a98af2fe24000000/TextRank-Bringing-Order-into-Text.pdf. [8] CHENG J P, LAPATA M.Neural summarization by extracting sentences and words[EB/OL].[2022-03-01].https://arxiv.org/pdf/1603.07252.pdf. [9] LIU Y.Fine-tune BERT for extractive summarization[EB/OL].[2022-03-01].https://arxiv.org/pdf/1903.10318.pdf. [10] RUSH A M, CHOPRA S, WESTON J.A neural attention model for abstractive sentence summarization[C]//Proceedings of Conference on Empirical Methods in Natural Language Processing.Stroudsburg, USA:Association for Computational Linguistics, 2015:379-389. [11] GU J T, LU Z D, LI H, et al.Incorporating copying mechanism in sequence-to-sequence learning[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2016:1631-1640. [12] SEE A, LIU P J, MANNING C D.Get to the point:summarization with pointer-generator networks[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2017:1073-1083. [13] GULCEHRE C, AHN S, NALLAPATI R, et al.Pointing the unknown words[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2016:140-149. [14] ZENG W Y, LUO W J, FIDLER S, et al.Efficient summarization with read-again and copy mechanism[EB/OL].[2022-03-01].https://arxiv.org/pdf/1611. 03382.pdf. [15] COHAN A, DERNONCOURT F, KIM D S, et al.A discourse-aware attention model for abstractive summarization of long documents[EB/OL].[2022-03-01].https://arxiv.org/pdf/1804.05685.pdf. [16] DEVLIN J, CHANG M W, LEE K, et al.BERT:pre-training of deep bidirectional Transformers for language understanding[EB/OL].[2022-03-01].https://arxiv.org/pdf/1810.04805.pdf. [17] VASWANI A, SHAZEER N, PARMAR N, et al.Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems.New York, USA:ACM Press, 2017:5998-6008. [18] LIU Y, LAPATA M.Text summarization with pretrained encoders[EB/OL].[2022-03-01].https://arxiv.org/abs/1908. 08345. [19] DONG L, YANG N, WANG W H, et al.Unified language model pre-training for natural language understanding and generation[EB/OL].[2022-03-01].https://arxiv.org/abs/1905.03197. [20] LEWIS M, LIU Y H, GOYAL N, et al.BART:denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2020:1-10. [21] 吴仁守, 张宜飞, 王红玲, 等.基于层次结构的生成式自动文摘[J].中文信息学报, 2019, 33(10):90-98. WU R S, ZHANG Y F, WANG H L, et al.Abstractive summarization based on hierarchical structure[J].Journal of Chinese Information Processing, 2019, 33(10):90-98.(in Chinese) [22] 张迎, 王中卿, 王红玲.基于篇章主次关系的单文档抽取式摘要方法研究[J].中文信息学报, 2019, 33(8):67-76. ZHANG Y, WANG Z Q, WANG H L.Single document extractive summarization with satellite and nuclear relations[J].Journal of Chinese Information Processing, 2019, 33(8):67-76.(in Chinese) [23] WEI W J, WANG H L, WANG Z Q.Abstractive summarization via discourse relation and graph convolutional networks[C]//Proceedings of International Conference on Natural Language Processing and Chinese Computing.Berlin, Germany:Springer, 2020:331-342. [24] GOO C W, CHEN Y N.Abstractive dialogue summarization with sentence-gated modeling optimized by dialogue acts[C]//Proceedings of Spoken Language Technology Workshop.Washington D.C., USA:IEEE Press, 2019:735-742. [25] LI M L, ZHANG L Y, JI H, et al.Keep meeting summaries on topic:abstractive multi-modal meeting summarization[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2019:1-10. [26] ZHU C G, XU R C, ZENG M, et al.A hierarchical network for abstractive meeting summarization with cross-domain pretraining[C]//Proceedings of Findings of the Association for Computational Linguistics:EMNLP 2020.Stroudsburg, USA:Association for Computational Linguistics, 2020:1-10. [27] ZHAO L L, XU W R, GUO J.Improving abstractive dialogue summarization with graph structures and topic words[C]//Proceedings of the 28th International Conference on Computational Linguistics.Stroudsburg, USA:International Committee on Computational Linguistics, 2020:1-10. [28] CHEN J A, YANG D Y.Multi-view sequence-to-sequence models with conversational structure for abstractive dialogue summarization[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing.Stroudsburg, USA:Association for Computational Linguistics, 2020:1-10. [29] FENG X C, FENG X C, QIN B, et al.Dialogue discourse-aware graph model and data augmentation for meeting summarization[C]//Proceedings of the 13th International Conference on Artificial Intelligence.Berlin, Germany:[s.n.]:1-15. [30] FENG X C, FENG X C, QIN B.Incorporating commonsense knowledge into abstractive dialogue summarization via heterogeneous graph networks[C]//Proceedings of China National Conference on Chinese Computational Linguistics.Berlin, Germany:Springer, 2021:127-142. [31] FENG X C, FENG X C, QIN L B, et al.Language model as an annotator:exploring DialoGPT for dialogue summarization[EB/OL].[2022-03-01].https://arxiv.org/abs/2105.12544. [32] WU L F, CHEN Y, SHEN K, et al.Graph neural networks for natural language processing:a survey[EB/OL].[2022-03-01].https://arxiv.org/abs/2106.06090. [33] LIN C Y, HOVY E.Automatic evaluation of summaries using N-gram co-occurrence statistics[C]//Proceedings of Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology.Morristown, USA:Association for Computational Linguistics, 2003:150-157. |