[1] MORAVVEJ S V, MALEKI KAHAKI M J, SALIMI SARTAKHTI M, et al. Efficient GAN-based method for extractive summarization. Journal of Electrical and Computer Engineering Innovations, 2021, 10(2): 287-298.
[2]
[3] 朱永清, 赵鹏, 赵菲菲, 等. 基于深度学习的生成式文本摘要技术综述. 计算机工程, 2021, 47(11): 11-21, 28.
ZHU Y Q, ZHAO P, ZHAO F F, et al. Survey on abstractive text summarization technologies based on deep learning. Computer Engineering, 2021, 47(11): 11-21, 28.
[4] SACKS H, SCHEGLOFF E A, JEFFERSON G. A simplest systematics for the organization of turn taking for conversation[M]//SCHENKEIN J. Studies in the organization of conversational interaction. New York, USA: Academic Press, 1978: 7-55.
[5] GLIWA B, MOCHOL I, BIESEK M, et al. SAMSum corpus: a human-annotated dialogue dataset for abstractive summarization[C]//Proceedings of the 2nd Workshop on New Frontiers in Summarization. Stroudsburg, USA: Association for Computational Linguistics, 2019: 70-79.
[6] ZHANG J T, XU Q. Attention-aware heterogeneous graph neural network. Big Data Mining and Analytics, 2021, 4(4): 233-241. doi: 10.26599/BDMA.2021.9020008
[7] RUSH A M, CHOPRA S, WESTON J. A neural attention model for abstractive sentence summarization[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2015: 379-389.
[8] SEE A, LIU P J, MANNING C D. Get to the point: summarization with pointer-generator networks[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2017: 1-10.
[9] CHEN Y C, BANSAL M. Fast abstractive summarization with reinforce-selected sentence rewriting[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2018: 675-686.
[10] 郭雨欣, 陈秀宏. 融合BERT词嵌入表示和主题信息增强的自动摘要模型. 计算机科学, 2022, 49(6): 313-318.
GUO Y X, CHEN X H. Automatic summarization model combining BERT word embedding representation and topic information enhancement. Computer Science, 2022, 49(6): 313-318.
[11]
[12] 金雨澄, 王清钦, 高剑, 等. 基于图深度学习的金融文本多标签分类算法. 计算机工程, 2022, 48(4): 16-21.
JIN Y C, WANG Q Q, GAO J, et al. Multi-label financial text classification algorithm based on graph deep learning. Computer Engineering, 2022, 48(4): 16-21.
[13] SONG L F, ZHANG Y E, WANG Z G, et al. A graph-to-sequence model for AMR-to-text generation[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2018: 1616-1626.
[14] FAN A, GARDENT C, BRAUD C, et al. Using local knowledge graph construction to scale Seq2Seq models to multi-document inputs[EB/OL]. [2022-10-05]. https://arxiv.org/abs/1910.08435v1.
[15] HUANG L Y, WU L F, WANG L. Knowledge graph-augmented abstractive summarization with semantic-driven cloze reward[EB/OL]. [2022-10-05]. https://arxiv.org/abs/2005.01159.
[16] DONG Y E, WANG S H, GAN Z, et al. Multi-fact correction in abstractive text summarization[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2020: 9320-9331.
[17] ZHONG M, LIU P F, WANG D Q, et al. Searching for effective neural extractive summarization: what works and what's next[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2019: 1049-1058.
[18] SHANG G K, DING W, ZHANG Z K, et al. Unsupervised abstractive meeting summarization with multi-sentence compression and budgeted submodular maximization[EB/OL]. [2022-10-05]. https://arxiv.org/pdf/1805.05271.pdf.
[19] GOO C W, CHEN Y N. Abstractive dialogue summarization with sentence-gated modeling optimized by dialogue acts[C]//Proceedings of the IEEE Spoken Language Technology Workshop. Washington D.C., USA: IEEE Press, 2019: 735-742.
[20] LEI Y J, YAN Y M, ZENG Z Y, et al. Hierarchical speaker-aware sequence-to-sequence model for dialogue summarization[C]//Proceedings of the International Conference on Acoustics, Speech and Signal Processing. Washington D.C., USA: IEEE Press, 2021: 7823-7827.
[21] LI M L, ZHANG L Y, JI H, et al. Keep meeting summaries on topic: abstractive multi-modal meeting summarization[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2019: 2190-2196.
[22] CHEN J A, YANG D Y. Multi-view sequence-to-sequence models with conversational structure for abstractive dialogue summarization[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2020: 4106-4118.
[23] LIU C Y, WANG P, XU J, et al. Automatic dialogue summary generation for customer service[C]//Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. New York, USA: ACM Press, 2019: 1957-1965.
[24] HU Z N, DONG Y X, WANG K S, et al. Heterogeneous graph transformer[C]//Proceedings of the International Conference on World Wide Web. New York, USA: ACM Press, 2020: 2704-2710.
[25] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2017: 5998-6008.
[26]