[1] HUANG D D, CUI L Y, YANG S, et al. What have we achieved on text summarization?[C]//Proceedings of 2020 Conference on Empirical Methods in Natural Language Processing. Washington D. C., USA: Association for Computational Linguistics, 2020: 446-469.
[2] LI J P, ZHANG C, CHEN X J, et al. Survey on automatic text summarization. Journal of Computer Research and Development, 2021, 58(1): 1-21. doi: 10.7544/issn1000-1239.2021.20190785
[3] EL-KASSAS W S, SALAMA C R, RAFEA A A, et al. Automatic text summarization: a comprehensive survey. Expert Systems with Applications, 2021, 165: 113679. doi: 10.1016/j.eswa.2020.113679
[4] SUTSKEVER I, VINYALS O, LE Q V. Sequence to sequence learning with neural networks[C]//Proceedings of the 27th International Conference on Neural Information Processing Systems. Cambridge, USA: MIT Press, 2014: 3104-3112.
[5] NALLAPATI R, ZHOU B W, DOS SANTOS C, et al. Abstractive text summarization using sequence-to-sequence RNNs and beyond[C]//Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning. Berlin, Germany: Association for Computational Linguistics, 2016: 280-290.
[6] LIN C Y. ROUGE: a package for automatic evaluation of summaries[C]//Proceedings of the ACL'04 Workshop on Text Summarization Branches Out. Barcelona, Spain: Association for Computational Linguistics, 2004: 74-81.
[7] VINYALS O, FORTUNATO M, JAITLY N. Pointer networks[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. Red Hook, USA: Curran Associates, 2015: 2692-2700.
[8] GU J T, LU Z D, LI H, et al. Incorporating copying mechanism in sequence-to-sequence learning[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, Germany: Association for Computational Linguistics, 2016: 1631-1640.
[9] SEE A, LIU P J, MANNING C D. Get to the point: summarization with pointer-generator networks[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Washington D. C., USA: Association for Computational Linguistics, 2017: 1073-1083.
[10] TU Z P, LU Z D, LIU Y, et al. Modeling coverage for neural machine translation[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, Germany: Association for Computational Linguistics, 2016: 76-85.
[11] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, USA: Curran Associates, 2017: 6000-6010.
[12] LIU Y, LAPATA M. Text summarization with pretrained encoders[C]//Proceedings of 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Washington D. C., USA: Association for Computational Linguistics, 2019: 3730-3740.
[13]
[14] LEWIS M, LIU Y H, GOYAL N, et al. BART: denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Washington D. C., USA: Association for Computational Linguistics, 2020: 7871-7880.
[15]
[16] XIAO D L, ZHANG H, LI Y K, et al. ERNIE-GEN: an enhanced multi-flow pre-training and fine-tuning framework for natural language generation[C]//Proceedings of the 29th International Joint Conference on Artificial Intelligence. Yokohama, Japan: [s. n.], 2020: 357-369.
[17]
[18] WU R S, WANG H L, WANG Z Q, et al. Short text summary generation with global self-matching mechanism. Journal of Software, 2019, 30(9): 2705-2717. doi: 10.13328/j.cnki.jos.005850
[19] LIN J Y, SUN X, MA S M, et al. Global encoding for abstractive summarization[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Washington D. C., USA: Association for Computational Linguistics, 2018: 163-169.
[20] DENG W B, LI Y B, ZHANG Y M, et al. A generative text summarization method combining BERT and convolution gating. Control and Decision, 2023, 38(1): 152-160.
[21] ZHAO M H, ZHONG S S, FU X Y, et al. Deep residual shrinkage networks for fault diagnosis. IEEE Transactions on Industrial Informatics, 2020, 16(7): 4681-4690. doi: 10.1109/TII.2019.2943898
[22] HU J, SHEN L, SUN G. Squeeze-and-excitation networks[C]//Proceedings of 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D. C., USA: IEEE Press, 2018: 7132-7141.
[23] TIAN K K, ZHOU R Y, DONG H Y, et al. An abstractive summarization method based on encoder-sharing and gated network. Acta Scientiarum Naturalium Universitatis Pekinensis, 2020, 56(1): 61-67.
[24] HE R N, RAVULA A, KANAGAL B, et al. RealFormer: transformer likes residual attention[C]//Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Washington D. C., USA: Association for Computational Linguistics, 2021: 929-943.
[25] HU B T, CHEN Q C, ZHU F Z. LCSTS: a large scale Chinese short text summarization dataset[C]//Proceedings of 2015 Conference on Empirical Methods in Natural Language Processing. Washington D. C., USA: Association for Computational Linguistics, 2015: 1967-1972.