[1] SHEN Y, HUANG X J. Attention-based convolutional neural network for semantic relation extraction[C]//Proceedings of the 26th International Conference on Computational Linguistics. Washington D.C., USA: IEEE Press, 2016: 2526-2536.
[2] QIN P D, XU W R, GUO J. Designing an adaptive attention mechanism for relation classification[C]//Proceedings of the International Joint Conference on Neural Networks. Anchorage, USA: IEEE Press, 2017: 4356-4362.
[3] XU Y, MOU L L, LI G, et al. Classifying relations via long short term memory networks along shortest dependency paths[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2015: 1785-1794.
[4] MIWA M, BANSAL M. End-to-end relation extraction using LSTMs on sequences and tree structures[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2016: 1105-1116.
[5] SUI D, CHEN Y, LIU K, et al. Joint entity and relation extraction with set prediction networks[EB/OL]. [2023-08-30]. https://arxiv.org/abs/2011.01675v1.
[6] BROWN T, MANN B, RYDER N, et al. Language models are few-shot learners[C]//Proceedings of NIPS'20. Cambridge, USA: MIT Press, 2020: 1877-1901.
[7] HUANG Q, SUN Y B, XING Z C, et al. API entity and relation joint extraction from text via dynamic prompt-tuned language model[EB/OL]. [2023-08-30]. https://arxiv.org/abs/2301.03987.
[8] WADHWA S, AMIR S, WALLACE B C. Revisiting relation extraction in the era of large language models[EB/OL]. [2023-08-30]. https://arxiv.org/abs/2305.05003.
[9] RIEDEL S, YAO L M, MCCALLUM A. Modeling relations and their mentions without labeled text[M]. Berlin, Germany: Springer, 2010.
[10] GARDENT C, SHIMORINA A, NARAYAN S, et al. Creating training corpora for NLG micro-planning[EB/OL]. [2023-08-30]. https://hal.science/hal-01623744v1.
[11] SUTSKEVER I, VINYALS O, LE Q V. Sequence to sequence learning with neural networks[C]//Proceedings of NIPS'14. Cambridge, USA: MIT Press, 2014: 3104-3112.
[12] ZHENG S C, WANG F, BAO H Y, et al. Joint extraction of entities and relations based on a novel tagging scheme[EB/OL]. [2023-08-30]. https://arxiv.org/abs/1706.05075v1.
[13] ZENG X R, ZENG D J, HE S Z, et al. Extracting relational facts by an end-to-end neural model with copy mechanism[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2018: 506-514.
[14] FEI H, REN Y F, JI D H. Boundaries and edges rethinking: an end-to-end neural model for overlapping entity relation extraction[J]. Information Processing & Management, 2020, 57(6): 102311.
[15] DUAN G D, MIAO J Y, HUANG T X, et al. A relational adaptive neural model for joint entity and relation extraction[J]. Frontiers in Neurorobotics, 2021, 15: 635492.
[16] LI C, TIAN Y. Downstream model design of pre-trained language model for relation extraction task[EB/OL]. [2023-08-30]. https://arxiv.org/pdf/2004.03786v1.pdf.
[17] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[EB/OL]. [2023-08-30]. https://arxiv.org/abs/1810.04805v2.
[18] LIU L P, WANG M L, HE X H, et al. Extracting relational facts based on hybrid syntax-guided transformer and pointer network[J]. Journal of Intelligent & Fuzzy Systems, 2021, 40(6): 12167-12183.
[19] YE H B, ZHANG N Y, DENG S M, et al. Contrastive triple extraction with generative transformer[C]//Proceedings of the AAAI Conference on Artificial Intelligence. [S.l.]: AAAI Press, 2021: 14257-14265.
[20] HANG T T, FENG J, WU Y R, et al. Joint extraction of entities and overlapping relations using source-target entity labeling[J]. Expert Systems with Applications, 2021, 177: 114853.
[21] TOUVRON H, MARTIN L, STONE K, et al. Llama 2: open foundation and fine-tuned chat models[EB/OL]. [2023-08-30]. https://arxiv.org/abs/2307.09288.
[22] MANNING C, SURDEANU M, BAUER J, et al. The Stanford CoreNLP natural language processing toolkit[C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2014: 55-60.
[23] ZHANG L T, ZHANG L, SHI S, et al. LoRA-FA: memory-efficient low-rank adaptation for large language models fine-tuning[EB/OL]. [2023-08-30]. https://arxiv.org/abs/2308.03303.
[24] RAFFEL C, SHAZEER N M, ROBERTS A, et al. Exploring the limits of transfer learning with a unified text-to-text transformer[J]. The Journal of Machine Learning Research, 2020, 21(1): 5485-5551.
[25] CONNEAU A, KHANDELWAL K, GOYAL N, et al. Unsupervised cross-lingual representation learning at scale[EB/OL]. [2023-08-30]. https://arxiv.org/abs/1911.02116v2.
[26] LAN Z Z, CHEN M D, GOODMAN S, et al. ALBERT: a lite BERT for self-supervised learning of language representations[EB/OL]. [2023-08-30]. https://arxiv.org/abs/1909.11942v6.
[27] BIRD S. NLTK-Lite: efficient scripting for natural language processing[C]//Proceedings of the 4th International Conference on Natural Language Processing. [S.l.]: Allied Publishers Private Limited, 2005: 11-18.
[28] VASILIEV Y. Natural language processing with Python and spaCy: a practical introduction[M]. San Francisco, USA: No Starch Press, 2020.