[1] 邓依依, 邬昌兴, 魏永丰, 等. 基于深度学习的命名实体识别综述[J]. 中文信息学报, 2021, 35(9): 30-45. DENG Y Y, WU C X, WEI Y F, et al. A survey on named entity recognition based on deep learning[J]. Journal of Chinese Information Processing, 2021, 35(9): 30-45. (in Chinese)
[2] LI J, SUN A X, HAN J L, et al. A survey on deep learning for named entity recognition[J]. IEEE Transactions on Knowledge and Data Engineering, 2022, 34(1): 50-70.
[3] 宋旭晖, 于洪涛, 李邵梅. 基于图注意力网络字词融合的中文命名实体识别[J]. 计算机工程, 2022, 48(10): 298-305. SONG X H, YU H T, LI S M. Chinese named entity recognition based on word fusion of graph attention network[J]. Computer Engineering, 2022, 48(10): 298-305. (in Chinese)
[4] 杨飘, 董文永. 基于BERT嵌入的中文命名实体识别方法[J]. 计算机工程, 2020, 46(4): 40-45, 52. YANG P, DONG W Y. Chinese named entity recognition method based on BERT embedding[J]. Computer Engineering, 2020, 46(4): 40-45, 52. (in Chinese)
[5] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional Transformers for language understanding[EB/OL]. [2022-03-08]. https://arxiv.org/pdf/1810.04805.pdf.
[6] 张云秋, 汪洋, 李博诚. 基于RoBERTa-wwm动态融合模型的中文电子病历命名实体识别[J]. 数据分析与知识发现, 2022, 6(S1): 242-250. ZHANG Y Q, WANG Y, LI B C. Identifying named entities of Chinese electronic medical records based on RoBERTa-wwm dynamic fusion model[J]. Data Analysis and Knowledge Discovery, 2022, 6(S1): 242-250. (in Chinese)
[7] 胡新棒, 于溆乔, 李邵梅, 等. 基于知识增强的中文命名实体识别[J]. 计算机工程, 2021, 47(11): 84-92. HU X B, YU X Q, LI S M, et al. Chinese named entity recognition based on knowledge enhancement[J]. Computer Engineering, 2021, 47(11): 84-92. (in Chinese)
[8] 史占堂, 马玉鹏, 赵凡, 等. 基于CNN-Head Transformer编码器的中文命名实体识别[J]. 计算机工程, 2022, 48(10): 73-80. SHI Z T, MA Y P, ZHAO F, et al. Chinese named entity recognition based on CNN-Head Transformer encoder[J]. Computer Engineering, 2022, 48(10): 73-80. (in Chinese)
[9] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, USA: Curran Associates Inc., 2017: 6000-6010.
[10] SU J S, TAN Z X, XIONG D Y, et al. Lattice-based recurrent neural network encoders for neural machine translation[C]//Proceedings of the 31st AAAI Conference on Artificial Intelligence. Palo Alto, USA: AAAI Press, 2017: 3302-3308.
[11] REI M, CRICHTON G K O, PYYSALO S. Attending to characters in neural sequence labeling models[EB/OL]. [2022-03-08]. https://arxiv.org/pdf/1611.04361.pdf.
[12] CUI L Y, ZHANG Y. Hierarchically-refined label attention network for sequence labeling[EB/OL]. [2022-03-08]. https://arxiv.org/abs/1908.08676v1.
[13] COLLOBERT R, WESTON J, BOTTOU L, et al. Natural language processing (almost) from scratch[EB/OL]. [2022-03-08]. https://arxiv.org/pdf/1103.0398v1.pdf.
[14] HUANG Z H, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[EB/OL]. [2022-03-08]. https://arxiv.org/pdf/1508.01991.pdf.
[15] STRUBELL E, VERGA P, BELANGER D, et al. Fast and accurate entity recognition with iterated dilated convolutions[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2017: 2670-2680.
[16] YAN H, DENG B C, LI X N, et al. TENER: adapting Transformer encoder for named entity recognition[EB/OL]. [2022-03-08]. https://arxiv.org/abs/1911.04474v3.
[17] GUI T, ZOU Y, ZHANG Q, et al. A lexicon-based graph neural network for Chinese NER[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2019: 1040-1050.
[18] SUI D B, CHEN Y B, LIU K, et al. Leverage lexical knowledge for Chinese named entity recognition via collaborative graph network[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2019: 3828-3838.
[19] ZHANG Y, YANG J. Chinese NER using lattice LSTM[EB/OL]. [2022-03-08]. https://arxiv.org/pdf/1805.02023.pdf.
[20] LI X N, YAN H, QIU X P, et al. FLAT: Chinese NER using flat-lattice Transformer[EB/OL]. [2022-03-08]. https://arxiv.org/abs/2004.11795.
[21] PENG M L, MA R, ZHANG Q, et al. Simplify the usage of lexicon in Chinese NER[EB/OL]. [2022-03-08]. https://arxiv.org/abs/1908.05969v1.
[22] WU S, SONG X N, FENG Z H. MECT: multi-metadata embedding based cross-Transformer for Chinese named entity recognition[EB/OL]. [2022-03-08]. https://arxiv.org/abs/2107.05418v1.
[23] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]//Proceedings of the 26th International Conference on Neural Information Processing Systems. Red Hook, USA: Curran Associates Inc., 2013: 3111-3119.
[24] LIU Y H, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach[EB/OL]. [2022-03-08]. https://arxiv.org/abs/1907.11692v1.
[25] CUI Y M, CHE W X, LIU T, et al. Pre-training with whole word masking for Chinese BERT[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 3504-3514.
[26] ZHU Y Y, WANG G X, KARLSSON B F. CAN-NER: convolutional attention network for Chinese named entity recognition[EB/OL]. [2022-03-08]. https://arxiv.org/pdf/1904.02141.pdf.