[1] LI J,SUN A X,HAN J L,et al.A survey on deep learning for named entity recognition[J].IEEE Transactions on Knowledge and Data Engineering,2022,34(1):50-70.
[2] DAI Z H,YANG Z L,YANG Y M,et al.Transformer-XL:attentive language models beyond a fixed-length context[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Stroudsburg,USA:Association for Computational Linguistics,2019:2978-2989.
[3] LIN Y,LIU L Y,JI H,et al.Reliability-aware dynamic feature composition for name tagging[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Stroudsburg,USA:Association for Computational Linguistics,2019:165-174.
[4] RONRAN C,LEE S.Effect of character and word features in bidirectional LSTM-CRF for NER[C]//Proceedings of IEEE International Conference on Big Data and Smart Computing.Washington D.C.,USA:IEEE Press,2020:613-616.
[5] JIE Z M,LU W.Dependency-guided LSTM-CRF for named entity recognition[EB/OL].[2022-04-01].https://arxiv.org/abs/1909.10148.
[6] YAN C,SU Q,WANG J.MoGCN:mixture of gated convolutional neural network for named entity recognition of Chinese historical texts[J].IEEE Access,2020,8:181629-181639.
[7] LEE L H,LU Y.Multiple embeddings enhanced multi-graph neural networks for Chinese healthcare named entity recognition[J].IEEE Journal of Biomedical and Health Informatics,2021,25(7):2801-2810.
[8] ZHAI F F,POTDAR S,XIANG B,et al.Neural models for sequence chunking[J].Proceedings of the AAAI Conference on Artificial Intelligence,2017,31(1):3365-3371.
[9] SABOUR S,FROSST N,HINTON G E.Dynamic routing between capsules[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems.New York,USA:ACM Press,2017:3859-3869.
[10] SHEN Y Y,YUN H,LIPTON Z,et al.Deep active learning for named entity recognition[C]//Proceedings of the 2nd Workshop on Representation Learning for NLP.Stroudsburg,USA:Association for Computational Linguistics,2017:252-256.
[11] BROWN T B,MANN B,RYDER N,et al.Language models are few-shot learners[C]//Proceedings of the 34th International Conference on Neural Information Processing Systems.New York,USA:ACM Press,2020:1877-1901.
[12] ZHANG D,WEI S Z,LI S S,et al.Multi-modal graph fusion for named entity recognition with targeted visual guidance[C]//Proceedings of the AAAI Conference on Artificial Intelligence.Palo Alto,USA:AAAI,2021:14347-14355.
[13] QIU J H,WANG Q,ZHOU Y M,et al.Fast and accurate recognition of Chinese clinical named entities with residual dilated convolutions[C]//Proceedings of IEEE International Conference on Bioinformatics and Biomedicine.Washington D.C.,USA:IEEE Press,2019:935-942.
[14] 晏阳天,赵新宇,吴贤.基于BERT与字形字音特征的医疗命名实体识别[EB/OL].[2022-03-01].https://bj.bcebos.com/v1/conference/ccks2020/eval_paper/ccks2020_eval_paper_3_1_2.pdf. YAN Y T,ZHAO X Y,WU X.Medical named entity recognition based on BERT and glyph and phonetic features[EB/OL].[2022-03-01].https://bj.bcebos.com/v1/conference/ccks2020/eval_paper/ccks2020_eval_paper_3_1_2.pdf.(in Chinese)
[15] 罗凌,杨志豪,宋雅文,等.基于笔画ELMo和多任务学习的中文电子病历命名实体识别研究[J].计算机学报,2020,43(10):1943-1957. LUO L,YANG Z H,SONG Y W,et al.Chinese clinical named entity recognition based on stroke ELMo and multi-task learning[J].Chinese Journal of Computers,2020,43(10):1943-1957.(in Chinese)
[16] ZHOU B H,CAI X R,ZHANG Y,et al.MTAAL:multi-task adversarial active learning for medical named entity recognition and normalization[C]//Proceedings of the AAAI Conference on Artificial Intelligence.Palo Alto,USA:AAAI,2021:14586-14593.
[17] SONG Y W,YANG Z H,LUO L,et al.Biomedical mutation entity recognition method based on character convolution neural network[J].Journal of Chinese Information Processing,2021,35(5):63-69.
[18] LI X N,YAN H,QIU X P,et al.FLAT:Chinese NER using flat-lattice transformer[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics.Stroudsburg,USA:Association for Computational Linguistics,2020:1-10.
[19] DEVLIN J,CHANG M W,LEE K,et al.BERT:pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of NAACL-HLT.Minneapolis,USA:[s.n.],2019:4171-4186.
[20] CUI Y M,CHE W X,LIU T,et al.Pre-training with whole word masking for Chinese BERT[J].IEEE/ACM Transactions on Audio,Speech,and Language Processing,2021,29:3504-3514.
[21] CUI Y,CHE W,LIU T,et al.Revisiting pre-trained models for Chinese natural language processing[C]//Proceedings of EMNLP 2020.[S.l.]:SIGDAT,2020:657-668.
[22] LAN Z Z,CHEN M D,GOODMAN S,et al.ALBERT:a lite BERT for self-supervised learning of language representations[EB/OL].[2022-04-01].https://arxiv.org/abs/1909.11942.
[23] LI N,LUO L,DING Z,et al.DUTIR at the CCKS-2019 Task 1:improving Chinese clinical named entity recognition using stroke ELMo and transfer learning[C]//Proceedings of CCKS 2019.Hangzhou,China:Springer,2019:24-27.
[24] LIU M,ZHOU X,CAO Z,et al.Team MSIIP at CCKS 2019 Task 1[C]//Proceedings of CCKS 2019.Hangzhou,China:Springer,2019:1-11.
[25] 杨文明,毕金良,邹佳丽,等.基于ChiEHRBert与多模型融合的医疗命名实体识别[EB/OL].[2022-03-01].https://bj.bcebos.com/v1/conference/ccks2020/eval_paper/ccks2020_eval_paper_3_1_3.pdf. YANG W M,BI J L,ZOU J L,et al.Medical named entity recognition based on ChiEHRBert and multi-model fusion[EB/OL].[2022-03-01].https://bj.bcebos.com/v1/conference/ccks2020/eval_paper/ccks2020_eval_paper_3_1_3.pdf.(in Chinese)