[1] WANG L L, CAO Z, DE MELO G, et al. Relation classification via multi-level attention CNNs[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2016: 1298-1307.
[2] LIU X, LUO Z C, HUANG H Y. Jointly multiple events extraction via attention-based graph information aggregation[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1809.09078.
[3] KOLITSAS N, GANEA O E, HOFMANN T. End-to-end neural entity linking[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1808.07699.
[4] GERS F A, SCHMIDHUBER J, CUMMINS F. Learning to forget: continual prediction with LSTM[J]. Neural Computation, 2000, 12(10): 2451-2471.
[5] ZHANG Y, YANG J. Chinese NER using lattice LSTM[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1805.02023.
[6] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1706.03762.
[7] YAN H, DENG B C, LI X N, et al. TENER: adapting Transformer encoder for named entity recognition[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1911.04474.
[8] LECUN Y, BOSER B, DENKER J S, et al. Backpropagation applied to handwritten Zip code recognition[J]. Neural Computation, 1989, 1(4): 541-551.
[9] HUANG Z H, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1508.01991.
[10] LAFFERTY J, MCCALLUM A, PEREIRA F. Conditional random fields: probabilistic models for segmenting and labeling sequence data[C]//Proceedings of the 18th International Conference on Machine Learning. New York, USA: ACM Press, 2001: 282-289.
[11] PENG N Y, DREDZE M. Named entity recognition for Chinese social media with jointly trained embeddings[C]//Proceedings of 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2015: 548-554.
[12] MA X Z, HOVY E. End-to-end sequence labeling via bi-directional LSTM-CNNs-CRF[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1603.01354.
[13] STRUBELL E, VERGA P, BELANGER D, et al. Fast and accurate entity recognition with iterated dilated convolutions[C]//Proceedings of 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2017: 1-11.
[14] LIU Y J, ZHANG Y, CHE W X, et al. Domain adaptation for CRF-based Chinese word segmentation using free annotations[C]//Proceedings of 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2014: 864-874.
[15] CAO P F, CHEN Y B, LIU K, et al. Adversarial transfer learning for Chinese named entity recognition with self-attention mechanism[C]//Proceedings of 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2018: 182-192.
[16] GUI T, MA R T, ZHANG Q, et al. CNN-based Chinese NER with lexicon rethinking[C]//Proceedings of the 28th International Joint Conference on Artificial Intelligence. New York, USA: ACM Press, 2019: 4982-4988.
[17] MENG Y X, WU W, WANG F, et al. Glyce: glyph-vectors for Chinese character representations[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1901.10125.
[18] XUAN Z Y, BAO R, JIANG S Y. FGN: fusion glyph network for Chinese named entity recognition[M]. Berlin, Germany: Springer, 2021.
[19] ZHANG D, WANG M T, CHEN W L. Named entity recognition combining Wubi glyphs with contextualized character embeddings[J]. Computer Engineering, 2021, 47(3): 94-101. (in Chinese)
[20] SI Y C, GUAN Y Q. Chinese named entity recognition model based on Transformer encoder[J]. Computer Engineering, 2022, 48(7): 66-72. (in Chinese)
[21] XUE M G, YU B W, LIU T W, et al. Porous Lattice Transformer encoder for Chinese NER[C]//Proceedings of the 28th International Conference on Computational Linguistics. Stroudsburg, USA: International Committee on Computational Linguistics, 2020: 3831-3841.
[22] LI X N, YAN H, QIU X P, et al. FLAT: Chinese NER using flat-lattice Transformer[EB/OL]. [2021-07-11]. https://arxiv.org/abs/2004.11795.
[23] HE H F, SUN X. F-score driven max margin neural network for named entity recognition in Chinese social media[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1611.04234.
[24] ZHU Y Y, WANG G X, KARLSSON B F. CAN-NER: convolutional attention network for Chinese named entity recognition[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1904.02141.
[25] CUI Y M, CHE W X, LIU T, et al. Pre-training with whole word masking for Chinese BERT[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1906.08101.
[26] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional Transformers for language understanding[EB/OL]. [2021-07-11]. https://arxiv.org/abs/1810.04805.