[1] MIWA M,BANSAL M.End-to-end relation extraction using LSTMs on sequences and tree structures[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2016:1105-1116.
[2] KOO T,COLLINS M.Efficient third-order dependency parsers[C]//Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2010:1-11.
[3] DIEFENBACH D,LOPEZ V,SINGH K,et al.Core techniques of question answering systems over knowledge bases:a survey[J].Knowledge and Information Systems,2018,55(3):529-569.
[4] BORTHWICK A.A maximum entropy approach to named entity[EB/OL].[2020-09-11].https://www.researchgate.net/publication/2937260_A_Maximum_Entropy_Approach_To_Named_Entity.
[5] BIKEL D M,SCHWARTZ R,WEISCHEDEL R M.An algorithm that learns what's in a name[J].Machine Learning,1999,34(1/2/3):211-231.
[6] BIKEL D M,MILLER S,SCHWARTZ R,et al.Nymble:a high-performance learning name-finder[C]//Proceedings of the 5th Conference on Applied Natural Language Processing.Philadelphia,USA:Association for Computational Linguistics,1997:194-201.
[7] MCCALLUM A,LI W.Early results for named entity recognition with conditional random fields,feature induction and Web-enhanced lexicons[C]//Proceedings of the 7th Conference on Natural Language Learning.Philadelphia,USA:Association for Computational Linguistics,2003:188-191.
[8] 温秀秀,马超,高原原,等.基于标签聚类的中文重叠命名实体识别方法[J].计算机工程,2020,46(5):41-46.
WEN X X,MA C,GAO Y Y,et al.Chinese overlapping named entity recognition method based on label clustering[J].Computer Engineering,2020,46(5):41-46.(in Chinese)
[9] 张若彬,刘嘉勇,何祥.基于BLSTM-CRF模型的安全漏洞领域命名实体识别[J].四川大学学报(自然科学版),2019,56(3):469-475.
ZHANG R B,LIU J Y,HE X.Named entity recognition for vulnerabilities based on BLSTM-CRF model[J].Journal of Sichuan University (Natural Science Edition),2019,56(3):469-475.(in Chinese)
[10] ZHANG Y,YANG J.Chinese NER using Lattice LSTM[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2018:1554-1564.
[11] SUI D B,CHEN Y B,LIU K,et al.Leverage lexical knowledge for Chinese named entity recognition via collaborative graph network[C]//Proceedings of 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing.Philadelphia,USA:Association for Computational Linguistics,2019:3821-3831.
[12] GUI T,ZOU Y C,ZHANG Q,et al.A lexicon-based graph neural network for Chinese NER[C]//Proceedings of 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing.Philadelphia,USA:Association for Computational Linguistics,2019:1039-1049.
[13] DING R X,XIE P J,ZHANG X Y,et al.A neural multi-digraph model for Chinese NER with gazetteers[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2019:1462-1467.
[14] MA R T,PENG M L,ZHANG Q,et al.Simplify the usage of Lexicon in Chinese NER[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2020:5951-5960.
[15] GUI T,MA R T,ZHANG Q,et al.CNN-based Chinese NER with lexicon rethinking[EB/OL].[2020-09-11].https://www.researchgate.net/publication/334844205_CNN-Based_Chinese_NER_with_Lexicon_Rethinking.
[16] LI X N,YAN H,QIU X P,et al.FLAT:Chinese NER using flat-lattice transformer[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2020:1-8.
[17] XUE M G,YU B W,LIU T W,et al.Porous lattice transformer encoder for Chinese NER[C]//Proceedings of the 28th International Conference on Computational Linguistics.Stroudsburg,USA:International Committee on Computational Linguistics,2020:3831-3841.
[18] MIKOLOV T,CHEN K,CORRADO G,et al.Efficient estimation of word representations in vector space[EB/OL].[2020-09-11].https://arxiv.org/abs/1301.3781v1.
[19] YAN H,DENG B C,LI X N,et al.TENER:adapting transformer encoder for named entity recognition[EB/OL].[2020-09-11].https://arxiv.org/abs/1911.04474.
[20] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[C]//Proceedings of 2017 International Conference on Neural Information Processing Systems.Cambridge,USA:MIT Press,2017:5998-6008.
[21] LEVOW G.The third international Chinese language processing bakeoff:word segmentation and named entity recognition[C]//Proceedings of the 5th SIGHAN Workshop on Chinese Language Processing.Washington D.C.,USA:IEEE Press,2006:108-117.
[22] HE H F,SUN X.F-score driven max margin neural network for named entity recognition in Chinese social media[C]//Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2017:713-718.
[23] PENG N Y,DREDZE M.Named entity recognition for Chinese social media with jointly trained embeddings[C]//Proceedings of 2015 Conference on Empirical Methods in Natural Language Processing.Philadelphia,USA:Association for Computational Linguistics,2015:548-554.
[24] WEISCHEDEL R,PALMER M,MARCUS M,et al.Ontonotes release 5.0[EB/OL].[2020-09-11].https://catalog.ldc.upenn.edu/LDC2013T19.
[25] YU S W,DUAN H M,WU Y F.Corpus of multi-level processing for modern Chinese[EB/OL].[2020-09-11].https://opendata.pku.edu.cn/dataset.xhtml?persistentId=doi:10.18170/DVN/SEYRX5.
[26] MIKOLOV T,SUTSKEVER I,CHEN K,et al.Distributed representations of words and phrases and their compositionality[C]//Proceedings of 2013 International Conference on Advances in Neural Information Processing Systems.Cambridge,USA:MIT Press,2013:3111-3119.
[27] HAN Z D.Dyna:a method of momentum for stochastic optimization[EB/OL].[2020-09-11].https://arxiv.org/abs/1805.04933.
[28] SRIVASTAVA N,HINTON G,KRIZHEVSKY A,et al.Dropout:a simple way to prevent neural networks from overfitting[J].The Journal of Machine Learning Research,2014,15(1):1929-1958.