[1] MIWA M, BANSAL M. End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures[C]//: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016:1105-1116.
[2] KOO T, COLLINS M. Efficient third-order dependency parsers[C]//: Proceedings of the 48th Annual Meeting of the Association
for Computational Linguistics. 2010:1-11.
[3] DIEFENBACH D, LOPEZ V, SINGH K, et al. Core techniques of question answering systems over knowledge bases: a
survey[J]. Knowledge and Information Systems, 2018,55(3):529-569.
[4] BORTHWICK A. A maximum entropy approach to named entity recognition[Z]. Citeseer, 1999.
[5] BIKEL D M, SCHWARTZ R, WEISCHEDEL R M. An Algorithm that Learns What's in a Name[J]. Machine Learning,
1999,34(1):211-231. DOI:10.1023/A:1007558221122.
[6] BIKEL D M, MILLER S, SCHWARTZ R, et al. Nymble: a High-Performance Learning Name-finder[C]//: Fifth Conference on
Applied Natural Language Processing. 1997:194-201.
[7] MCCALLUM A, LI Wei. Early results for Named Entity Recognition with Conditional Random Fields, Feature Induction and
Web-Enhanced Lexicons[C]//: Proceedings of the Seventh Conference on Natural Language Learning at HLT-NAACL 2003.
2003:188-191.
[8] LI Xiaoya, MENG Yuxian, SUN Xiaofei, et al. Is Word Segmentation Necessary for Deep Learning of Chinese
Representations?[C]//: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.
2019:3242-3252.
[9] 张若彬, 刘嘉勇, 何祥. 基于BLSTM-CRF模型的安全漏洞领域命名实体识别[J]. 四川大学学报(自然科学版),
2019,56(3):469-475.
ZHANG Ruobin, LIU Jiayong, HE Xiang. Named entity recognition for vulnerabilities based on BLSTM-CRF model[J]. Journal
of Sichuan University (Natural Science Edition), 2019,56(3):469-475.
[10] ZHANG Yue, YANG Jie. Chinese NER Using Lattice LSTM[C]//: Proceedings of the 56th Annual Meeting of the Association
for Computational Linguistics. 2018:1554-1564.
[11] SUI Dianbo, CHEN Yubo, LIU Kang, et al. Leverage lexical knowledge for Chinese named entity recognition via collaborative
graph network[C]//: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th
International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). 2019:3821-3831.
[12] GUI Tao, ZOU Yicheng, ZHANG Qi, et al. A lexicon-based graph neural network for Chinese NER[C]//: Proceedings of the 2019
Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural
Language Processing (EMNLP-IJCNLP). 2019:1039-1049.
[13] DING Ruixue, XIE Pengjun, ZHANG Xiaoyan, et al. A neural multi-digraph model for Chinese NER with gazetteers[C]//:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019:1462-1467.
[14] MA Ruotian, PENG Minlong, ZHANG Qi, et al. Simplify the usage of lexicon in Chinese NER[C]//: Proceedings of the 58th
Annual Meeting of the Association for Computational Linguistics. 2020:5951-5960.
[15] GUI Tao, MA Ruotian, ZHANG Qi, et al. CNN-Based Chinese NER with Lexicon Rethinking[C]//: Twenty-Eighth International
Joint Conference on Artificial Intelligence (IJCAI-19). 2019.
[16] LI Xiaonan, YAN Hang, QIU Xipeng, et al. FLAT: Chinese NER Using Flat-Lattice Transformer[C]//: Proceedings of the 58th
Annual Meeting of the Association for Computational Linguistics. 2020.
[17] XUE Mengge, YU Bowen, LIU Tingwen, et al. Porous Lattice Transformer Encoder for Chinese NER[C]//: Proceedings of the
28th International Conference on Computational Linguistics. 2020:3831-3841.
[18] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[J]. arXiv preprint
arXiv:1301.3781, 2013.
[19] YAN Hang, DENG Bocao, LI Xiaonan, et al. TENER: Adapting Transformer Encoder for Named Entity Recognition[J]. arXiv
preprint arXiv:1911.04474, 2019.
[20] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//: Advances in neural information processing
systems. 2017:5998-6008.
[21] LEVOW G. The third international Chinese language processing bakeoff: Word segmentation and named entity recognition[C]//:
Proceedings of the Fifth SIGHAN Workshop on Chinese Language Processing. 2006:108-117.
[22] HE Hangfeng, SUN Xu. F-Score Driven Max Margin Neural Network for Named Entity Recognition in Chinese Social
Media[C]//: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics.
2017:713-718.
[23] PENG Nanyun, DREDZE M. Named entity recognition for Chinese social media with jointly trained embeddings[C]//:
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015:548-554.
[24] WEISCHEDEL R, PALMER M, MARCUS M, et al. OntoNotes Release 5.0 LDC2013T19[Z]. Linguistic Data Consortium, 2013.
[25] YU Shiwen, DUAN Huiming, WU Yunfang. Corpus of Multi-level Processing for Modern Chinese[Z]. V1 ed. Peking University
Open Research Data Platform, 2018.
[26] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their
compositionality[C]//: Advances in neural information processing systems. 2013:3111-3119.
[27] KINGMA D P, BA J. Adam: A Method for Stochastic Optimization[J]. arXiv preprint arXiv:1412.6980, 2014.
[28] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J].
The journal of machine learning research, 2014,15(1):1929-1958.