[1] WU Shanchan,HE Yifan.Enriching pre-trained language model with entity information for relation classification[C]//Proceedings of the 28th ACM International Conference on Information and Knowledge Management.New York,USA:ACM Press,2019:2361-2364.
[2] ISOZAKI H,KAZAWA H.Efficient support vector classifiers for named entity recognition[C]//Proceedings of the 19th International Conference on Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2002:1-7.
[3] BIKEL D M,MILLER S,SCHWARTZ R,et al.Nymble:a high-performance learning name-finder[C]//Proceedings of the 5th Conference on Applied Natural Language Processing.Philadelphia,USA:Association for Computational Linguistics,1997:194-201.
[4] LAFFERTY J D,MCCALLUM A,PEREIRA F C.Conditional random fields:probabilistic models for segmenting and labeling sequence data[C]//Proceedings of the 18th International Conference on Machine Learning.San Mateo,USA:Morgan Kaufmann Publishers Inc.,2001:282-289.
[5] LAMPLE G,BALLESTEROS M,SUBRAMANIAN S,et al.Neural architectures for named entity recognition[C]//Proceedings of 2016 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.Philadelphia,USA:Association for Computational Linguistics,2016:260-270.
[6] ZHU Yuying,WANG Guoxin.CAN-NER:convolutional attention network for Chinese named entity recognition[C]//Proceedings of 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.Philadelphia,USA:Association for Computational Linguistics,2019:3384-3393.
[7] ZHANG Yingcheng,YANG Yang,JIANG Rui,et al.Commercial intelligence entity recognition model based on BiLSTM-CRF[J].Computer Engineering,2019,45(5):308-314.(in Chinese)张应成,杨洋,蒋瑞,等.基于BiLSTM-CRF的商情实体识别模型[J].计算机工程,2019,45(5):308-314.
[8] DOZIER C,KONDADADI R,LIGHT M,et al.Named entity recognition and resolution in legal text[M].Berlin,Germany:Springer,2010.
[9] QUARESMA P,GONCALVES T.Using linguistic information and machine learning techniques to identify entities from juridical documents[M].Berlin,Germany:Springer,2010.
[10] HAQ M I U,LI Q,HASSAN S.Text mining techniques to capture facts for cloud computing adoption and big data processing[J].IEEE Access,2019,7:162254-162267.
[11] KAMBHATLA N.Combining lexical,syntactic,and semantic features with maximum entropy models for extracting relations[C]//Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2004:22-29.
[12] ZHOU Guodong,SU Jian,ZHANG Jie,et al.Exploring various knowledge in relation extraction[C]//Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2005:427-434.
[13] CULOTTA A,SORENSEN J.Dependency tree kernels for relation extraction[C]//Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2004:423-428.
[14] ZHOU Guodong,ZHANG Min,JI Donghong,et al.Tree kernel-based relation extraction with context-sensitive structured parse tree information[C]//Proceedings of 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning.Prague,Czech Republic:[s.n.],2007:728-736.
[15] ZENG Daojian,LIU Kang,LAI Siwei,et al.Relation classification via convolutional deep neural network[C]//Proceedings of the 25th International Conference on Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2014:2335-2344.
[16] NGUYEN T H,GRISHMAN R.Relation extraction:perspective from convolutional neural networks[C]//Proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing.Philadelphia,USA:Association for Computational Linguistics,2015:39-48.
[17] DOS SANTOS C N,XIANG Bing,ZHOU Bowen.Classifying relations by ranking with convolutional neural networks[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing.Philadelphia,USA:Association for Computational Linguistics,2015:626-634.
[18] ZHANG Runyan,MENG Fanrong,ZHOU Yong,et al.Relation classification via recurrent neural network with attention and tensor layers[J].Big Data Mining and Analytics,2018,1(3):234-244.
[19] ZHOU Peng,SHI Wei,TIAN Jun,et al.Attention-based bidirectional long short-term memory networks for relation classification[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2016:207-212.
[20] SUN Ziyang,GU Junzhong,YANG Jing.Chinese entity relation extraction method based on deep learning[J].Computer Engineering,2018,44(9):164-170.(in Chinese)孙紫阳,顾君忠,杨静.基于深度学习的中文实体关系抽取方法[J].计算机工程,2018,44(9):164-170.
[21] VERGA P,STRUBELL E,MCCALLUM A.Simultaneously self-attending to all mentions for full-abstract biological relation extraction[C]//Proceedings of 2018 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.Philadelphia,USA:Association for Computational Linguistics,2018:872-884.
[22] WANG H,TAN M,YU M,et al.Extracting multiple-relations in one-pass with pre-trained transformers[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2019:1371-1377.
[23] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems.Long Beach,USA:Neural Information Processing Systems Foundation,Inc.,2017:6000-6010.
[24] DEVLIN J,CHANG M W,LEE K,et al.BERT:pre-training of deep bidirectional Transformers for language understanding[C]//Proceedings of 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.Philadelphia,USA:Association for Computational Linguistics,2019:4171-4186.
[25] ALT C,HUBNER M,HENNIG L.Fine-tuning pre-trained transformer language models to distantly supervised relation extraction[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Philadelphia,USA:Association for Computational Linguistics,2019:1388-1398.
[26] PETERS M E,NEUMANN M,IYYER M,et al.Deep contextualized word representations[EB/OL].[2020-02-01].https://arxiv.org/abs/1802.05365.
[27] YANG Zhilin,DAI Zihang,YANG Yiming,et al.XLNet:generalized autoregressive pretraining for language understanding[C]//Proceedings of Advances in Neural Information Processing Systems.Vancouver,Canada:Neural Information Processing Systems Foundation,Inc.,2019:5754-5764.