[1] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: ACL, 2019: 4171-4186.
[2] KIM Y. Convolutional neural networks for sentence classification[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, Qatar: ACL, 2014: 1746-1751.
[3] LAI Siwei, XU Liheng, LIU Kang, et al. Recurrent convolutional neural networks for text classification[C]//Proceedings of the 29th AAAI Conference on Artificial Intelligence. [S.l.]: AAAI Press, 2015: 2267-2273.
[4] HINTON G, VINYALS O, DEAN J. Distilling the knowledge in a neural network[EB/OL]. [2020-01-10]. https://arxiv.org/abs/1503.02531.
[5] YU Lantao, ZHANG Weinan, WANG Jun, et al. SeqGAN: sequence generative adversarial nets with policy gradient[C]//Proceedings of the 31st AAAI Conference on Artificial Intelligence. [S.l.]: AAAI Press, 2017: 2852-2858.
[6] LI X, ROTH D. Learning question classifiers: the role of semantic information[J]. Natural Language Engineering, 2006, 12(3): 229-249.
[7] HAFFNER P, TUR G, WRIGHT J H. Optimizing SVMs for complex call classification[C]//Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing. Washington D.C., USA: IEEE Press, 2003: 210-226.
[8] GENKIN A, LEWIS D D, MADIGAN D. Large-scale Bayesian logistic regression for text categorization[J]. Technometrics, 2007, 49(3): 291-304.
[9] MCCALLUM A, NIGAM K. A comparison of event models for naive Bayes text classification[C]//Proceedings of the AAAI Workshop on Learning for Text Categorization. [S.l.]: AAAI Press, 1998: 41-48.
[10] SCHAPIRE R E, SINGER Y. BoosTexter: a boosting-based system for text categorization[J]. Machine Learning, 2000, 39(2/3): 135-168.
[11] YANG Y, PEDERSEN J O. A comparative study on feature selection in text categorization[C]//Proceedings of the 14th International Conference on Machine Learning. San Francisco, USA: Morgan Kaufmann, 1997: 412-420.
[12] BENGIO Y, DUCHARME R, VINCENT P, et al. A neural probabilistic language model[J]. Journal of Machine Learning Research, 2003, 3: 1137-1155.
[13] ELMAN J L. Finding structure in time[J]. Cognitive Science, 1990, 14(2): 179-211.
[14] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[15] CHO K, VAN MERRIENBOER B, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[EB/OL]. [2020-01-10]. https://arxiv.org/abs/1406.1078.
[16] BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[EB/OL]. [2020-01-10]. https://arxiv.org/abs/1409.0473.
[17] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]//Proceedings of the Conference on Advances in Neural Information Processing Systems. Red Hook, USA: Curran Associates, 2013: 3111-3119.
[18] PETERS M E, NEUMANN M, IYYER M, et al. Deep contextualized word representations[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: ACL, 2018: 2227-2237.
[19] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the Conference on Advances in Neural Information Processing Systems. Red Hook, USA: Curran Associates, 2017: 5998-6008.
[20] RADFORD A, WU J, CHILD R, et al. Language models are unsupervised multitask learners[J]. OpenAI Blog, 2019, 1(8): 9.
[21] SONG K T, TAN X, QIN T, et al. MASS: masked sequence to sequence pre-training for language generation[EB/OL]. [2020-01-10]. https://arxiv.org/abs/1905.02450.
[22] ZHANG Zhengyan, HAN Xu, LIU Zhiyuan, et al. ERNIE: enhanced language representation with informative entities[EB/OL]. [2020-01-10]. https://arxiv.org/abs/1905.07129.
[23] LIU Xiaodong, HE Pengcheng, CHEN Weizhu, et al. Multi-task deep neural networks for natural language understanding[EB/OL]. [2020-01-10]. https://arxiv.org/abs/1901.11504.
[24] GOODFELLOW I J, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial nets[C]//Proceedings of the Conference on Advances in Neural Information Processing Systems. Red Hook, USA: Curran Associates, 2014: 2672-2680.
[25] YU Chang, OUYANG Yu, ZHANG Bo, et al. Power user intent text generation based on generative adversarial network[J]. Information Technology and Network Security, 2019, 38(11): 67-72. (in Chinese)