[1] GOODFELLOW I J, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial nets[C]//Proceedings of the 27th International Conference on Neural Information Processing Systems. Montreal, Canada: NIPS, 2014: 2672-2680.
[2] GUO J X, LU S D, CAI H, et al. Long text generation via adversarial training with leaked information[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1709.08624.pdf.
[3] ZHU Y, LU S, ZHENG L, et al. Texygen: a benchmarking platform for text generation models[C]//Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. New York, USA: ACM Press, 2018: 1097-1100.
[4] YU L, ZHANG W, WANG J, et al. SeqGAN: sequence generative adversarial nets with policy gradient[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1609.05473.pdf.
[5] CHE T, LI Y, ZHANG R, et al. Maximum-likelihood augmented discrete generative adversarial networks[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1702.07983.pdf.
[6] SUTSKEVER I, VINYALS O, LE Q V. Sequence to sequence learning with neural networks[EB/OL]. [2020-10-02]. http://cs224d.stanford.edu/papers/seq2seq.pdf.
[7] LI J, SONG Y, ZHANG H, et al. Generating classical Chinese poems via conditional variational autoencoder and adversarial training[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, Belgium: Association for Computational Linguistics, 2018: 3890-3900.
[8] YAO L, PENG N, WEISCHEDEL R, et al. Plan-and-write: towards better automatic storytelling[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2019, 33: 7378-7385.
[9] ZENG W, ABUDUWEILI A, LI L, et al. Automatic generation of personalized comment based on user profile[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy: Association for Computational Linguistics, 2019: 229-235.
[10] DU Z Y. GPT2-Chinese: tools for training GPT2 model in Chinese language[EB/OL]. [2020-10-02]. https://github.com/Morizeyao/GPT2-Chinese.
[11] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1810.04805.pdf.
[12] PETERS M, NEUMANN M, IYYER M, et al. Deep contextualized word representations[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans, USA: Association for Computational Linguistics, 2018: 2227-2237.
[13] RADFORD A, NARASIMHAN K, SALIMANS T, et al. Improving language understanding by generative pre-training[EB/OL]. [2020-10-02]. https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf.
[14] RADFORD A, WU J, CHILD R, et al. Language models are unsupervised multitask learners[EB/OL]. [2020-10-02]. https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf.
[15] BROWN T B, MANN B, RYDER N, et al. Language models are few-shot learners[EB/OL]. [2020-10-02]. https://papers.nips.cc/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf.
[16] YANG Z L, DAI Z H, YANG Y M, et al. XLNet: generalized autoregressive pretraining for language understanding[EB/OL]. [2020-10-02]. https://www.cs.cmu.edu/~jgc/publication/XLNET.pdf.
[17] FU B, CHEN Y H, SHAO Y Q, et al. Consumption intent recognition of microblog text based on user natural annotation[J]. Journal of Chinese Information Processing, 2017, 31(4): 208-215. (in Chinese)
[18] JIA Y L, HAN D H, LIN H Y, et al. Consumption intent recognition algorithms for Weibo users[J]. Acta Scientiarum Naturalium Universitatis Pekinensis, 2020, 56(1): 68-74. (in Chinese)
[19] GUI S S, LU W, ZHANG X J. Temporal intent classification with query expression feature[J]. Data Analysis and Knowledge Discovery, 2019, 3(3): 66-75. (in Chinese)
[20] LIAO S L, JI J M, YU C, et al. Intention classification method based on BERT model and knowledge distillation[J]. Computer Engineering, 2021, 47(5): 73-79. (in Chinese)
[21] CHEN Q, ZHUO Z, WANG W. BERT for joint intent classification and slot filling[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1902.10909.pdf.
[22] QIN L, CHE W, LI Y, et al. A stack-propagation framework with token-level intent detection for spoken language understanding[EB/OL]. [2020-10-02]. https://aclanthology.org/D19-1214.pdf.
[23] GAO Y B, LI Y C. Social intent recognition and classification in Weibo[J]. Journal of Inner Mongolia University of Science and Technology, 2020, 39(2): 85-89. (in Chinese)
[24] MAO H H, MAJUMDER B P, MCAULEY J, et al. Improving neural story generation by targeted common sense grounding[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1908.09451v1.pdf.
[25] KESKAR N S, MCCANN B, VARSHNEY L R, et al. CTRL: a conditional transformer language model for controllable generation[EB/OL]. [2020-10-02]. https://einstein.ai/presentations/ctrl.pdf.
[26] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[EB/OL]. [2020-10-02]. https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf.
[27] ZIEGLER Z M, MELAS-KYRIAZI L, GEHRMANN S, et al. Encoder-agnostic adaptation for conditional language generation[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1908.06938.pdf.
[28] DATHATHRI S, MADOTTO A, LAN J, et al. Plug and play language models: a simple approach to controlled text generation[EB/OL]. [2020-10-02]. https://arxiv.org/pdf/1912.02164.pdf.
[29] FAN A, LEWIS M, DAUPHIN Y. Hierarchical neural story generation[EB/OL]. [2020-10-02]. https://aclanthology.org/P18-1082.pdf.