[1] DEVLIN J, CHANG M W, LEE K, et al.BERT:pre-training of deep bidirectional transformers for language understanding[EB/OL].[2021-06-20].https://arxiv.org/abs/1810.04805.
[2] CHEN Q, ZHUO Z, WANG W.BERT for joint intent classification and slot filling[EB/OL].[2021-06-20].https://arxiv.org/abs/1902.10909.
[3] YANG A, WANG Q, LIU J, et al.Enhancing pre-trained language representations with rich knowledge for machine reading comprehension[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2019:218-229.
[4] MILLER G A.WordNet:a lexical database for English[J].Communications of the ACM, 1995, 38(11):39-41.
[5] CARLSON A, BETTERIDGE J, KISIEL B, et al.Toward an architecture for never-ending language learning[C]//Proceedings of the 24th AAAI Conference on Artificial Intelligence.[S.l.]:AAAI Press, 2010:365-379.
[6] YANG B S, MITCHELL T.Leveraging knowledge bases in LSTMs for improving machine reading[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2017:332-346.
[7] GUO D, TUR G, YIH W T, et al.Joint semantic utterance classification and slot filling with recursive neural networks[C]//Proceedings of IEEE Spoken Language Technology Workshop.Washington D.C., USA:IEEE Press, 2015:554-559.
[8] XU P Y, SARIKAYA R.Convolutional neural network based triangular CRF for joint intent detection and slot filling[C]//Proceedings of IEEE Workshop on Automatic Speech Recognition and Understanding.Washington D.C., USA:IEEE Press, 2014:78-83.
[9] LIU B, LANE I.Attention-based recurrent neural network models for joint intent detection and slot filling[EB/OL].[2021-06-20].https://arxiv.org/abs/1609.01454.
[10] ZHANG X D, WANG H F.A joint model of intent determination and slot filling for spoken language understanding[C]//Proceedings of the 25th International Joint Conference on Artificial Intelligence.New York, USA:ACM Press, 2016:2993-2999.
[11] GOO C W, GAO G, HSU Y K, et al.Slot-gated modeling for joint slot filling and intent prediction[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2018:1257-1266.
[12] HAIHONG E, NIU P Q, CHEN Z F, et al.A novel bi-directional interrelated model for joint intent detection and slot filling[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.Stroudsburg, USA:Association for Computational Linguistics, 2019:457-471.
[13] QIN L B, CHE W X, LI Y M, et al.A stack-propagation framework with token-level intent detection for spoken language understanding[C]//Proceedings of the 9th International Joint Conference on Natural Language Processing.Stroudsburg, USA:Association for Computational Linguistics, 2019:753-766.
[14] ZHANG Y, WEISS D.Stack-propagation:improved representation learning for syntax[EB/OL].[2021-06-20].https://arxiv.org/abs/1603.06598.
[15] HEMPHILL C T, GODFREY J J, DODDINGTON G R.The ATIS spoken language systems pilot corpus[EB/OL].[2021-06-20].https://aclanthology.org/H90-1021/.
[16] COUCKE A, SAADE A, BALL A, et al.Snips voice platform:an embedded spoken language understanding system for private-by-design voice interfaces[EB/OL].[2021-06-20].https://arxiv.org/abs/1805.10190.
[17] HOCHREITER S, SCHMIDHUBER J.Long short-term memory[J].Neural Computation, 1997, 9(8):1735-1780.
[18] ZHONG V, XIONG C, SOCHER R.Global-locally self-attentive dialogue state tracker[EB/OL].[2021-06-20].https://arxiv.org/abs/1805.09655.
[19] VASWANI A, SHAZEER N, PARMAR N, et al.Attention is all you need[C]//Proceedings of the 31st Annual Conference on Neural Information Processing Systems.Cambridge, USA:MIT Press, 2017:367-378.
[20] BA J L, KIROS J R, HINTON G E.Layer normalization[EB/OL].[2021-06-20].https://arxiv.org/abs/1607.06450.
[21] KINGMA D P, BA J.Adam:a method for stochastic optimization[EB/OL].[2021-06-20].https://arxiv.org/abs/1412.6980.
[22] ZAREMBA W, SUTSKEVER I, VINYALS O.Recurrent neural network regularization[EB/OL].[2021-06-20].https://arxiv.org/abs/1409.2329.
[23] KIM Y.Convolutional neural networks for sentence classification[EB/OL].[2021-06-20].https://arxiv.org/abs/1408.5882.
[24] LAFFERTY J, MCCALLUM A, PEREIRA F C N.Conditional random fields:probabilistic models for segmenting and labeling sequence data[EB/OL].[2021-06-20].https://repository.upenn.edu/cgi/viewcontent.cgi?article=1162&context=cis_papers.
[25] BURGES C J C.A tutorial on support vector machines for pattern recognition[EB/OL].[2021-06-20].https://www.microsoft.com/en-us/research/publication/a-tutorial-on-support-vector-machines-for-pattern-recognition/?from=http%3A%2F%2Fresearch.microsoft.com%2Fpubs%2F67119%2Fsvmtutorial.pdf.