[1] MENG Kui, LIU Mengchi, HU Jie. Query intention recognition model based on character level cyclic network[J]. Computer Engineering, 2017, 43(3): 181-186. (in Chinese)
[2] REN Jun, WANG Jianhua, WANG Chuanmei, et al. Stock index forecast based on regularized LSTM model[J]. Computer Applications and Software, 2018, 35(4): 44-48, 108. (in Chinese)
[3] HINTON G E. Connectionist learning procedures[J]. Artificial Intelligence, 1989, 40(1/2/3): 185-234.
[4] SANTOS C D, ZADROZNY B. Learning character-level representations for part-of-speech tagging[C]//Proceedings of the 31st International Conference on Machine Learning. New York, USA: ACM Press, 2014: 1818-1826.
[5] GAO Yunlong, ZUO Wanli, WANG Ying, et al. Sentence classification model based on sparse and self-taught convolutional neural networks[J]. Journal of Computer Research and Development, 2018, 55(1): 179-187. (in Chinese)
[6] YI Meng, SUI Lichun. Aerial image semantic classification method based on improved full convolution neural network[J]. Computer Engineering, 2017, 43(10): 216-221. (in Chinese)
[7] VAN DEN OORD A, DIELEMAN S, ZEN H, et al. WaveNet: a generative model for raw audio[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1609.03499v1.
[8] KALCHBRENNER N, ESPEHOLT L, SIMONYAN K, et al. Neural machine translation in linear time[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1610.10099.
[9] GEHRING J, AULI M, GRANGIER D, et al. Convolutional sequence to sequence learning[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1705.03122.
[10] BAI S, KOLTER J Z, KOLTUN V. An empirical evaluation of generic convolutional and recurrent networks for sequence modeling[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1803.01271.
[11] CHOLLET F. Xception: deep learning with depthwise separable convolutions[C]//Proceedings of IEEE Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2017: 1251-1258.
[12] HOWARD A G, ZHU M, CHEN B, et al. MobileNets: efficient convolutional neural networks for mobile vision applications[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1704.04861.
[13] HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C]//Proceedings of IEEE Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2016: 770-778.
[14] NAIR V, HINTON G E. Rectified linear units improve restricted Boltzmann machines[C]//Proceedings of the 27th International Conference on Machine Learning. New York, USA: ACM Press, 2010: 807-814.
[15] SALIMANS T, KINGMA D P. Weight normalization: a simple reparameterization to accelerate training of deep neural networks[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2016.
[16] YU F, KOLTUN V. Multi-scale context aggregation by dilated convolutions[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1511.07122.
[17] WANG Gensheng, HUANG Xuejian. Convolution neural network text classification model based on Word2vec and improved TF-IDF[J]. Journal of Chinese Computer Systems, 2019, 40(5): 1120-1126. (in Chinese)
[18] KAISER L, GOMEZ A N, CHOLLET F. Depthwise separable convolutions for neural machine translation[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1706.03059.
[19] LE Q V, JAITLY N, HINTON G E. A simple way to initialize recurrent networks of rectified linear units[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1504.00941.
[20] ARJOVSKY M, SHAH A, BENGIO Y. Unitary evolution recurrent neural networks[C]//Proceedings of the 33rd International Conference on Machine Learning. New York, USA: ACM Press, 2016: 1120-1128.
[21] ZHANG S, WU Y, CHE T, et al. Architectural complexity measures of recurrent neural networks[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2016.
[22] JING L, SHEN Y, DUBCEK T, et al. Tunable efficient unitary neural networks (EUNN) and their application to RNNs[C]//Proceedings of the 34th International Conference on Machine Learning. New York, USA: ACM Press, 2017: 1733-1741.
[23] WISDOM S, POWERS T, HERSHEY J, et al. Full-capacity unitary recurrent neural networks[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2016: 4880-4888.
[24] KRUEGER D, MAHARAJ T, KRAMÁR J, et al. Zoneout: regularizing RNNs by randomly preserving hidden activations[EB/OL]. [2019-04-11]. https://arxiv.org/abs/1606.01305.