[1] LIU Y,STOLCKE A,SHRIBERG E,et al.Using conditional random fields for sentence boundary detection in speech[C]//Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics.[S.l.]:Association for Computational Linguistics,2005:451-458.
[2] STOLCKE A,SHRIBERG E.Automatic linguistic segmentation of conversational speech[C]//Proceedings of the 4th International Conference on Spoken Language Processing.Washington D.C.,USA:IEEE Press,1996:1005-1008.
[3] FAVRE B,HAKKANI-TUR D,PETROV S,et al.Efficient sentence segmentation using syntactic features[C]//Proceedings of Spoken Language Technology Workshop.Washington D.C.,USA:IEEE Press,2008:77-80.
[4] ROARK B,LIU Y,HARPER M,et al.Reranking for sentence boundary detection in conversational speech[C]//Proceedings of the 2006 IEEE International Conference on Acoustics,Speech and Signal Processing.Washington D.C.,USA:IEEE Press,2006:1-10.
[5] TOMANEK K,WERMTER J,HAHN U.Sentence and token splitting based on conditional random fields[EB/OL].[2019-03-20].https://www.researchgate.net/publication.
[6] GUO Y,WANG H,GENABITH J V.A linguistically inspired statistical model for Chinese punctuation generation[J].ACM Transactions on Asian Language Information Processing,2010,9(2):1-27.
[7] LU W,NG H T.Better punctuation prediction with dynamic conditional random fields[C]//Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing.[S.l.]:Association for Computational Linguistics,2010:177-186.
[8] HUANG J,ZWEIG G.Maximum entropy model for punctuation annotation from speech[C]//Proceedings of the 7th International Conference on Spoken Language Processing.Washington D.C.,USA:IEEE Press,2002:1-9.
[9] COLLOBERT R,WESTON J,KARLEN M,et al.Natural language processing (almost) from scratch[J].Journal of Machine Learning Research,2011,12(1):2493-2537.
[10] ZHANG D,WU S,YANG N,et al.Punctuation prediction with transition-based parsing[C]//Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics.[S.l.]:Association for Computational Linguistics,2013:752-760.
[11] CHEN Xinchi,QIU Xipeng,ZHU Chenxi,et al.Long short-term memory neural networks for Chinese word segmentation[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing.[S.l.]:Association for Computational Linguistics,2015:1197-1206.
[12] HUANG Zhiheng,XU Wei,YU Kai.Bidirectional LSTM-CRF models for sequence tagging[EB/OL].[2019-03-20].https://arxiv.org/abs/1508.01991.
[13] LI Yakun,PAN Qing.Joint Chinese word segmentation and punctuation prediction based on improved multilayer BLSTM network[J].Journal of Computer Applications,2018,38(5):1278-1282,1314.(in Chinese)李雅昆,潘晴,Everett X.WANG.基于改进的多层BLSTM的中文分词和标点预测[J].计算机应用,2018,38(5):1278-1282,1314.
[14] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[EB/OL].[2019-03-20].https://arxiv.org/abs/1706.03762.
[15] ZHANG Haoyu,ZHANG Pengfei,LI Zhenzhen,et al.Self-attention based machine reading comprehension[J].Journal of Chinese Information Processing,2018,32(12):125-131.(in Chinese)张浩宇,张鹏飞,李真真,等.基于自注意力机制的阅读理解模型[J].中文信息学报,2018,32(12):125-131.
[16] ZENG Yifu,LAN Tian,WU Zufeng,et al.Bi-memory based attention model for aspect level sentiment classification[J].Chinese Journal of Computers,2019,42(8):1845-1857.(in Chinese)曾义夫,蓝天,吴祖峰,等.基于双记忆注意力的方面级别情感分类模型[J].计算机学报,2019,42(8):1845-1857.
[17] HUANG Wenming,WEI Wancheng,ZHANG Jian,et al.Recommendation method based on attention mechanism and review text deep model[J].Computer Engineering,2019,45(9):176-182.(in Chinese)黄文明,卫万成,张健,等.融合注意力机制对评论文本深度建模的推荐方法[J].计算机工程,2019,45(9):176-182.
[18] LUONG M T,PHAM H,MANNING C D.Effective approaches to attention-based neural machine translation[EB/OL].[2019-03-20].https://arxiv.org/abs/1508.04025.
[19] CHO K,VAN MERRIENBOER B,BAHDANAU D,et al.On the properties of neural machine translation:encoder-decoder approaches[EB/OL].[2019-03-20].https://arxiv.org/abs/1409.1259.
[20] GRAVES A,MOHAMED A,HINTON G.Speech recognition with deep recurrent neural networks[C]//Proceedings of IEEE International Conference on Acoustics,Speech and Signal Processing.Washington D.C.,USA:IEEE Press,2013:6645-6649.
[21] WANG Huijian,LIU Zheng,LI Yun,et al.Time series trend prediction based on neural network language model[J].Computer Engineering,2019,45(7):13-19,25.(in Chinese)王慧健,刘峥,李云,等.基于神经网络语言模型的时间序列趋势预测[J].计算机工程,2019,45(7):13-19,25.
[22] ZHANG Zhen,LI Ning,TIAN Yingai.Stream document structure recognition based on bidirectional LSTM network[J].Computer Engineering,2020,46(1):60-66,73.(in Chinese)张真,李宁,田英爱.基于双向LSTM网络的流式文档结构识别[J].计算机工程,2020,46(1):60-66,73.
[23] HOCHREITER S,SCHMIDHUBER J.Long short-term memory[J].Neural Computation,1997,9(8):1735-1780.
[24] LI Shen,ZHAO Zhe,HU Renfen,et al.Analogical reasoning on Chinese morphological and semantic relations[EB/OL].[2019-03-20].https://arxiv.org/abs/1805.06504?context=cs.
[25] SRIVASTAVA N,HINTON G,KRIZHEVSKY A,et al.Dropout:a simple way to prevent neural networks from overfitting[J].The Journal of Machine Learning Research,2014,15(1):1929-1958.
[26] HINTON G E,SRIVASTAVA N,KRIZHEVSKY A,et al.Improving neural networks by preventing co-adaptation of feature detectors[EB/OL].[2019-03-20].https://arxiv.org/abs/1207.0580v1.
[27] ZEILER M D.ADADELTA:an adaptive learning rate method[EB/OL].[2019-03-20].https://arxiv.org/abs/1212.5701.
[28] ZHOU Jie,XU Wei.End-to-end learning of semantic role labeling using recurrent neural networks[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing.[S.l.]:Association for Computational Linguistics,2015:1127-1137.
[29] HE Kaiming,ZHANG Xiangyu,REN Shaoqing,et al.Deep residual learning for image recognition[C]//Proceedings of IEEE Conference on Computer Vision and Pattern Recognition.Las Vegas,USA:[s.n.],2016:770-778.