[1] 邹品荣, 肖锋, 张文娟, 等. 面向视觉问答的多模块协同注意模型[J]. 计算机工程, 2022, 48(2): 250-260.
ZOU P R, XIAO F, ZHANG W J, et al. Multi-module co-attention model for visual question answering[J]. Computer Engineering, 2022, 48(2): 250-260. (in Chinese)
[2] 苏珂, 黄瑞阳, 张建朋, 等. 多跳机器阅读理解研究进展[J]. 计算机工程, 2021, 47(9): 1-17.
SU K, HUANG R Y, ZHANG J P, et al. Research progress of multi-hop machine reading comprehension[J]. Computer Engineering, 2021, 47(9): 1-17. (in Chinese)
[3] 赵芸, 刘德喜, 万常选, 等. 检索式自动问答研究综述[J]. 计算机学报, 2021, 44(6): 1214-1232.
ZHAO Y, LIU D X, WAN C X, et al. Retrieval-based automatic question answering: a literature survey[J]. Chinese Journal of Computers, 2021, 44(6): 1214-1232. (in Chinese)
[4] DENG C Y, ZENG G F, CAI Z P, et al. A survey of knowledge based question answering with deep learning[J]. Journal on Artificial Intelligence, 2020, 2(4): 157-166.
[5] ABDELAZIZ I, RAVISHANKAR S, KAPANIPATHI P, et al. A semantic parsing and reasoning-based approach to knowledge base question answering[C]//Proceedings of AAAI Conference on Artificial Intelligence. Palo Alto, USA: AAAI Press, 2021: 15985-15987.
[6] CAO R S, CHEN L, CHEN Z, et al. LGESQL: line graph enhanced text-to-SQL model with mixed local and non-local relations[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2021: 2541-2555.
[7] ZHOU G B, LUO P, CAO R Y, et al. Tree-structured neural machine for linguistics-aware sentence generation[EB/OL]. [2022-03-12]. https://arxiv.org/abs/1705.00321.
[8] CLARK K, LUONG M T, LE Q V, et al. ELECTRA: pre-training text encoders as discriminators rather than generators[EB/OL]. [2022-03-12]. https://arxiv.org/abs/2003.10555.
[9] XU X J, LIU C, SONG D. SQLNet: generating structured queries from natural language without reinforcement learning[EB/OL]. [2022-03-12]. https://arxiv.org/abs/1711.04436.
[10] JHA D, WARD L, YANG Z J, et al. IRNet: a general purpose deep residual regression framework for materials discovery[C]//Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. New York, USA: ACM Press, 2019: 2385-2393.
[11] WANG B L, SHIN R, LIU X D, et al. RAT-SQL: relation-aware schema encoding and linking for text-to-SQL parsers[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2020: 7567-7578.
[12] DONG L, LAPATA M. Coarse-to-fine decoding for neural semantic parsing[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2018: 731-742.
[13] YIN P C, FANG H, NEUBIG G, et al. Compositional generalization for neural semantic parsing via span-level supervised attention[C]//Proceedings of 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: Association for Computational Linguistics, 2021: 2810-2823.
[14] RUBIN O, BERANT J. SmBoP: semi-autoregressive bottom-up semantic parsing[C]//Proceedings of the 5th Workshop on Structured Prediction for NLP. Stroudsburg, USA: Association for Computational Linguistics, 2021: 12-21.
[15] YU T, WU C S, LIN X V, et al. GraPPa: grammar-augmented pre-training for table semantic parsing[EB/OL]. [2022-03-12]. https://arxiv.org/abs/2009.13845.
[16] SHI P, NG P, WANG Z G, et al. Learning contextual representations for semantic parsing with generation-augmented pre-training[EB/OL]. [2022-03-12]. https://arxiv.org/abs/2012.10309.
[17] SHAW P, USZKOREIT J, VASWANI A. Self-attention with relative position representations[C]//Proceedings of 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: Association for Computational Linguistics, 2018: 464-468.
[18] DOZAT T, MANNING C D. Deep biaffine attention for neural dependency parsing[EB/OL]. [2022-03-12]. https://arxiv.org/abs/1611.01734.
[19] YU T, ZHANG R, YANG K, et al. Spider: a large-scale human-labeled dataset for complex and cross-domain semantic parsing and text-to-SQL task[C]//Proceedings of 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2018: 3911-3921.
[20] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. The Journal of Machine Learning Research, 2014, 15(1): 1929-1958.
[21] ZHONG V, LEWIS M, WANG S I, et al. Grounded adaptation for zero-shot executable semantic parsing[C]//Proceedings of 2020 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2020: 6869-6882.
[22] MELLAH Y, RHOUATI A, ETTIFOURI E H, et al. SQL generation from natural language: a sequence-to-sequence model powered by the transformers architecture and association rules[J]. Journal of Computer Science, 2021, 17(5): 480-489.
[23] LIN X V, SOCHER R, XIONG C M. Bridging textual and tabular data for cross-domain text-to-SQL semantic parsing[C]//Findings of the Association for Computational Linguistics: EMNLP 2020. Stroudsburg, USA: Association for Computational Linguistics, 2020: 4870-4888.
[24] HUANG J, WANG Y, WANG Y, et al. Relation aware semi-autoregressive semantic parsing for NL2SQL[EB/OL]. [2022-03-12]. https://arxiv.org/abs/2108.00804.
[25] GAN Y J, CHEN X Y, XIE J X, et al. Natural SQL: making SQL easier to infer from natural language specifications[C]//Findings of the Association for Computational Linguistics: EMNLP 2021. Stroudsburg, USA: Association for Computational Linguistics, 2021: 2030-2042.
[26] SCHOLAK T, SCHUCHER N, BAHDANAU D. PICARD: parsing incrementally for constrained auto-regressive decoding from language models[C]//Proceedings of 2021 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: Association for Computational Linguistics, 2021: 9895-9901.
[27] RAFFEL C, SHAZEER N, ROBERTS A, et al. Exploring the limits of transfer learning with a unified text-to-text transformer[J]. Journal of Machine Learning Research, 2020, 21: 1-67.