Data Analysis and Knowledge Discovery, 2023, Vol. 7, Issue (12): 1-21. https://doi.org/10.11925/infotech.2096-3467.2022.1074
Review
Review of Text Neural Semantic Parsing Methods
Shen Lingyun1,2,Le Xiaoqiu1,2()
1National Science Library, Chinese Academy of Sciences, Beijing 100190, China
2Department of Information Resources Management, School of Economics and Management, University of Chinese Academy of Sciences, Beijing 100190, China

Abstract

[Objective] This paper reviews research on text semantic parsing with neural networks over the past decade. [Coverage] Using Google Scholar and CNKI as retrieval platforms, with "Neural Semantic Parsing" and "神经语义解析" as keywords, relevant papers published from 2010 to 2022 and their important citations were retrieved for analysis. [Methods] Existing neural semantic parsing methods were classified by technical path; the basic ideas of each path were explained; the methods were compared in terms of data, performance, and application goals; and the open problems and development trends of text neural semantic parsing were summarized. [Results] Neural semantic parsing methods fall into three types: sequence-to-sequence methods, methods based on intermediate forms, and methods based on semantic unit decomposition and combination; the latter two improve on the first. Intermediate representations such as semantic sketches and canonical utterances, along with few-shot neural semantic parsing, are the main current research focuses. [Limitations] The paper analyzes existing research ideas at the methodological level and does not elaborate on the internal implementation mechanisms of neural semantic parsing models. [Conclusions] Neural methods currently achieve the best performance in text semantic parsing, and designing targeted neural network models for specific applications is the mainstream practice, but parsing quality still falls short of practical application requirements.
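The sequence-to-sequence family named in the abstract shares one data flow: an encoder folds the utterance into a state, and a decoder emits logical-form tokens one at a time. A minimal sketch of that flow, with random untrained weights and invented vocabularies (nothing here reproduces any specific surveyed model):

```python
import numpy as np

# Untrained seq2seq skeleton: an RNN encoder reads the utterance, a greedy
# decoder emits logical-form tokens. Weights are random and the vocabularies
# are made up for illustration -- only the data flow matters here.
rng = np.random.default_rng(0)

SRC_VOCAB = ["what", "jobs", "require", "bscs"]
TGT_VOCAB = ["<s>", "</s>", "answer", "(", ")", "job", "req_deg"]
H = 8  # hidden size

E_src = rng.normal(size=(len(SRC_VOCAB), H)) * 0.5   # source embeddings
E_tgt = rng.normal(size=(len(TGT_VOCAB), H)) * 0.5   # target embeddings
W_h = rng.normal(size=(H, H)) * 0.1                  # recurrent weights
W_out = rng.normal(size=(H, len(TGT_VOCAB))) * 0.5   # output projection

def encode(tokens):
    """Simple RNN encoder: fold the utterance into one hidden state."""
    h = np.zeros(H)
    for tok in tokens:
        h = np.tanh(E_src[SRC_VOCAB.index(tok)] + W_h @ h)
    return h

def greedy_decode(h, max_len=6):
    """At each step, project the state onto the target vocabulary,
    pick the most probable token, and feed it back in."""
    out = []
    for _ in range(max_len):
        idx = int(np.argmax(h @ W_out))
        tok = TGT_VOCAB[idx]
        if tok == "</s>":
            break
        out.append(tok)
        h = np.tanh(E_tgt[idx] + W_h @ h)
    return out

parse = greedy_decode(encode(["what", "jobs", "require", "bscs"]))
```

With trained weights and constrained decoding this loop is what the surveyed parsers refine; the second and third method families in the abstract replace the single decoding pass with intermediate forms or per-unit decoders.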

Key words: Semantic Parsing; Neural Network Model; Semantic Representation; Pre-training
Received: 2022-10-13      Online: 2024-02-02
Chinese Library Classification (ZTFLH): G350
Corresponding author: Le Xiaoqiu, ORCID: 0000-0002-7114-5544, E-mail: lexq@mail.las.ac.cn.
Cite this article:
Shen Lingyun, Le Xiaoqiu. Review of Text Neural Semantic Parsing Methods. Data Analysis and Knowledge Discovery, 2023, 7(12): 1-21.
Link to this article:
https://manu44.magtech.com.cn/Jwk_infotech_wk3/CN/10.11925/infotech.2096-3467.2022.1074      或      https://manu44.magtech.com.cn/Jwk_infotech_wk3/CN/Y2023/V7/I12/1
Fig.1  Basic components of neural semantic parsing
Scenario | Natural language utterance | Formal semantic representation
Math word problem solving [8] | Dan have 2 pens, Jessica have 4 pens. How many pens do they have in total? | x = 4+2
Logical query [5] | what microsoft jobs do not require a bscs? | answer(J,(company(J,'microsoft'),job(J),not((req_deg(J,'bscs')))))
Question answering [63] | who has published the most articles? | argmax(type.person; R(x: count(type.article u author:x)))
Program generation [4] | Adds a scalar to this vector in place | public void add(final double arg0){ for (int i = 0; i < vecElements.length(); i++){ vecElements[i] += arg0; } }
Robot instruction generation [64] | Go away from the lamp to the intersection of the red brick and wood | Turn(), Travel(steps: 1)
Table 1  Examples of the semantic representations required by different semantic parsing task scenarios
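To make the first row of Table 1 concrete, the toy below hand-codes the mapping from that math word problem to its equation. Real neural solvers (ref [8]) learn this mapping end to end; this is only an illustration of what the target formal representation looks like.

```python
import re

def parse_math(problem: str) -> str:
    """Hand-written stand-in for a neural math word problem solver:
    extract the numbers and emit an equation string."""
    nums = [int(n) for n in re.findall(r"\d+", problem)]
    if "in total" in problem and len(nums) == 2:
        # Table 1 writes the operands in reverse mention order (x = 4+2),
        # so this toy mirrors that.
        return f"x = {nums[1]} + {nums[0]}"
    raise ValueError("pattern not covered by this toy")

equation = parse_math("Dan have 2 pens, Jessica have 4 pens. "
                      "How many pens do they have in total?")  # → 'x = 4 + 2'
```

The point of the table is that each scenario needs a different target formalism; only the formal output changes, while the natural-language side stays free-form.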
Fig.2  Pipelines of different neural semantic parsing methods
Fig.3  Decoding process of a decoder with attention mechanism
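The attention step in Fig.3 can be sketched in a few lines: at each decoding step the decoder state scores every encoder state, the scores are normalized by a softmax, and the weighted mix becomes the context vector. Shapes and values below are invented for illustration.

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """Dot-product attention: decoder_state has shape (H,),
    encoder_states has shape (T, H). Returns (context, weights),
    where the weights form a distribution over the T positions."""
    scores = encoder_states @ decoder_state      # (T,) alignment scores
    weights = np.exp(scores - scores.max())      # stable softmax
    weights = weights / weights.sum()
    context = weights @ encoder_states           # (H,) weighted mix
    return context, weights

H, T = 4, 3
rng = np.random.default_rng(1)
enc = rng.normal(size=(T, H))   # one state per source token
dec = rng.normal(size=(H,))     # current decoder state
ctx, w = attention(dec, enc)
```

The context vector `ctx` is then combined with the decoder state to predict the next logical-form token, which is how the decoder in Fig.3 keeps consulting the whole input at every step.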
Fig.4  Comparison of sketch-based and end-to-end semantic parsing methods
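The two-stage idea contrasted in Fig.4 can be sketched as follows: stage 1 predicts only a coarse sketch with typed slots, stage 2 fills the slots from the question. Both stages here are hand-written stand-ins for the two neural models used by sketch-based parsers (refs [87], [89]); the column name and condition are hypothetical.

```python
def predict_sketch(question: str) -> str:
    """Stage 1 stand-in: choose a coarse structure only --
    no values or column names yet."""
    if "do not" in question or "not " in question:
        return "SELECT $COL FROM $TABLE WHERE NOT $COND"
    return "SELECT $COL FROM $TABLE WHERE $COND"

def fill_sketch(sketch: str, col: str, table: str, cond: str) -> str:
    """Stage 2 stand-in: fill each slot (a real system predicts
    these fillers from the question and the schema)."""
    return (sketch.replace("$COL", col)
                  .replace("$TABLE", table)
                  .replace("$COND", cond))

q = "what microsoft jobs do not require a bscs?"
sql = fill_sketch(predict_sketch(q), "name", "jobs", "req_deg = 'bscs'")
# → "SELECT name FROM jobs WHERE NOT req_deg = 'bscs'"
```

Splitting structure prediction from slot filling is what gives sketch-based methods their robustness: the hard combinatorial decision (the sketch) is made over a small grammar, while an end-to-end parser must get structure and values right in one pass.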
Fig.5  Three fine-tuning strategies for pre-trained models in semantic parsing tasks
Method type | Reference | Model structure | Parsing target | Characteristics
Sequence-to-sequence | Mei et al. [28] | LSTM encoder-decoder | Navigation action sequences | Multi-level alignment mechanism; accuracy surpasses machine learning algorithms
Sequence-to-sequence | Wang et al. [8] | GRU encoder - LSTM decoder | Math equations | First use of neural networks for math word problem parsing; requires similarity retrieval
Sequence-to-sequence | Kočiský et al. [72] | Three-layer LSTM encoder - LSTM decoder with attention | Knowledge base queries | Semi-supervised method using randomly generated pseudo data
Sequence-to-sequence | Babu et al. [30] | CNN-based encoder-decoder with multi-head attention | Knowledge base queries | Parallel decoding; fast computation
Sequence-to-sequence | Ling et al. [31] | Pointer network | Program language | Pointer-network decoder; outperforms the attention-based method of Dong et al. [5]
Sequence-to-sequence | Rongali et al. [35] | BERT encoder - Transformer decoder | Logical queries | First to surpass human-level semantic parsing performance
Sequence-to-sequence | Chen et al. [56] | BART encoder - Transformer decoder | Semantic frames | Improves BART's performance on semantic parsing tasks
Sequence-to-sequence | Wang et al. [75] | BiLSTM encoder - LSTM decoder with attention | Math equations | Constrains decoding with preset equation templates, improving the format of output equations
Sequence-to-sequence | Bogin et al. [78] | GNN-LSTM encoder + decoder | Database table queries | Feeds database schema and text jointly into the encoder, adding structural information to encoding and decoding decisions
Sequence-to-sequence | Wang et al. [81] | BiLSTM with self-attention | SQL | Improves on Bogin et al. [78] by adding links between the text and table/column names
Sequence-to-sequence | Yavuz et al. [83] | LSTM encoder-decoder with entity-type recognition | Knowledge base QA | Adds answer entity-type recognition on top of an RNN sequence-to-sequence model
Intermediate form | Dong et al. [87] | Shared BiLSTM encoder; coarse-meaning decoder (BiLSTM + attention); logical-form decoder (RNN + attention) | λ-calculus expressions | First to split semantic parsing into two stages; clear advantage on cross-domain tasks
Intermediate form | Nye et al. [86] | Sketch generator (RNN + attention); symbolic reasoning synthesizer | Program language | Hand-designed sketches intermediate between natural language and programs; an RNN generates the symbolic form
Intermediate form | Xu et al. [89] | BiLSTM with column attention | SQL | Generic sketch templates; 9%-13% higher accuracy than all direct-generation methods
Intermediate form | Shin et al. [94] | GPT-3 with prompts | λ-calculus expressions | Generates paraphrases by fine-tuning a pre-trained model
Decomposition and combination | Zhong et al. [32] | Pointer network with reinforcement learning | SQL | Uses separate attention inside the model to generate each clause of the logical form; nearly 20% higher accuracy than direct generation
Decomposition and combination | Lindemann et al. [97] | Decomposition algorithm with BERT | AMR | Suited to semantic parsing across graphbanks
Decomposition and combination | Li et al. [99] | Shift-reduce algorithm with Bi-GRU decoder | Logical forms | Simple algorithm and model with high accuracy
Table 2  Comparison of representative neural semantic parsing methods
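The decomposition-and-combination family in Table 2 can be illustrated with a toy: instead of emitting a whole query token by token, separate components predict each clause independently, and the clauses are then composed. The "predictors" below are hand-written lookups standing in for the per-clause neural modules of, e.g., Seq2SQL (ref [32]); the table name and schema are invented.

```python
def predict_select(question: str) -> str:
    """Stand-in for the SELECT-clause predictor."""
    return "SELECT COUNT(*)" if question.startswith("how many") else "SELECT *"

def predict_where(question: str, schema: dict) -> str:
    """Stand-in for the WHERE-clause predictor: match schema
    values mentioned in the question."""
    for col, values in schema.items():
        for v in values:
            if v in question:
                return f"WHERE {col} = '{v}'"
    return ""

def compose(question: str, table: str, schema: dict) -> str:
    """Combine the independently predicted clauses into one query."""
    parts = [predict_select(question), f"FROM {table}",
             predict_where(question, schema)]
    return " ".join(p for p in parts if p)

query = compose("how many jobs require bscs", "jobs", {"req_deg": ["bscs"]})
# → "SELECT COUNT(*) FROM jobs WHERE req_deg = 'bscs'"
```

Because each clause is predicted against a small output space, errors do not compound across the whole sequence, which is the intuition behind the accuracy gains the table reports for this family.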
[1] Kamath A, Das R. A Survey on Semantic Parsing[OL]. arXiv Preprint, arXiv:1812.00978.
[2] Lee C, Gottschlich J, Roth D. A Survey on Semantic Parsing for Machine Programming[C]// Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. 2021.
[3] Yang J F, Jiang H M, Yin Q Y, et al. SEQZERO: Few-Shot Compositional Semantic Parsing with Sequential Prompts and Zero-Shot Models[C]// Findings of the Association for Computational Linguistics:NAACL 2022. 2022: 49-60.
[4] Iyer S, Cheung A, Zettlemoyer L. Learning Programmatic Idioms for Scalable Semantic Parsing[C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. 2019: 5426-5435.
[5] Dong L, Lapata M. Language to Logical Form with Neural Attention[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1:Long Papers). 2016: 33-43.
[6] Krishnamurthy J, Dasigi P, Gardner M. Neural Semantic Parsing with Type Constraints for Semi-Structured Tables[C]// Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. 2017: 1516-1526.
[7] Bengio Y, Ducharme R, Vincent P, et al. A Neural Probabilistic Language Model[J]. The Journal of Machine Learning Research, 2003, 3: 1137-1155.
[8] Wang Y, Liu X J, Shi S M. Deep Neural Solver for Math Word Problems[C]// Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. 2017: 845-854.
[9] Mikolov T, Chen K, Corrado G, et al. Efficient Estimation of Word Representations in Vector Space[OL]. arXiv Preprint, arXiv:1301.3781.
[10] Pennington J, Socher R, Manning C. GloVe: Global Vectors for Word Representation[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. 2014: 1532-1543.
[11] Peters M E, Neumann M, Iyyer M, et al. Deep Contextualized Word Representations[C]// Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. 2018, 1: 2227-2237.
[12] Radford A, Narasimhan K, Salimans T, et al. Improving Language Understanding by Generative Pre-training[R]. OpenAI Technical Report, 2018.
[13] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1. 2019: 4171-4186.
[14] Liu Y H, Ott M, Goyal N, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach[OL]. arXiv Preprint, arXiv:1907.11692.
[15] Lewis M, Liu Y H, Goyal N, et al. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension[OL]. arXiv Preprint, arXiv:1910.13461.
[16] Le Q, Mikolov T. Distributed Representations of Sentences and Documents[C]// Proceedings of the 31st International Conference on Machine Learning. 2014.
[17] Dai A M, Olah C, Le Q V. Document Embedding with Paragraph Vectors[OL]. arXiv Preprint, arXiv:1507.07998.
[18] Zhang M H, Wu Y F, Li W K, et al. Learning Universal Sentence Representations with Mean-Max Attention Autoencoder[C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018: 4514-4523.
[19] Wu L F, Yen I E H, Xu K, et al. Word Mover’s Embedding: From Word2Vec to Document Embedding[C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018: 4524-4534.
[20] Vinyals O, Fortunato M, Jaitly N. Pointer Networks[OL]. arXiv Preprint, arXiv:1506.03134.
[21] Radford A, Wu J, Child R, et al. Language Models Are Unsupervised Multitask Learners[R]. OpenAI Blog, 2019.
[22] Brown T B, Mann B, Ryder N, et al. Language Models Are Few-Shot Learners[C]// Proceedings of the 34th International Conference on Neural Information Processing Systems. 2020: 1877-1901.
[23] Raffel C, Shazeer N, Roberts A, et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer[J]. The Journal of Machine Learning Research, 2020, 21(1): 5485-5551.
[24] Das D, Schneider N, Chen D S, et al. Probabilistic Frame-Semantic Parsing[C]// Proceedings of the 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics. 2010: 948-956.
[25] Grefenstette E, Blunsom P, de Freitas N, et al. A Deep Architecture for Semantic Parsing[C]// Proceedings of the ACL 2014 Workshop on Semantic Parsing. 2014: 22-27.
[26] 孔令富, 高胜男, 吴培良. 面向室内服务的中文语音指令深层信息解析系统[J]. 高技术通讯, 2014, 24(11): 1101-1107.
[26] (Kong Lingfu, Gao Shengnan, Wu Peiliang. An Indoor Service System for Parsing of Deep Information from Chinese Voice Commands[J]. Chinese High Technology Letters, 2014, 24(11): 1101-1107.)
[27] Jia R, Liang P. Data Recombination for Neural Semantic Parsing[OL]. arXiv Preprint, arXiv:1606.03622.
[28] Mei H Y, Bansal M, Walter M R. Listen, Attend, and Walk: Neural Mapping of Navigational Instructions to Action Sequences[C]// Proceedings of the 30th AAAI Conference on Artificial Intelligence. 2016: 2772-2778.
[29] Yih W T, He X D, Meek C. Semantic Parsing for Single-Relation Question Answering[C]// Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2:Short Papers). 2014: 643-648.
[30] Babu A R, Shrivastava A, Aghajanyan A, et al. Non-autoregressive Semantic Parsing for Compositional Task-Oriented Dialog[C]// Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. 2021: 2969-2978.
[31] Ling W, Grefenstette E, Hermann K M, et al. Latent Predictor Networks for Code Generation[OL]. arXiv Preprint, arXiv:1603.06744.
[32] Zhong V, Xiong C M, Socher R. Seq2SQL: Generating Structured Queries from Natural Language Using Reinforcement Learning[OL]. arXiv Preprint, arXiv:1709.00103.
[33] Vaswani A, Shazeer N, Parmar N, et al. Attention is All You Need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017: 6000-6010.
[34] Hwang W, Yim J, Park S, et al. A Comprehensive Exploration on WikiSQL with Table-Aware Word Contextualization[OL]. arXiv Preprint, arXiv:1902.01069.
[35] Rongali S, Soldaini L, Monti E, et al. Don’t Parse, Generate! A Sequence to Sequence Architecture for Task-Oriented Semantic Parsing[C]// Proceedings of the Web Conference 2020. 2020: 2962-2968.
[36] Lyu Q, Chakrabarti K, Hathi S, et al. Hybrid Ranking Network for Text-to-SQL[OL]. arXiv Preprint, arXiv:2008.04759.
[37] Lee C, Gottschlich J, Roth D. Toward Code Generation: A Survey and Lessons from Semantic Parsing[OL]. arXiv Preprint, arXiv:2105.03317.
[38] Cresswell M J. Formal Philosophy, Selected Papers of Richard Montague[J]. Philosophia, 1976, 6: 193-207.
doi: 10.1007/BF02383265
[39] Carpenter B. Type-Logical Semantics[M]. Cambridge: MIT Press, 1998.
[40] Getoor L, Taskar B. Markov Logic: A Unifying Framework for Statistical Relational Learning[M]. Cambridge: MIT Press, 2007.
[41] Enderton H B. A Mathematical Introduction to Logic[M]. Amsterdam: Elsevier, 2001.
[42] Poon H, Domingos P. Unsupervised Semantic Parsing[C]// Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: Volume 1. 2009: 1-10.
[43] Artzi Y, Zettlemoyer L. Bootstrapping Semantic Parsers from Conversations[C]// Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing. 2011: 421-432.
[44] Krishnamurthy J, Mitchell T M. Weakly Supervised Training of Semantic Parsers[C]// Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. 2012: 754-765.
[45] Liang P. Lambda Dependency-Based Compositional Semantics[OL]. arXiv Preprint, arXiv:1309.4408.
[46] Kwiatkowski T, Zettlemoyer L, Goldwater S, et al. Inducing Probabilistic CCG Grammars from Logical Form with Higher-Order Unification[C]// Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing. 2010: 1223-1233.
[47] Wang C, Xue N W, Pradhan S. A Transition-Based Algorithm for AMR Parsing[C]// Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. 2015: 366-375.
[48] Flanigan J, Thomson S, Carbonell J, et al. A Discriminative Graph-Based Parser for the Abstract Meaning Representation[C]// Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. 2014: 1426-1436.
[49] Konstas I, Iyer S, Yatskar M, et al. Neural AMR: Sequence-to-Sequence Models for Parsing and Generation[OL]. arXiv Preprint, arXiv:1704.08381.
[50] Hershcovich D, Abend O, Rappoport A. A Transition-Based Directed Acyclic Graph Parser for UCCA[OL]. arXiv Preprint, arXiv:1704.00552.
[51] Shi S M, Wang Y, Lin C Y, et al. Automatically Solving Number Word Problems by Semantic Parsing and Reasoning[C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015: 1132-1142.
[52] Fillmore C J, Johnson C R, Petruck M R L. Background to FrameNet[J]. International Journal of Lexicography, 2003, 16(3): 235-250.
doi: 10.1093/ijl/16.3.235
[53] Fillmore C J. Chapter 10 Frame Semantics[A]//Cognitive Linguistics: Basic Readings[M]. Berlin: de Gruyter Mouton, 2006: 373-400.
[54] Fillmore C J, Baker C. A Frames Approach to Semantic Analysis[A]//The Oxford Handbook of Linguistic Analysis[M]. New York: Oxford University Press, 2012: 313-339.
[55] Gupta S, Shah R, Mohit M, et al. Semantic Parsing for Task Oriented Dialog Using Hierarchical Representations[OL]. arXiv Preprint, arXiv:1810.07942.
[56] Chen X L, Ghoshal A, Mehdad Y, et al. Low-Resource Domain Adaptation for Compositional Task-Oriented Semantic Parsing[C]// Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. 2020: 5090-5100.
[57] Desai S, Aly A. Diagnosing Transformers in Task-Oriented Semantic Parsing[C]// Findings of the Association for Computational Linguistics:ACL-IJCNLP 2021. 2021: 57-62.
[58] Liang P. Learning Executable Semantic Parsers for Natural Language Understanding[J]. Communications of the ACM, 2016, 59(9): 68-76.
[59] Banarescu L, Bonial C, Cai S, et al. Abstract Meaning Representation for Sembanking[C]// Proceedings of the 7th Linguistic Annotation Workshop and Interoperability with Discourse. 2013: 178-186.
[60] Abend O, Rappoport A. Universal Conceptual Cognitive Annotation (UCCA)[C]// Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. 2013: 228-238.
[61] White A S, Reisinger D, Sakaguchi K, et al. Universal Decompositional Semantics on Universal Dependencies[C]// Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2016: 1713-1723.
[62] Abend O, Rappoport A. The State of the Art in Semantic Representation[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017: 77-89.
[63] Wang Y S, Berant J, Liang P. Building a Semantic Parser Overnight[C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. 2015: 1332-1342.
[64] Chen D L, Mooney R J. Learning to Interpret Natural Language Navigation Instructions from Observations[C]// Proceedings of the 25th AAAI Conference on Artificial Intelligence. 2011: 859-865.
[65] Sutskever I, Vinyals O, Le Q V. Sequence to Sequence Learning with Neural Networks[OL]. arXiv Preprint, arXiv:1409.3215.
[66] Zettlemoyer L S, Collins M. Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars[OL]. arXiv Preprint, arXiv:1207.1420.
[67] 谢德峰, 吉建民. 融入句法感知表示进行句法增强的语义解析[J]. 计算机应用, 2021, 41(9): 2489-2495.
doi: 10.11772/j.issn.1001-9081.2020111863
[67] (Xie Defeng, Ji Jianmin. Syntax-Enhanced Semantic Parsing with Syntax-Aware Representation[J]. Journal of Computer Applications, 2021, 41(9): 2489-2495.)
doi: 10.11772/j.issn.1001-9081.2020111863
[68] 邓庆康, 李晓林. 采用BERT-BiLSTM-CRF模型的中文位置语义解析[J]. 软件导刊, 2022, 21(2): 37-42.
[68] (Deng Qingkang, Li Xiaolin. Semantic Analysis of Chinese Location Based on BERT-BiLSTM-CRF Model[J]. Software Guide, 2022, 21(2): 37-42.)
[69] 王鑫雷, 李帅驰, 杨志豪, 等. 基于预训练语言模型的中文知识图谱问答系统[J]. 山西大学学报(自然科学版), 2020, 43(4): 955-962.
[69] (Wang Xinlei, Li Shuaichi, Yang Zhihao, et al. Chinese Knowledge Base Question Answering System Based on Pre-trained Language Model[J]. Journal of Shanxi University(Natural Science Edition), 2020, 43(4): 955-962.)
[70] 范红杰, 李雪冬, 叶松涛. 面向电子病历语义解析的疾病辅助诊断方法[J]. 计算机科学, 2022, 49(1): 153-158.
doi: 10.11896/jsjkx.201100125
[70] (Fan Hongjie, Li Xuedong, Ye Songtao. Aided Disease Diagnosis Method for EMR Semantic Analysis[J]. Computer Science, 2022, 49(1): 153-158.)
doi: 10.11896/jsjkx.201100125
[71] Zhang Z Y, Han X, Liu Z Y, et al. ERNIE: Enhanced Language Representation with Informative Entities[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019: 1441-1451.
[72] Kočiský T, Melis G, Grefenstette E, et al. Semantic Parsing with Semi-supervised Sequential Autoencoders[C]// Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2016: 1078-1087.
[73] Shao B, Gong Y Y, Bao J W, et al. Weakly Supervised Multi-task Learning for Semantic Parsing[C]// Proceedings of the 28th International Joint Conference on Artificial Intelligence. 2019: 3375-3381.
[74] Bahdanau D, Cho K, Bengio Y. Neural Machine Translation by Jointly Learning to Align and Translate[OL]. arXiv Preprint, arXiv:1409.0473.
[75] Wang L, Zhang D X, Zhang J P, et al. Template-Based Math Word Problem Solvers with Recursive Neural Networks[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence and the 31st Innovative Applications of Artificial Intelligence Conference and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence. 2019: 7144-7151.
[76] Huang D Q, Liu J, Lin C Y, et al. Neural Math Word Problem Solver with Reinforcement Learning[C]// Proceedings of the 27th International Conference on Computational Linguistics. 2018: 213-223.
[77] Yin P C, Neubig G. A Syntactic Neural Model for General-Purpose Code Generation[OL]. arXiv Preprint, arXiv:1704.01696.
[78] Bogin B, Berant J, Gardner M. Representing Schema Structure with Graph Neural Networks for Text-to-SQL Parsing[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019: 4560-4565.
[79] Li Y J, Tarlow D, Brockschmidt M, et al. Gated Graph Sequence Neural Networks[OL]. arXiv Preprint, arXiv:1511.05493.
[80] Sorokin D, Gurevych I. Modeling Semantics with Gated Graph Neural Networks for Knowledge Base Question Answering[C]// Proceedings of the 27th International Conference on Computational Linguistics. 2018: 3306-3317.
[81] Wang B L, Shin R, Liu X D, et al. RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 7567-7578.
[82] Yu T, Zhang R, Yang K, et al. Spider: A Large-Scale Human-Labeled Dataset for Complex and Cross-Domain Semantic Parsing and Text-to-SQL Task[C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018: 3911-3921.
[83] Yavuz S, Gur I, Su Y, et al. Improving Semantic Parsing via Answer Type Inference[C]// Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2016: 149-159.
[84] Shaw P, Massey P, Chen A, et al. Generating Logical Forms from Graph Representations of Text and Entities[OL]. arXiv Preprint, arXiv:1905.08407.
[85] Solar-Lezama A, Tancau L, Bodik R, et al. Combinatorial Sketching for Finite Programs[C]// Proceedings of the 12th International Conference on Architectural Support for Programming Languages and Operating Systems. 2006: 404-415.
[86] Nye M, Hewitt L, Tenenbaum J, et al. Learning to Infer Program Sketches[C]// Proceedings of the 36th International Conference on Machine Learning. 2019.
[87] Dong L, Lapata M. Coarse-to-Fine Decoding for Neural Semantic Parsing[OL]. arXiv Preprint, arXiv:1805.04793.
[88] 赵睿卓, 高金华, 孙晓茜, 等. 基于树形语义框架的神经语义解析方法[J]. 中文信息学报, 2021, 35(1): 9-16.
[88] (Zhao Ruizhuo, Gao Jinhua, Sun Xiaoqian, et al. Learning Tree-Structured Sketch for Neural Semantic Parsing[J]. Journal of Chinese Information Processing, 2021, 35(1): 9-16.)
[89] Xu X J, Liu C, Song D. SQLNet: Generating Structured Queries from Natural Language without Reinforcement Learning[OL]. arXiv Preprint, arXiv: 1711.04436.
[90] Li Z C, Lai Y X, Xie Y X, et al. A Sketch-Based System for Semantic Parsing[C]// Proceedings of the CCF International Conference on Natural Language Processing and Chinese Computing. 2019: 748-759.
[91] Ye X, Chen Q C, Wang X Y, et al. Sketch-Driven Regular Expression Generation from Natural Language and Examples[J]. Transactions of the Association for Computational Linguistics, 2020, 8: 679-694.
doi: 10.1162/tacl_a_00339
[92] Berant J, Liang P. Semantic Parsing via Paraphrasing[C]// Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. 2014: 1415-1425.
[93] Berant J, Chou A, Frostig R, et al. Semantic Parsing on Freebase from Question-Answer Pairs[C]// Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing. 2013: 1533-1544.
[94] Shin R, Lin C H, Thomson S, et al. Constrained Language Models Yield Few-Shot Semantic Parsers[OL]. arXiv Preprint, arXiv: 2104.08768.
[95] 李青, 钟将, 李立力, 等. 基于预训练机制的自修正复杂语义分析方法[J]. 通信学报, 2019, 40(12): 41-50.
doi: 10.11959/j.issn.1000-436x.2019195
[95] (Li Qing, Zhong Jiang, Li Lili, et al. Self-Correcting Complex Semantic Analysis Method Based on Pre-training Mechanism[J]. Journal on Communications, 2019, 40(12): 41-50.)
doi: 10.11959/j.issn.1000-436x.2019195
[96] 李青, 钟将, 李立力, 等. 一种依需聚合的语义解析图查询模型[J]. 电子学报, 2020, 48(4): 763-771.
doi: 10.3969/j.issn.0372-2112.2020.04.018
[96] (Li Qing, Zhong Jiang, Li Lili, et al. Semantic Parsing Graph Query Model for On-demand Aggregation[J]. Acta Electronica Sinica, 2020, 48(4): 763-771.)
doi: 10.3969/j.issn.0372-2112.2020.04.018
[97] Lindemann M, Groschwitz J, Koller A. Compositional Semantic Parsing Across Graphbanks[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019: 4576-4585.
[98] Groschwitz J, Lindemann M, Fowlie M, et al. AMR Dependency Parsing with a Typed Semantic Algebra[OL]. arXiv Preprint, arXiv:1805.11465.
[99] Li Y T, Chen B, Liu Q, et al. Keep the Structure: A Latent Shift-Reduce Parser for Semantic Parsing[C]// Proceedings of the 30th International Joint Conference on Artificial Intelligence. 2021: 3864-3870.
[100] Han X, Zhang Z Y, Ding N, et al. Pre-trained Models: Past, Present and Future[J]. AI Open, 2021, 2: 225-250.
doi: 10.1016/j.aiopen.2021.08.002
[101] Liang Z W, Zhang J P, Shao J, et al. MWP-BERT: A Strong Baseline for Math Word Problems[OL]. arXiv Preprint, arXiv: 2107.13435.
[102] Xu S L, Semnani S J, Campagna G, et al. AutoQA: From Databases to QA Semantic Parsers with Only Synthetic Training Data[OL]. arXiv Preprint, arXiv: 2010.04806.
[103] Wu S, Chen B, Xin C L, et al. From Paraphrasing to Semantic Parsing: Unsupervised Semantic Parsing via Synchronous Semantic Decoding[OL]. arXiv Preprint, arXiv: 2106.06228.
[104] Rongali S, Arkoudas K, Rubino M, et al. Training Naturalized Semantic Parsers with Very Little Data[OL]. arXiv Preprint, arXiv: 2204.14243.
[105] Schucher N, Reddy S, de Vries H. The Power of Prompt Tuning for Low-Resource Semantic Parsing[OL]. arXiv Preprint, arXiv: 2110.08525.
[106] Shin R, van Durme B. Few-Shot Semantic Parsing with Language Models Trained on Code[OL]. arXiv Preprint, arXiv: 2112.08696.
[107] Sun W Q, Khan H, des Mesnards N G, et al. Unfreeze with Care: Space-Efficient Fine-Tuning of Semantic Parsing Models[C]// Proceedings of the ACM Web Conference 2022. 2022: 999-1007.
[108] Li X L, Liang P. Prefix-Tuning: Optimizing Continuous Prompts for Generation[C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1:Long Papers). 2021: 4582-4597.
[109] Zaken E B, Ravfogel S, Goldberg Y. BitFit: Simple Parameter-Efficient Fine-Tuning for Transformer-Based Masked Language-Models[C]// Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2:Short Papers). 2021: 1-9.
[110] Guo J Q, Liu Q, Lou J G, et al. Benchmarking Meaning Representations in Neural Semantic Parsing[C]// Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. 2020: 1520-1540.
[111] Sun Y B, Tang D Y, Duan N, et al. Semantic Parsing with Syntax-and Table-Aware SQL Generation[C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018: 361-372.
[112] Iyer S, Konstas I, Cheung A, et al. Learning a Neural Semantic Parser from User Feedback[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017: 963-973.
[113] 左敏, 徐泽龙, 张青川, 等. 基于双维度中文语义分析的食品领域知识库问答[J]. 郑州大学学报(工学版), 2020, 41(3): 8-13.
[113] (Zuo Min, Xu Zelong, Zhang Qingchuan, et al. A Question Answering Model of Food Domain Knowledge Bases with Two-Dimension Chinese Semantic Analysis[J]. Journal of Zhengzhou University (Engineering Science), 2020, 41(3): 8-13.)
[114] Bobrow D G. Natural Language Input for a Computer Problem Solving System[M]. Cambridge: Massachusetts Institute of Technology, 1964.
[115] Charniak E. Computer Solution of Calculus Word Problems[C]// Proceedings of the 1st International Joint Conference on Artificial Intelligence. 1969: 303-316.
[116] Hosseini M J, Hajishirzi H, Etzioni O, et al. Learning to Solve Arithmetic Word Problems with Verb Categorization[C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. 2014: 523-533.
[117] Kushman N, Artzi Y, Zettlemoyer L, et al. Learning to Automatically Solve Algebra Word Problems[C]// Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. 2014: 271-281.
[118] Zelle J M, Mooney R J. Learning to Parse Database Queries Using Inductive Logic Programming[C]// Proceedings of the 13th National Conference on Artificial Intelligence - Volume 2. 1996: 1050-1055.
[119] Quirk C, Mooney R, Galley M. Language to Code: Learning Semantic Parsers for If-This-Then-That Recipes[C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. 2015: 878-888.
[120] Yao Z Y, Li X J, Gao J F, et al. Interactive Semantic Parsing for If-Then Recipes Via Hierarchical Reinforcement Learning[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. 2019: 2547-2554.
[121] Chen X Y, Liu C, Shin R, et al. Latent Attention for If-Then Program Synthesis[C]// Proceedings of the 30th International Conference on Neural Information Processing Systems. 2016: 4581-4589.
[122] Rabinovich M, Stern M, Klein D. Abstract Syntax Networks for Code Generation and Semantic Parsing[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017: 1139-1149.
[123] Aho A V, Lam M S, Sethi R, et al. Compilers: Principles, Techniques, & Tools[M]. The 2nd Edition. Boston: Pearson Addison-Wesley, 2007.
[124] Yin P C, Neubig G. A Syntactic Neural Model for General-Purpose Code Generation[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017: 440-450.
[125] Yin P C, Neubig G. TRANX: A Transition-Based Neural Abstract Syntax Parser for Semantic Parsing and Code Generation[OL]. arXiv Preprint, arXiv: 1810.02720.
[126] Platanios E A, Pauls A, Roy S, et al. Value-Agnostic Conversational Semantic Parsing[C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing. 2021: 3666-3681.
[127] Feng Z Y, Guo D Y, Tang D Y, et al. CodeBERT: A Pre-trained Model for Programming and Natural Languages[C]// Findings of the Association for Computational Linguistics:EMNLP 2020. 2020: 1536-1547.
[128] Jiang X, Zheng Z R, Lyu C, et al. TreeBERT: A Tree-Based Pre-trained Model for Programming Language[C]// Proceedings of the 37th Conference on Uncertainty in Artificial Intelligence. 2021: 54-63.
[129] Min Q K, Shi Y F, Zhang Y. A Pilot Study for Chinese SQL Semantic Parsing[C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. 2019: 3652-3658.
[130] Campagna G, Semnani S, Kearns R, et al. A Few-Shot Semantic Parser for Wizard-of-Oz Dialogues with the Precise ThingTalk Representation[C]// Findings of the Association for Computational Linguistics:ACL 2022. 2022: 4021-4034.
[131] Hui B Y, Geng R Y, Ren Q Y, et al. Dynamic Hybrid Relation Exploration Network for Cross-Domain Context-Dependent Semantic Parsing[C]// Proceedings of the 35th AAAI Conference on Artificial Intelligence. 2021: 13116-13124.