
Neural-Symbolic Learning for Semantic Parsing

Abstract: Our goal in this thesis is to build a system that answers a natural language question (NL) by representing its semantics as a logical form (LF) and then computing the answer by executing the LF over a knowledge base. The core part of such a system is the semantic parser that maps questions to logical forms. Our focus is on how to build high-performance semantic parsers by learning from (NL, LF) pairs. We propose to combine recurrent neural networks (RNNs) with symbolic prior knowledge expressed through context-free grammars (CFGs) and automata. By integrating CFGs over LFs into the RNN training and inference processes, we guarantee that the generated logical forms are well-formed; by integrating, through weighted automata, prior knowledge over the presence of certain entities in the LF, we further enhance the performance of our models. Experimentally, we show that our approach achieves better performance than both previous semantic parsers that do not use neural networks and RNNs not informed by such prior knowledge.
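
The abstract describes constraining an RNN decoder with a CFG so that only well-formed logical forms can be generated. The sketch below illustrates that idea under stated assumptions: a toy grammar, greedy decoding, and a random stand-in scorer in place of the thesis' actual RNN; the productions and names such as `allowed_next` and `mock_rnn_scores` are hypothetical and purely illustrative.

```python
# Minimal sketch of grammar-constrained decoding (hypothetical grammar and scorer,
# not the thesis' actual model): at each step the neural scorer proposes token
# scores, and a CFG-derived mask keeps only tokens that extend a well-formed LF.
import random

# Toy CFG over logical-form tokens, written as production rules.
# LF  -> ( REL ENT )
# REL -> capital | population
# ENT -> france | germany
GRAMMAR = {
    "LF":  [["(", "REL", "ENT", ")"]],
    "REL": [["capital"], ["population"]],
    "ENT": [["france"], ["germany"]],
}
TERMINALS = {"(", ")", "capital", "population", "france", "germany"}

def allowed_next(stack):
    """Terminals that may legally come next, given the pending symbols (top of stack last)."""
    if not stack:
        return set()
    top = stack[-1]
    if top in TERMINALS:
        return {top}
    allowed = set()
    for rule in GRAMMAR[top]:
        # The first symbol of each expansion determines the possible next terminals.
        first = rule[0]
        allowed |= {first} if first in TERMINALS else allowed_next([first])
    return allowed

def step_stack(stack, token):
    """Consume `token`, expanding nonterminals until the stack top matches it."""
    while stack and stack[-1] not in TERMINALS:
        top = stack.pop()
        for rule in GRAMMAR[top]:
            first = rule[0]
            if token in ({first} if first in TERMINALS else allowed_next([first])):
                stack.extend(reversed(rule))
                break
    assert stack and stack[-1] == token, "token not derivable from grammar state"
    stack.pop()
    return stack

def mock_rnn_scores(prefix):
    """Stand-in for the RNN: assigns a score to every vocabulary token."""
    rng = random.Random(len(prefix))
    return {tok: rng.random() for tok in TERMINALS}

def constrained_greedy_decode():
    stack, output = ["LF"], []
    while stack:
        scores = mock_rnn_scores(output)
        valid = allowed_next(stack)
        # The grammar mask guarantees the decoded sequence is well-formed under the CFG.
        token = max(valid, key=lambda t: scores[t])
        output.append(token)
        stack = step_stack(stack, token)
    return output

print(" ".join(constrained_greedy_decode()))  # e.g. "( capital france )"
```

The same masking step could equally be applied inside beam search or during training; the key design choice is that the grammar restricts the decoder's output space rather than being enforced by post-hoc filtering.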

Cited literature: 103 references

https://tel.archives-ouvertes.fr/tel-01699569
Contributor: Abes Star
Submitted on: Friday, February 2, 2018 - 3:05:06 PM
Last modification on: Tuesday, April 24, 2018 - 1:30:30 PM
Long-term archiving on: Thursday, May 3, 2018 - 1:24:54 PM

File

DDOC_T_2017_0268_XIAO.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-01699569, version 1

Citation

Chunyang Xiao. Neural-Symbolic Learning for Semantic Parsing. Computation and Language [cs.CL]. Université de Lorraine, 2017. English. ⟨NNT : 2017LORR0268⟩. ⟨tel-01699569⟩

Metrics

Record views: 311
File downloads: 878