Télécom Paris - 02/12/2019

MLAI 2019 #11. Lingfei Wu: Graph-to-Sequence Learning in Natural Language Processing

Speaker: Lingfei Wu, Research Staff Member at the IBM AI Foundations Labs

Abstract: The celebrated Seq2Seq technique and its numerous variants achieve excellent performance on many tasks such as neural machine translation, natural language generation, speech recognition, and drug discovery. Despite their flexibility and expressive power, a significant limitation of Seq2Seq models is that they can only be applied to problems whose inputs are represented as sequences. However, sequences are arguably the simplest form of structured data, and many important problems are best expressed with a more complex structure such as a graph. On the one hand, graph-structured data can encode complicated pairwise relationships for learning more informative representations; on the other hand, the structural and semantic information in sequence data can be exploited to augment the original sequences by incorporating domain-specific knowledge.

To cope with complex graph-structured inputs, we propose Graph2Seq, a novel attention-based neural network architecture for graph-to-sequence learning. Graph2Seq can be viewed as a generalization of the Seq2Seq model to graph inputs: it is a general end-to-end neural encoder-decoder architecture that encodes an input graph and decodes the target sequence. In this talk, I will first introduce our Graph2Seq model and then discuss how to apply it to different NLP tasks. In particular, we illustrate the advantages of Graph2Seq over various Seq2Seq and Tree2Seq models in two of our recent works: "Exploiting Rich Syntactic Information for Semantic Parsing with Graph-to-Sequence Model" (EMNLP 2018) and "SQL-to-Text Generation with Graph-to-Sequence Model" (EMNLP 2018).
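
To make the architecture concrete, below is a minimal sketch, in PyTorch, of a Graph2Seq-style encoder-decoder: a graph encoder that builds node embeddings by mean-aggregating neighbor features (a GraphSAGE-style scheme) and pools them into a graph vector, plus a GRU decoder that attends over the node embeddings at each step. All class names, dimensions, and the aggregation choice are illustrative assumptions, not the exact implementation from the papers.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphEncoder(nn.Module):
    # Builds node embeddings by repeatedly mixing each node's feature with the
    # mean of its neighbors' features, then mean-pools the node embeddings
    # into a single graph-level vector (an assumed, simplified aggregation).
    def __init__(self, dim, hops=2):
        super().__init__()
        self.linears = nn.ModuleList(nn.Linear(2 * dim, dim) for _ in range(hops))

    def forward(self, x, adj):
        # x: (num_nodes, dim) node features; adj: (num_nodes, num_nodes) 0/1 matrix
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        for lin in self.linears:
            neigh = (adj @ x) / deg                  # mean over each node's neighbors
            x = torch.relu(lin(torch.cat([x, neigh], dim=-1)))
        return x, x.mean(dim=0)                      # node embeddings, graph vector

class AttnDecoder(nn.Module):
    # One GRU decoding step that attends over all node embeddings, in the
    # spirit of the attention-based decoder described in the talk.
    def __init__(self, dim, vocab_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.gru = nn.GRUCell(2 * dim, dim)
        self.out = nn.Linear(dim, vocab_size)

    def step(self, token, hidden, node_embs):
        # token: scalar LongTensor; hidden: (dim,); node_embs: (num_nodes, dim)
        weights = F.softmax(node_embs @ hidden, dim=0)   # attention over nodes
        context = weights @ node_embs                    # (dim,) context vector
        inp = torch.cat([self.embed(token), context]).unsqueeze(0)
        hidden = self.gru(inp, hidden.unsqueeze(0)).squeeze(0)
        return self.out(hidden), hidden                  # logits, new state

# Toy usage: encode a random 5-node graph, then take one decoding step,
# initializing the decoder state with the pooled graph vector.
x, adj = torch.randn(5, 64), (torch.rand(5, 5) > 0.5).float()
encoder, decoder = GraphEncoder(64), AttnDecoder(64, vocab_size=100)
node_embs, graph_vec = encoder(x, adj)
logits, state = decoder.step(torch.tensor(0), graph_vec, node_embs)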

See other videos from the MLAI workshop at www.mlai-workshop.org