Neural Headline Generation on Abstract Meaning Representation

Sho Takase1, Jun Suzuki2, Naoaki Okazaki1, Tsutomu Hirao3, Masaaki Nagata4
1Tohoku University, 2NTT CS Lab., 3NTT Communication Science Labs., 4NTT Communication Science Labs.


Abstract

Neural network-based encoder-decoder models are among the most attractive recent methodologies for tackling natural language generation tasks. This paper investigates whether incorporating structural syntactic and semantic information improves a baseline neural attention-based model. We encode the output of an abstract meaning representation (AMR) parser using a modified version of Tree-LSTM. On headline generation benchmarks, our proposed attention-based AMR encoder-decoder model outperforms the baseline neural attention-based model.