Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity

Jagfeld, Glorianna and Jenne, Sabrina and Vu, Ngoc Thang (2018) Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity. In: Proceedings of the 11th International Natural Language Generation Conference. Association for Computational Linguistics, NLD, pp. 221-232. ISBN 9781948087865

Full text not available from this repository.

Abstract

We present a comparison of word-based and character-based sequence-to-sequence models for data-to-text natural language generation, which generate natural language descriptions for structured inputs. On the datasets of two recent generation challenges, our models achieve comparable or better automatic evaluation results than the best challenge submissions. Subsequent detailed statistical and human analyses shed light on the differences between the two input representations and the diversity of the generated texts. In a controlled experiment with synthetic training data generated from templates, we demonstrate the ability of neural models to learn novel combinations of the templates and thereby generalize beyond the linguistic structures they were trained on.
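As a minimal sketch (not taken from the paper), the difference between the two input representations compared in the abstract can be illustrated by how a structured input string is segmented into symbols before being fed to a sequence-to-sequence model. The meaning-representation string below is a hypothetical example; word-based processing yields a short sequence over a large vocabulary, while character-based processing yields a long sequence over a small vocabulary.

```python
def word_tokens(text):
    """Word-level symbols: split the input on whitespace."""
    return text.split()

def char_tokens(text):
    """Character-level symbols: every character, spaces included."""
    return list(text)

# Hypothetical structured input in a key[value] style meaning representation.
mr = "name[Blue Spice] food[Chinese]"

words = word_tokens(mr)   # few symbols, large vocabulary
chars = char_tokens(mr)   # many symbols, small vocabulary
print(len(words), len(chars))
```

The trade-off sketched here (sequence length vs. vocabulary size) is the axis along which the paper's statistical and human analyses compare the two model variants.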

Item Type:
Contribution in Book/Report/Proceedings
ID Code:
130535
Deposited By:
Deposited On:
17 Jan 2019 15:15
Refereed?:
Yes
Published?:
Published
Last Modified:
23 Sep 2020 07:03