Publications

Howcroft, David M.

Learning to generate: Bayesian nonparametric approaches to inducing rules for natural language generation

Saarland University, Saarbrücken, Germany, 2021.

For computers to produce natural language texts from non-linguistic information, we need a system for mapping between the two: a Natural Language Generation (NLG) system. We can reduce the difficulty of developing such systems if we leverage machine learning intelligently. While there are many possible approaches to the task, this thesis argues for one in particular, focusing on sentence planning using synchronous grammars and Bayesian nonparametric methods.

We formulate sentence planning rules in terms of Synchronous Tree Substitution Grammars (sTSGs) and implement a series of hierarchical Dirichlet Processes, along with a Gibbs sampler, to learn such rules from appropriate corpora. Because existing corpora do not pair hierarchical, discourse-structured meaning representations with varied texts, we develop a new interface for crowdsourcing NLG training corpora, asking participants to paraphrase pre-existing texts, and use it to collect a new corpus, the Extended SPaRKy Restaurant Corpus (ESRC).
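To make the modeling idea concrete, the following is a minimal illustrative sketch, not the thesis's actual implementation: a Chinese-restaurant-process view of a Dirichlet process over grammar rules, with a single collapsed-Gibbs-style resampling move. The class and function names, the base-distribution stub, and the toy rule identifiers are all hypothetical.

import random
from collections import Counter

class DPRuleCache:
    """Chinese-restaurant-process view of a Dirichlet process over grammar rules."""
    def __init__(self, alpha, base_prob):
        self.alpha = alpha          # concentration parameter
        self.base_prob = base_prob  # P0(rule): prior probability of a rule under the base measure
        self.counts = Counter()     # how often each rule has been "seated" so far
        self.total = 0

    def prob(self, rule):
        # Predictive probability under the CRP: cached count plus base-measure mass.
        return (self.counts[rule] + self.alpha * self.base_prob(rule)) / (self.total + self.alpha)

    def increment(self, rule):
        self.counts[rule] += 1
        self.total += 1

    def decrement(self, rule):
        self.counts[rule] -= 1
        self.total -= 1
        if self.counts[rule] <= 0:
            del self.counts[rule]

def gibbs_resample(cache, current_rule, candidates):
    # One Gibbs-style move: remove the current choice from the cache,
    # score each candidate by its predictive probability, sample, and re-add.
    cache.decrement(current_rule)
    weights = [cache.prob(r) for r in candidates]
    new_rule = random.choices(candidates, weights=weights, k=1)[0]
    cache.increment(new_rule)
    return new_rule

if __name__ == "__main__":
    # Toy usage with a uniform base distribution over three hypothetical rule ids.
    rules = ["R1", "R2", "R3"]
    cache = DPRuleCache(alpha=1.0, base_prob=lambda r: 1.0 / len(rules))
    for r in ["R1", "R1", "R2"]:
        cache.increment(r)
    print(gibbs_resample(cache, "R2", rules))

In a hierarchical Dirichlet Process, the base distribution of one such cache is itself drawn from another Dirichlet Process, which is what allows rule statistics to be shared across related contexts.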

After training our models on pre-existing, lexically-restricted corpora as well as the ESRC, we conduct a series of human evaluations using a novel evaluation interface. This interface enables the assessment of a text's fluency, semantic fidelity, and expression of discourse relations in a single crowdsourcing experiment. While we identify several limitations of our approach, the evaluations suggest that our models can outperform existing neural network models with respect to semantic fidelity and, in some cases, maintain similar levels of fluency.

In addition to these efforts, we present a Dependency Attachment Grammar (DAG) based on Joshi & Rambow (2003) and extend it to the synchronous setting, so that future work can build on its added flexibility relative to sTSGs. Beyond these practically oriented contributions, a psycholinguistic study explores how speakers vary in adapting their utterances to listeners under cognitive load.

This thesis opens up several directions for future research into how best to integrate the various challenging tasks involved in natural language generation and how best to evaluate such systems.
