a1 Toshiba (China) Research and Development Center, 5/F., Tower W2, Oriental Plaza, Dongcheng District, Beijing, 100738, China
a2 Baidu, Inc., Baidu Campus, No. 10, Shangdi 10th Street, Haidian District, Beijing, 100085, China
a3 NCLT/CNGL, School of Computing, Dublin City University, Glasnevin, Dublin 9, Ireland
This paper presents a general-purpose, wide-coverage, probabilistic sentence generator based on dependency n-gram models. This is particularly interesting as many semantic or abstract syntactic input specifications for sentence realisation can be represented as labelled bi-lexical dependencies or typed predicate-argument structures. Our generation method captures the mapping between semantic representations and surface forms by linearising a set of dependencies directly, rather than via the application of grammar rules as in more traditional chart-style or unification-based generators. In contrast to conventional n-gram language models over surface word forms, we exploit structural information and various linguistic features inherent in the dependency representations to constrain the generation space and improve the generation quality. A series of experiments shows that dependency-based n-gram models generalise well to different languages (English and Chinese) and representations (LFG and CoNLL). Compared with state-of-the-art generation systems, our general-purpose sentence realiser is highly competitive with the added advantages of being simple, fast, robust and accurate.
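The abstract describes realisation as linearising a set of dependencies directly, choosing the surface order that an n-gram model scores highest. A minimal sketch of that idea, using a toy bigram table and exhaustive permutation search (all names, words, and counts here are hypothetical, for illustration only; the paper's actual models condition on dependency labels and other structural features, not plain surface bigrams):

```python
from itertools import permutations

# Hypothetical bigram counts standing in for a trained n-gram model.
BIGRAMS = {
    ("<s>", "the"): 3, ("the", "dog"): 3, ("dog", "barked"): 2,
    ("barked", "loudly"): 2, ("loudly", "</s>"): 2,
}

def score(words):
    """Sum bigram counts over the padded word sequence."""
    toks = ["<s>"] + words + ["</s>"]
    return sum(BIGRAMS.get(pair, 0) for pair in zip(toks, toks[1:]))

def linearise(head, dependents):
    """Return the highest-scoring surface ordering of a head and its dependents."""
    candidates = permutations([head] + dependents)
    return list(max(candidates, key=lambda p: score(list(p))))
```

Calling `linearise("barked", ["dog", "the", "loudly"])` recovers the ordering `["the", "dog", "barked", "loudly"]` under these toy counts. Exhaustive permutation is factorial in the number of dependents; a practical realiser would prune the search space using the structural constraints the abstract mentions.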
(Received March 29 2010)
(Revised September 15 2010)
(Accepted October 25 2010)
(Online publication November 29 2010)