Sabrina J. Mielke

Spell Once, Summon Anywhere: A Two-Level Open-Vocabulary Language Model


tl;dr: find the paper on arXiv.

We show how the spellings of known words can help us deal with unknown words in open-vocabulary NLP tasks. The method we propose can be used to extend any closed-vocabulary generative model, but in this paper we specifically consider the case of neural language modeling. Our Bayesian generative story combines a standard RNN language model (generating the word tokens in each sentence) with an RNN-based spelling model (generating the letters in each word type). These two RNNs respectively capture sentence structure and word structure, and are kept separate as in linguistics. By invoking the second RNN to generate spellings for novel words in context, we obtain an open-vocabulary language model. For known words, embeddings are naturally inferred by combining evidence from type spelling and token context. Compared to baselines (including a novel strong baseline), we beat previous work and establish state-of-the-art results on multiple datasets.
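The two-level generative story can be caricatured in a few lines of code. This is a toy sketch only: both levels below are hard-coded unigram distributions standing in for the paper's trained RNNs, and the vocabulary, character set, and probabilities are made up. What it shows is the control flow: the sentence-level model emits word tokens, and whenever it emits an unknown-word token, the spelling model is invoked to generate that word's letters.

```python
import random

random.seed(0)

# Toy stand-ins for the two trained RNNs (illustrative only).
VOCAB = ["the", "cat", "sat", "<unk>", "<eos>"]
TOKEN_PROBS = [0.35, 0.2, 0.2, 0.15, 0.1]   # stand-in for the sentence-level RNN
CHARS = list("abcdefgh") + ["<eow>"]
CHAR_PROBS = [0.1] * 8 + [0.2]              # stand-in for the spelling RNN

def spell_novel_word():
    """Spelling model: generates the letters of a novel word type."""
    letters = []
    while True:
        c = random.choices(CHARS, CHAR_PROBS)[0]
        if c == "<eow>":
            return "".join(letters) or "a"  # avoid an empty spelling
        letters.append(c)

def generate_sentence(max_len=20):
    """Sentence model emits tokens; <unk> hands off to the speller."""
    out = []
    for _ in range(max_len):
        tok = random.choices(VOCAB, TOKEN_PROBS)[0]
        if tok == "<eos>":
            break
        out.append(spell_novel_word() if tok == "<unk>" else tok)
    return " ".join(out)

print(generate_sentence())
```

The key design point carried over from the paper is the separation of concerns: the sentence model never sees characters, and the speller never sees sentence context directly.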

Figure 1 from the paper, showing the generative model.


Bits per character (lower is better) on the dev and test sets of WikiText-2 for our model and baselines, where full refers to our main proposed model and HCLM and HCLMcache refer to Kawakami et al.'s proposed models. All our hybrid models use a vocabulary size of 50000; pure-bpe uses 40000 merges. All pairwise differences except for those between pure-bpe, uncond, and sep-reg are statistically significant (paired permutation test over all 64 articles in the corpus, p < 0.011).

[Table: bits per character broken down by words with training count 0, [1, 100), [100, ∞), and over all words; see the paper for the full table.]
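The significance test mentioned in the caption is a paired permutation (sign-flip) test over per-article score differences. A minimal sketch of that test, assuming each model contributes one bits-per-character number per article:

```python
import random

def paired_permutation_test(a, b, trials=10000, seed=0):
    """Two-sided paired permutation test on per-article score differences.

    a, b: per-article scores (e.g. bits per character) for two models,
    aligned by article. Returns an estimated p-value for the observed
    mean difference under random sign flips of each paired difference.
    """
    rng = random.Random(seed)
    diffs = [x - y for x, y in zip(a, b)]
    observed = abs(sum(diffs) / len(diffs))
    hits = 0
    for _ in range(trials):
        # Under the null, either model is equally likely to win each article,
        # so flipping the sign of each difference is a valid permutation.
        flipped = [d if rng.random() < 0.5 else -d for d in diffs]
        if abs(sum(flipped) / len(flipped)) >= observed:
            hits += 1
    return (hits + 1) / (trials + 1)  # add-one smoothing avoids p = 0
```

With 64 articles, as in WikiText-2, exact enumeration of all 2^64 sign patterns is infeasible, which is why the test is run with random sampling of flips.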

Samples from the speller

Take an in-vocabulary word and sample a spelling from pspell, conditioned on that word's embedding. How close is it to the actual spelling of the word?

original word    sampled spelling
grounded         stipped
differ           coronate
Clive            Dickey
Southport        Strigger
Carl             Wuly
Chants           Tranquels
valuables        migrations
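Mechanically, sampling from the speller works by conditioning a character-level RNN on the word's embedding and sampling one character at a time until an end-of-word symbol. The sketch below uses a tiny vanilla RNN with untrained random weights (so its output is gibberish, unlike the trained speller that produced the samples above); the weight names and the choice to inject the embedding via the initial hidden state are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
CHARS = list("abcdefghijklmnopqrstuvwxyz") + ["<eow>"]
EMB_DIM, HID_DIM, V = 8, 16, len(CHARS)

# Untrained toy parameters; a real speller learns these jointly.
W_xh = rng.normal(0, 0.3, (HID_DIM, V))
W_hh = rng.normal(0, 0.3, (HID_DIM, HID_DIM))
W_eh = rng.normal(0, 0.3, (HID_DIM, EMB_DIM))  # conditions on the word embedding
W_hy = rng.normal(0, 0.3, (V, HID_DIM))

def sample_spelling(embedding, max_len=12):
    """Sample a spelling from a toy char-RNN conditioned on an embedding."""
    h = np.tanh(W_eh @ embedding)  # embedding initializes the hidden state
    x = np.zeros(V)                # no previous character yet
    out = []
    for _ in range(max_len):
        h = np.tanh(W_xh @ x + W_hh @ h)
        logits = W_hy @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()
        idx = rng.choice(V, p=p)
        if CHARS[idx] == "<eow>":
            break
        out.append(CHARS[idx])
        x = np.zeros(V)
        x[idx] = 1.0               # feed the sampled character back in
    return "".join(out)

print(sample_spelling(rng.normal(size=EMB_DIM)))
```

Because the hidden state starts from the embedding, words with similar embeddings tend to receive similar spellings once the speller is trained, which is what the sampled look-alikes in the table hint at.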