Improving Neural Language Models with a Continuous Cache

Edouard Grave, Armand Joulin, Nicolas Usunier


We propose an extension to neural network language models to adapt their prediction to the recent history. Our model is a simplified version of memory augmented networks, which stores past hidden activations as memory and accesses them through a dot product with the current hidden activation.
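The mechanism described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the cache stores past hidden states paired with the word that followed each one, scores them by a dot product with the current hidden state, and linearly interpolates the resulting distribution with the model's own prediction. The names `cache_probs`, `theta` (a flatness parameter), and `lam` (the interpolation weight) are illustrative choices.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - x.max())
    return e / e.sum()

def cache_probs(h_t, cache_h, cache_words, vocab_size, theta=1.0):
    """Cache distribution: each stored hidden state h_i votes for the word
    that followed it, with weight proportional to exp(theta * <h_t, h_i>)."""
    weights = softmax(theta * cache_h @ h_t)   # one weight per cache entry
    p = np.zeros(vocab_size)
    np.add.at(p, cache_words, weights)         # accumulate weights per word
    return p

def mix(p_model, p_cache, lam=0.2):
    """Linear interpolation of the model and cache distributions."""
    return (1.0 - lam) * p_model + lam * p_cache

# Toy example: 3 cached hidden states of dimension 4, vocabulary of 5 words.
rng = np.random.default_rng(0)
d, V = 4, 5
cache_h = rng.standard_normal((3, d))      # past hidden activations
cache_words = np.array([2, 0, 2])          # word that followed each of them
h_t = rng.standard_normal(d)               # current hidden activation
p_model = softmax(rng.standard_normal(V))  # stand-in for the LM's softmax
p = mix(p_model, cache_probs(h_t, cache_h, cache_words, V))
```

Because the cache scoring is just a dot product followed by a softmax over the stored states, the cost per prediction is linear in the cache size, which is what lets the memory scale to thousands of entries without retraining the base model.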
