Recurrent Neural Network Regularization

Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals


We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique for regularizing neural networks, does not work well with RNNs and LSTMs...
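The abstract is truncated here, but the regularization placement associated with this paper is commonly summarized as: apply dropout only to non-recurrent connections (the input each LSTM layer receives from below), never to the hidden state carried across time steps. The sketch below illustrates that placement with a minimal textbook LSTM cell in NumPy; it is an assumption-laden illustration, not the authors' implementation, and all names (`lstm_step`, `run_layer`, `drop_p`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; gate pre-activations stacked as [i, f, o, g]."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i, f, o = sigmoid(z[:n]), sigmoid(z[n:2*n]), sigmoid(z[2*n:3*n])
    g = np.tanh(z[3*n:])
    c = f * c + i * g          # recurrent cell state, never dropped
    h = o * np.tanh(c)         # recurrent hidden state, never dropped
    return h, c

def run_layer(xs, W, U, b, drop_p=0.5, train=True):
    """Run one LSTM layer over a sequence, masking only the inputs."""
    n = U.shape[1]
    h, c = np.zeros(n), np.zeros(n)
    outs = []
    for x in xs:
        if train and drop_p > 0:
            # Dropout on the non-recurrent input only (inverted scaling);
            # h and c flow through time with no mask applied.
            mask = (rng.random(x.shape) >= drop_p) / (1.0 - drop_p)
            x = x * mask
        h, c = lstm_step(x, h, c, W, U, b)
        outs.append(h)
    return np.stack(outs)

# Toy dimensions: input size 6, hidden size 8, sequence length 5.
d, n, T = 6, 8, 5
W = rng.normal(0, 0.1, (4 * n, d))
U = rng.normal(0, 0.1, (4 * n, n))
b = np.zeros(4 * n)
xs = rng.normal(size=(T, d))
out = run_layer(xs, W, U, b)
print(out.shape)  # (5, 8): one hidden vector per time step
```

In a multi-layer stack, `out` would be fed (with a fresh dropout mask) as the input sequence of the next layer, so information is perturbed only a fixed number of times regardless of sequence length.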
