A Fully Attention-Based Information Retriever

Alvaro Henrique Chaim Correia, Jorge Luiz Moreira Silva, Thiago de Castro Martins, Fabio Gagliardi Cozman

Abstract

Recurrent neural networks are now the state of the art in natural language processing because they can build rich contextual representations and process texts of arbitrary length. However, recent developments in attention mechanisms have equipped feedforward networks with similar capabilities, enabling faster computation because more of the operations can be parallelized...
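As a rough illustration of the attention mechanisms the abstract refers to, the sketch below implements scaled dot-product attention with NumPy. This is a generic textbook formulation, not code from the paper; the function name and the toy shapes are assumptions for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Generic scaled dot-product attention (illustrative, not the paper's code)."""
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled to keep gradients stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys (subtract the max for numerical stability).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted blend of the value vectors.
    return weights @ V

# Toy example: 3 tokens with hidden dimension 4 (self-attention, Q = K = V).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)
```

Because the scores for all token pairs are computed in one matrix product, the whole sequence is processed in parallel, which is the source of the speedup over recurrent models mentioned above.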
