FusionNet: Fusing via Fully-Aware Attention with Application to Machine Comprehension

Hsin-Yuan Huang, Chenguang Zhu, Yelong Shen, Weizhu Chen


This paper introduces a new neural structure called FusionNet, which extends existing attention approaches from three perspectives. First, it puts forward a novel concept of "history of word" to characterize attention information from the lowest word-level embedding up to the highest semantic-level representation...
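The "history of word" idea can be sketched in a few lines: each word's representations from every layer are concatenated into one vector, and attention is then scored over these full histories instead of a single layer's output. Below is a minimal numpy illustration; the dimensions, the dot-product scoring, and all variable names are assumptions for exposition only (the paper itself uses a more compact multiplicative attention form).

```python
import numpy as np

# Hypothetical dimensions, chosen only for illustration.
seq_len, d_emb, d_hidden = 5, 4, 6
rng = np.random.default_rng(0)

# "History of word": concatenate each word's representations from every
# level -- here, a word-level embedding plus one hidden-layer output.
word_emb = rng.standard_normal((seq_len, d_emb))
hidden = rng.standard_normal((seq_len, d_hidden))
history = np.concatenate([word_emb, hidden], axis=-1)  # (seq_len, d_emb + d_hidden)

# Fully-aware attention (simplified): score word pairs using their full
# histories rather than a single layer's representation.
scores = history @ history.T                             # (seq_len, seq_len)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)           # row-wise softmax
fused = weights @ history                                # attended representations

print(history.shape, fused.shape)
```

Because every attention score sees all levels of a word's history, lower-level lexical cues and higher-level semantic cues both inform the alignment, which is the intuition the abstract describes.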
