CBAM: Convolutional Block Attention Module

Sanghyun Woo, Jongchan Park, Joon-Young Lee, In So Kweon


We propose the Convolutional Block Attention Module (CBAM), a simple yet effective attention module for feed-forward convolutional neural networks. Given an intermediate feature map, our module sequentially infers attention maps along two separate dimensions, channel and spatial, and the attention maps are then multiplied with the input feature map for adaptive feature refinement.
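The sequential channel-then-spatial refinement described above can be sketched in a few lines of PyTorch. This is a minimal illustration under common assumptions (reduction-ratio MLP for channel attention, a 7×7 convolution over pooled channel descriptors for spatial attention); the class names and defaults here are illustrative, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze spatial dims, score each channel, rescale the input."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average-pooled descriptor
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max-pooled descriptor
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale


class SpatialAttention(nn.Module):
    """Pool across channels, convolve the 2-channel map into a spatial mask."""

    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # channel-wise average map
        mx = x.amax(dim=1, keepdim=True)     # channel-wise max map
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class CBAM(nn.Module):
    """Channel attention followed by spatial attention, preserving shape."""

    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.spatial(self.channel(x))
```

Because the module preserves the feature-map shape, it can be dropped after any convolutional block (e.g. inside a ResNet residual branch) without changing the surrounding architecture.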

Benchmarked Models

| RANK | MODEL | REPO | CODE RESULT | PAPER RESULT | ε-REPRODUCED | BUILD |
|------|-------|------|-------------|--------------|--------------|-------|
| 1 | CBAM-ResNet-50 | | 77.6% | -- | | |