
Regarding the attention map #16

Open
gemcollector opened this issue Sep 11, 2021 · 3 comments

Comments

@gemcollector

I noticed that you drew an attention map in the paper. How did you generate it? Could you provide the source code?

@MishaLaskin
Owner

MishaLaskin commented Sep 11, 2021 via email

@longfeizhang617

longfeizhang617 commented Jun 22, 2022

> I noticed that you drew an attention map in the paper. How did you generate it? Could you provide the source code?

I have noticed the attention map too. Have you solved this issue?

@Dongzhou-1996

> I noticed that you drew an attention map in the paper. How did you generate it? Could you provide the source code?
>
> I have noticed the attention map too. Have you solved this issue?

The author mentioned that their method is similar to the algorithm proposed in Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer, which simply sums the activation tensor of a convolutional layer along the channel dimension (i.e., a tensor of shape [B×C×W×H] -> an attention map of shape [B×W×H]). The corresponding code can be seen in this link. The only difference between the two methods is the spatial softmax operation; I think Laskin added it to visualize the attention map more clearly and to highlight the fixation points of a convolutional layer.
