Add Self-Attention as a variant of tower+pool #30

Open
Wants to merge 1 commit into base: attention_encoder
Commits on Jun 23, 2019

  1. Add Self-Attention as a variant of tower+pool

    Adds a feed-forward convolutional architecture with self-attention: the
    self-attention mechanism described in Self-Attention Generative Adversarial
    Networks by Zhang, Goodfellow, Metaxas, and Odena is added to the
    tower+pool architecture.
    Committed by loganbruns on Jun 23, 2019 (commit 4a967e4)
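
For reference, below is a minimal sketch of the SAGAN-style self-attention block that the commit message refers to, and of how it might slot into a tower+pool stack. This is written in PyTorch for illustration; the module name SelfAttention2d, the framework choice, and the toy layer sizes in the usage example are assumptions, not the code in this pull request.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttention2d(nn.Module):
    """SAGAN-style self-attention over a 2-D feature map (Zhang et al.)."""

    def __init__(self, in_channels: int):
        super().__init__()
        # 1x1 convolutions project the input into query, key, and value spaces;
        # queries and keys use C//8 channels as in the SAGAN paper.
        self.query = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.key = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        # Learnable gate initialised to 0, so the block starts as an identity
        # and the attention contribution is learned gradually.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        n = h * w
        q = self.query(x).view(b, -1, n)   # (B, C//8, N)
        k = self.key(x).view(b, -1, n)     # (B, C//8, N)
        v = self.value(x).view(b, c, n)    # (B, C,    N)
        # Attention weights over all spatial positions: (B, N, N).
        attn = F.softmax(torch.bmm(q.transpose(1, 2), k), dim=-1)
        # Weighted sum of values, reshaped back into a feature map.
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x        # residual connection


# Hypothetical tower+pool stack with the attention block inserted after the
# convolutional tower; layer sizes are illustrative only.
tower_pool = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
    SelfAttention2d(64),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
features = tower_pool(torch.randn(2, 3, 32, 32))  # -> shape (2, 64)
```

Because gamma is initialised to zero, the network behaves like the plain tower+pool model at the start of training and only incorporates the attention output as gamma moves away from zero, which is the training trick used in the SAGAN paper.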