Commit cce9ec6: formatting for README
anas-rz committed Feb 27, 2024 (1 parent: 7293916)
README.md: 15 additions, 1 deletion

`pip install k3-addons`
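The layers are multi-backend by way of Keras 3, so the backend is chosen the standard Keras way before import. A minimal setup sketch, assuming k3-addons follows normal Keras 3 backend selection and needs no extra configuration of its own:

```python
# Pick the Keras 3 backend before importing keras; "tensorflow" and "jax" work
# the same way. This line can be omitted if KERAS_BACKEND is already set.
import os
os.environ["KERAS_BACKEND"] = "torch"

import keras
import k3_addons  # package installed above as k3-addons

print(keras.backend.backend())  # expected: "torch"
```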
- ## Layers
- ### Pooling:
- `k3_addons.layers.AdaptiveAveragePooling1D`

Multi-backend implementation of `torch.nn.AdaptiveAvgPool1d`. Results closely match PyTorch's.
- `k3_addons.layers.AdaptiveMaxPooling1D`

Multi-backend implementation of `torch.nn.AdaptiveMaxPool1d`. Results closely match PyTorch's.
- `k3_addons.layers.AdaptiveAveragePooling2D`

Multi-backend implementation of `torch.nn.AdaptiveAvgPool2d`. Results closely match PyTorch's.
- `k3_addons.layers.AdaptiveMaxPooling2D`

Multi-backend implementation of `torch.nn.AdaptiveMaxPool2d`. Results closely match PyTorch's.
- `k3_addons.layers.Maxout`

Multi-backend port of `tensorflow_addons.layers.Maxout`. [Paper](https://arxiv.org/abs/1302.4389) A short usage sketch for the layers in this section follows below.

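A short usage sketch for the pooling layers above. The constructor arguments are assumptions: `output_size` is assumed to mirror `torch.nn.AdaptiveAvgPool2d`, and `num_units` is assumed to mirror `tensorflow_addons.layers.Maxout`; check the package source for the exact signatures.

```python
import numpy as np
from k3_addons.layers import AdaptiveAveragePooling2D, Maxout

x = np.random.rand(2, 32, 32, 16).astype("float32")  # (batch, H, W, channels)

# Adaptive pooling: collapse any spatial size to a fixed 4x4 grid.
# `output_size` is an assumed argument name, mirroring torch.nn.AdaptiveAvgPool2d.
pool = AdaptiveAveragePooling2D(output_size=(4, 4))
print(pool(x).shape)  # expected: (2, 4, 4, 16)

# Maxout: keep the max over groups of channels, reducing 16 -> 4 units here.
# `num_units` is an assumed argument name, mirroring tensorflow_addons.layers.Maxout.
maxout = Maxout(num_units=4)
print(maxout(x).shape)  # expected: (2, 32, 32, 4)
```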
- #### Normalization
- `k3_addons.layers.InstanceNormalization`

Specific case of `keras.layers.GroupNormalization` since it normalizes all features of one channel; the number of groups is equal to the number of channels, so each group holds a single channel. A minimal usage sketch follows below.
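A minimal sketch, assuming `InstanceNormalization` works with default arguments as a drop-in Keras layer; the `GroupNormalization` line illustrates the special case described above (one channel per group).

```python
import numpy as np
from keras import layers
from k3_addons.layers import InstanceNormalization

x = np.random.rand(2, 32, 32, 8).astype("float32")  # 8 channels

# Assumed to work with default arguments, like most Keras normalization layers.
out = InstanceNormalization()(x)
print(out.shape)  # expected: (2, 32, 32, 8)

# GroupNormalization with one channel per group (groups == number of channels)
# is instance normalization.
ref = layers.GroupNormalization(groups=8)(x)
print(ref.shape)  # (2, 32, 32, 8)
```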
- #### Attention:
- `k3_addons.layers.DoubleAttention`

[Paper](https://arxiv.org/pdf/1810.11579.pdf)
- `k3_addons.layers.AFTFull`

[An Attention Free Transformer](https://arxiv.org/pdf/2105.14103v1.pdf)
- `k3_addons.layers.ChannelAttention2D`
- `k3_addons.layers.SpatialAttention2D`
- `k3_addons.layers.ECAAttention`

[ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks](https://arxiv.org/pdf/1910.03151.pdf)
- `k3_addons.layers.ExternalAttention`

[Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks](https://arxiv.org/abs/2105.02358)
- `k3_addons.layers.ResidualAttention`

[Residual Attention: A Simple but Effective Method for Multi-Label Recognition](https://arxiv.org/abs/2108.02456)
- `k3_addons.layers.MobileViTAttention`

[Coordinate Attention for Efficient Mobile Network Design](https://arxiv.org/abs/2103.02907)
- `k3_addons.layers.BAMBlock`
- `k3_addons.layers.CBAM`
- `k3_addons.layers.MobileViTv2Attention`

[Separable Self-attention for Mobile Vision Transformers](https://arxiv.org/abs/2206.02680)

- `k3_addons.layers.ParNetAttention`

[Non-deep Networks](https://arxiv.org/abs/2110.07641)
- `k3_addons.layers.SimAM`