Dilated convolution compatibility with PyTorch weights #614

Answered by blakehechtman
rolandgvc asked this question in Q&A

In XLA, and by extension JAX and Flax, dilation is treated as a dilation 'rate', not an 'amount' of interior padding inserted into the kernel, so the default (undilated) value is 1. If I am reading the Colab correctly, the intended DILATION=2 is being passed as _dilation = 2 - 1 = 1, which yields an undilated convolution. Hopefully this helps resolve the issue.
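
A minimal sketch of the difference, assuming an NHWC input and an HWIO kernel (the shapes and the DILATION value are illustrative, not taken from the original Colab): with jax.lax.conv_general_dilated, PyTorch's dilation maps directly to rhs_dilation as a rate, so subtracting 1 turns dilation off.

```python
import jax.numpy as jnp
from jax import lax

DILATION = 2  # the PyTorch-style dilation the Colab intends

x = jnp.ones((1, 8, 8, 3))       # NHWC input (illustrative shape)
kernel = jnp.ones((3, 3, 3, 4))  # HWIO kernel (illustrative shape)

# Correct: pass the rate itself; rhs_dilation=(2, 2) gives a dilated conv.
y_dilated = lax.conv_general_dilated(
    x, kernel,
    window_strides=(1, 1),
    padding="SAME",
    rhs_dilation=(DILATION, DILATION),
    dimension_numbers=("NHWC", "HWIO", "NHWC"),
)

# The bug described above: DILATION - 1 == 1 is the identity rate,
# so this is an ordinary, undilated convolution.
y_undilated = lax.conv_general_dilated(
    x, kernel,
    window_strides=(1, 1),
    padding="SAME",
    rhs_dilation=(DILATION - 1, DILATION - 1),
    dimension_numbers=("NHWC", "HWIO", "NHWC"),
)
```

The same rate semantics carry over to flax.linen.Conv's kernel_dilation argument, which is forwarded to rhs_dilation, so a PyTorch dilation of 2 should be passed as kernel_dilation=2, not 1.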

Answer selected by rolandgvc