
Larger strides/downsampling factors #37

Open
chanshing opened this issue Sep 3, 2020 · 2 comments

Comments

@chanshing

chanshing commented Sep 3, 2020

First, thanks for the very nice work!

In your implementation as well as in the paper, it seems that the proposed filters (which are the binomial coefficients) are only valid for strides/downsampling factors of 2. Extrapolating from this, does it mean that I need to use the trinomial coefficients for stride 3, quadrinomial coefficients for stride 4, and so on?
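For intuition, the stride-3 analogue ("trinomial" coefficients) can be generated numerically as the coefficients of (1 + x + x^2)**n, via repeated convolution of the box filter in NumPy. A sketch, with n = 3 chosen arbitrarily for illustration:

```python
import numpy as np

# Coefficients of (1 + x + x**2)**3, the "trinomial" analogue of the
# binomial filter, via repeated convolution of the box filter [1, 1, 1].
tri = np.ones(3)
for _ in range(2):  # two more convolutions -> third power overall
    tri = np.convolve(tri, np.ones(3))
print(tri.tolist())  # [1.0, 3.0, 6.0, 7.0, 6.0, 3.0, 1.0]
```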

By the way, you could simplify your code in downsample.py by using scipy.special.binom instead of hard-coding each filter. Something like a = np.asarray([binom(filt_size-1, i) for i in range(filt_size)]), which handles arbitrary filt_size.
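For reference, a sketch of that one-liner (filt_size=5 is an example value, chosen so it reproduces one of the hard-coded filters):

```python
import numpy as np
from scipy.special import binom

filt_size = 5  # example size; any positive integer works
# Row filt_size-1 of Pascal's triangle, i.e. the binomial filter taps
a = np.asarray([binom(filt_size - 1, i) for i in range(filt_size)])
print(a.tolist())  # [1.0, 4.0, 6.0, 4.0, 1.0]
```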

@richzhang
Contributor

Yes, the width of the filter can be computed automatically from the downsampling factor. The binomial filter shouldn't be used for larger filter sizes at stride 2, though. I'm working on larger strides.

@chanshing
Author

chanshing commented Sep 4, 2020

I implemented something for the 1D case, since I mostly work with time series. The idea is to specify the filter order instead of the filter size, and to start from the box filter (average pooling) of width equal to the downsampling factor; this is order 0. For the higher-order filters, we convolve the box filter with itself order times (see http://nghiaho.com/?p=1159).

For example, for stride 2 and order 3, we start with the box filter [1, 1]. Convolve with itself once and we get [1, 2, 1] (order 1). Once more and we get [1, 3, 3, 1] (order 2). Once again and we get [1, 4, 6, 4, 1] (order 3). Example with stride 3: [1, 1, 1] -> [1, 2, 3, 2, 1] -> [1, 3, 6, 7, 6, 3, 1] -> ...

Here's more or less what I did (again, this is only 1D):

import numpy as np
import torch

# Order-0 kernel is the box filter (average pooling) of width `factor`;
# each additional order convolves in another box filter.
box_kernel = np.ones(factor)
kernel = np.ones(factor)
for _ in range(order):
    kernel = np.convolve(kernel, box_kernel)
kernel /= np.sum(kernel)  # normalize so the filter sums to 1
kernel = torch.Tensor(kernel)
# One copy of the kernel per channel, shape (channels, 1, filt_size)
self.register_buffer('kernel', kernel[None, None, :].repeat((channels, 1, 1)))

For 2D and beyond we could take the outer product of the 1D filter with itself.
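A minimal sketch of that outer-product idea, reusing the 1D construction above (factor=2 and order=3 are example values):

```python
import numpy as np

# Build the 1D kernel by repeated box convolution, then take the outer
# product with itself to get the separable 2D filter.
factor, order = 2, 3
box = np.ones(factor)
kernel = np.ones(factor)
for _ in range(order):
    kernel = np.convolve(kernel, box)
kernel /= kernel.sum()               # 1D: [1, 4, 6, 4, 1] / 16
kernel2d = np.outer(kernel, kernel)  # 5x5 separable 2D filter, sums to 1
```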
