nn_decomposer.py expects `cur_layer_param.convolution_param.kernel_size._values` to be set. This means the convolutional kernel must be square; `kernel_h` and `kernel_w` cannot differ. Is there any special reason for this?
@hahne There is no constraint on the kernel size from the algorithm's perspective; it works for non-square kernels in general (see here).
The code expects those only because I did a lazy job and wanted to avoid a cumbersome if-else block checking whether the kernel size is given in `kernel_size` or in `kernel_h`/`kernel_w`. If it is given in `kernel_size`, we would also have to check whether it is a list or a scalar to handle the general case.
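The if-else logic being avoided could be sketched roughly as follows. This is a hypothetical helper, not code from nn_decomposer.py; it assumes a Caffe-style `convolution_param` where `kernel_size` is a repeated field (possibly empty, or holding one value for a square kernel or two for height/width) with `kernel_h`/`kernel_w` as the per-axis fallback:

```python
def resolve_kernel_shape(conv_param):
    """Return (kernel_h, kernel_w) for a Caffe-style convolution_param.

    Hypothetical sketch of the kernel_size vs. kernel_h/kernel_w
    handling discussed above; the real protobuf access may differ.
    """
    ks = getattr(conv_param, "kernel_size", None)
    if isinstance(ks, int):          # scalar: square kernel
        return ks, ks
    if ks:                           # repeated field with 1 or 2 entries
        values = list(ks)
        if len(values) == 1:         # single value means square kernel
            return values[0], values[0]
        return values[0], values[1]  # (height, width)
    # kernel_size absent or empty: fall back to the per-axis fields.
    return conv_param.kernel_h, conv_param.kernel_w
```

For example, a layer with `kernel_size: [3]` resolves to a 3x3 kernel, while one with only `kernel_h: 7` and `kernel_w: 1` resolves to 7x1.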
@hahne I suspect those asserts are not necessary for the linear combination layer, but we need to delete `kernel_h` and `kernel_w` if they were copied from the decomposed conv layer. The same applies to `stride`, `pad`, and `dilation`.