Discrete 2D cross-correlation is very similar to 2D convolution.
We can check this with MATLAB code:
M1 = [17 24 1 8 15;
23 5 7 14 16;
4 6 13 20 22;
10 12 19 21 3;
11 18 25 2 9];
M2 = [8 1 6;
3 5 7;
4 9 2];
D = xcorr2(M1, M2);
D
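The same cross-correlation can be reproduced in PyTorch. The sketch below (assuming torch is installed) uses torch.nn.functional.conv2d, which computes cross-correlation rather than true convolution; padding=2 makes the output the same 7x7 "full" size that xcorr2 returns, so the two results should match.

import torch
import torch.nn.functional as F

M1 = torch.tensor([[17., 24., 1., 8., 15.],
                   [23., 5., 7., 14., 16.],
                   [4., 6., 13., 20., 22.],
                   [10., 12., 19., 21., 3.],
                   [11., 18., 25., 2., 9.]])
M2 = torch.tensor([[8., 1., 6.],
                   [3., 5., 7.],
                   [4., 9., 2.]])

# conv2d expects (batch, channels, H, W) inputs and
# (out_channels, in_channels, kH, kW) weights.
D = F.conv2d(M1.view(1, 1, 5, 5), M2.view(1, 1, 3, 3), padding=2)
print(D.squeeze())  # should equal MATLAB's xcorr2(M1, M2)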
But there are some differences between MATLAB and PyTorch. The relevant functions in PyTorch are torch.nn.functional.conv2d and torch.nn.Conv2d. Because torch.nn.functional.conv2d takes padding as an argument and defaults to no padding, its result matrix can be smaller than MATLAB's full cross-correlation. Let's do a simple experiment to check it.
Python 2.7.15 |Anaconda, Inc.| (default, Dec 14 2018, 19:04:19)
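In outline, the experiment looks like this (a sketch; the inputs and filters are random, so the exact numbers change from run to run):

import torch
import torch.nn.functional as F

inputs = torch.randn(1, 1, 5, 5)     # one 5x5 input
filters = torch.randn(2, 1, 3, 3)    # two 3x3 filters
results = F.conv2d(inputs, filters)  # default padding=0
print(results.shape)                 # torch.Size([1, 2, 3, 3]): smaller than the 7x7 full result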
We will check the value for the first filter.
>>> inputs
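We also print the filters and the output of the first filter, so the same numbers can be typed into MATLAB (a sketch; the printed values depend on the random initialization above):

print(inputs)
print(filters)
print(results[0, 0])  # output of the first filter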
Now do the same thing in MATLAB:
>> inputs_0 = [[-2.0126, -1.2142, -0.5943, 0.9880, 1.3990];
Not surprisingly, we will find that ans in MATLAB is equivalent to results[0, 0] in Python.
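The same equivalence can also be checked without leaving Python (a sketch building on the tensors defined above): each entry of results[0, 0] is just the elementwise product of a 3x3 window of the input with the first filter, summed up.

# Manually cross-correlate the top-left 3x3 window with the first filter
window = inputs[0, 0, 0:3, 0:3]
manual = (window * filters[0, 0]).sum()
print(torch.allclose(manual, results[0, 0, 0, 0]))  # True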
For torch.nn.Conv2d, PyTorch initializes random weights (and a bias that is added to the cross-correlation result), which makes it more complex. Look at the following code, run in a Python terminal:
>>> import torch
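A sketch of this kind of session (the variable names and random numbers are illustrative): nn.Conv2d owns randomly initialized weight and bias tensors, and its output can be reproduced with torch.nn.functional.conv2d by passing those tensors explicitly.

import torch
import torch.nn.functional as F

conv = torch.nn.Conv2d(1, 2, kernel_size=3)  # weight and bias are random
inputs = torch.randn(1, 1, 5, 5)

out = conv(inputs)
# Reproduce the module's output with the functional form, using the same
# randomly initialized weight and bias.
same = F.conv2d(inputs, conv.weight, conv.bias)
print(torch.allclose(out, same))              # True
print(conv.weight.shape, conv.bias.shape)     # torch.Size([2, 1, 3, 3]) torch.Size([2])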