Add transposed case for at::convolution
This adds transposed-convolution support to at::convolution, covering both the strided and unstrided cases.
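For context, the output spatial size of a transposed convolution follows the inverse of the regular convolution floor rule; this is the formula the shape function under discussion has to implement. A minimal dependency-free sketch (the function name is illustrative, not from this PR):

```python
def conv_transpose_out_dim(in_dim, kernel, stride, padding,
                           output_padding=0, dilation=1):
    # Standard transposed-convolution output-size rule (as documented for
    # torch.nn.ConvTranspose2d): the inverse of the regular conv floor rule.
    return ((in_dim - 1) * stride
            - 2 * padding
            + dilation * (kernel - 1)
            + output_padding
            + 1)

# e.g. an 8x8 input with a 3x3 kernel, stride 2, padding 1 -> 15x15 output
print(conv_transpose_out_dim(8, 3, 2, 1))  # → 15
```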
Can you squash the commits?
@gpetters94 Could you please resolve the conflicts?
Done.
Can we approve and merge this PR? @vivekkhandelwal1 @silvasean
It looks like the shape function is not correct (failing PyTorch tests), see https://github.com/pytorch/pytorch/pull/80860
Oh, sorry, it is an "incorrect shape compute mapping function schema name" error, not an incorrect shape function. Yes, we generally wait for these changes to land upstream. Is there an urgent reason to merge it now?
We are working on transposed convolution on the Torch2MHLO lowering side, so we checked this PR to see whether we can merge it. It's OK to wait for the upstream PR to be merged, though.
It's holding up U-Net support, but it isn't the only thing blocking that.
The PR on the PyTorch side is accepted. Can we merge this PR now? @vivekkhandelwal1 @silvasean
@ZihengJiang @silvasean I made a small change in the shape logic (I forgot that two of the dims needed transposing), so I need to make another PR in PyTorch, which will hopefully not take long to land upstream this time. Otherwise this is passing all the tests.
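The two dims needing transposing presumably refers to the weight layout: transposed convolution swaps the first two weight dimensions relative to regular convolution, so the output-channel count comes from dim 1, not dim 0. A hedged sketch of that rule (helper name is illustrative, not from this PR):

```python
# Weight layouts (as documented for torch.nn.ConvTranspose2d):
#   regular conv:    (out_channels, in_channels // groups, kH, kW)
#   transposed conv: (in_channels, out_channels // groups, kH, kW)
def conv_transpose_out_channels(weight_shape, groups=1):
    # For transposed conv, output channels live in dim 1 of the weight,
    # scaled back up by the group count.
    return weight_shape[1] * groups

# A (4, 6, 3, 3) transposed-conv weight with groups=1 produces 6 output channels.
print(conv_transpose_out_channels((4, 6, 3, 3)))  # → 6
```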
The PR on the PyTorch side is accepted. Can we merge this PR now? @vivekkhandelwal1 @silvasean
Sure. Once this PR (https://github.com/pytorch/pytorch/pull/83557) is merged, we can get this patch merged.
@vivekkhandelwal1 @silvasean The upstream shape code is merged, so this one should be good to go.
@vivekkhandelwal1 Done.