Anwaar Khalid
The same example works with `factorization='tucker'`, but the `from_conv` function does not infer `dilation` from the input conv layer. Example code: ```python import torch import tltorch test_input...
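Since the snippet above is cut off, here is a hedged sketch of the kind of reproduction being described; the layer shapes, the `rank` value, and the attribute check are assumptions rather than details from the original report:

```python
import torch
import tltorch

# A conv layer with a non-default dilation (hypothetical shapes/values).
conv = torch.nn.Conv2d(16, 32, kernel_size=3, dilation=2, padding=2)
test_input = torch.randn(1, 16, 24, 24)

# Build a factorized layer from the existing conv. `rank=0.5` is an assumed
# value; per the report, the same call with factorization='tucker' behaves
# identically with respect to dilation.
fact_conv = tltorch.FactorizedConv.from_conv(conv, rank=0.5, factorization="cp")

print(conv.dilation)                         # (2, 2)
print(getattr(fact_conv, "dilation", None))  # reportedly left at the default, not (2, 2)
```

If `from_conv` carried `dilation` over from `conv`, the two printed values should match; the report is that they do not.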
> Hey @program-20 ! Thanks for opening this PR! Is there an open ToDo list issue for the torch frontend that contains the `selu` function? If yes, could you please...
> hey @hello-fri-end, I've made all the required changes and all the tests are passing locally. You can now check whether any further changes are needed. > > Now...
> hey @hello-fri-end, I've now assigned the subtask to myself and resolved the merge. I implemented it to just call the method from the paddle frontend instead of calling it from the ivy frontend....
> let me know once you're done with the changes, as I suppose this review request may not be intended. Thanks @hello-fri-end 🙂 oh, all done here actually 😅
> 1. could make the implementation a bit cleaner, because the intersection of unsupported dtypes for the backend-specific implementation and the compositional implementation wouldn't necessarily be the correct set of...
> > > 1. could make the implementation a bit cleaner, because the intersection of unsupported dtypes for the backend-specific implementation and the compositional implementation wouldn't necessarily be the correct...
So I spent some time thinking about the possible solutions we discussed. Here's a summary: 1. Run the primary implementation first; if it fails due to a dtype error, run the...
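To make option 1 concrete, here is a minimal, self-contained sketch of that fallback pattern; it is not Ivy's actual wrapper, and the function and helper names are hypothetical:

```python
def mixed_function(primary_impl, compositional_impl):
    """Hypothetical wrapper: try the backend-specific (primary) implementation
    first, and fall back to the compositional one on a dtype failure."""
    def wrapped(*args, **kwargs):
        try:
            return primary_impl(*args, **kwargs)
        except TypeError:  # assumed to signal an unsupported dtype
            return compositional_impl(*args, **kwargs)
    return wrapped


# Toy example: a "primary" implementation that only supports floats,
# and a compositional fallback that also handles ints.
def primary_sum(x):
    if not all(isinstance(v, float) for v in x):
        raise TypeError("unsupported dtype for the primary implementation")
    return sum(x)

def compositional_sum(x):
    return sum(float(v) for v in x)

safe_sum = mixed_function(primary_sum, compositional_sum)
print(safe_sum([1.0, 2.0]))  # handled by the primary implementation
print(safe_sum([1, 2]))      # falls back to the compositional implementation
```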
Hi guys @VedPatwardhan @CatB1t! Here's a summary of the changes I've made so far: - Added private helper functions in `ivy/data, ivy/general, ivy/device` to handle mixed functions correctly. For...
> Hey @hello-fri-end, I just wanted to ask if you've explored a way to call the same function twice recursively if it's a mixed function (once for the compositional one...
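For context, a recursive dispatch along the lines being asked about could look something like the sketch below; the decorator name and the `use_compositional` keyword are purely hypothetical and are not Ivy's API:

```python
import functools

def handle_mixed(compositional_impl):
    """Hypothetical decorator: the wrapped (primary) function re-enters itself
    once with the compositional implementation when needed, so the same public
    entry point is called twice recursively for a mixed function."""
    def decorator(primary_impl):
        @functools.wraps(primary_impl)
        def wrapped(*args, use_compositional=False, **kwargs):
            if use_compositional:
                return compositional_impl(*args, **kwargs)
            try:
                return primary_impl(*args, **kwargs)
            except TypeError:
                # Second, recursive call through the same public function,
                # this time taking the compositional path.
                return wrapped(*args, use_compositional=True, **kwargs)
        return wrapped
    return decorator
```

Compared with the plain fallback wrapper sketched earlier, this version routes both paths through one public function, which is the "call the same function twice recursively" idea raised in the question.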