Don't we need a return statement in the forward method?
I am new to Julia, but I use PyTorch in Python.
I notice there is no return statement at the end of the forward method. Is that how Julia works? Please correct me if I am wrong.
I also wanted to know: how efficient is calling PyTorch from Julia? Is it fast enough?
I notice there is no return statement at the end of the forward method. Is that how Julia works?
Yes. In Julia, every expression evaluates to a value, and a function returns the value of its last evaluated expression, so the return keyword is optional. For example, f(x) = x + 1 returns x + 1 without an explicit return.
I also wanted to know: how efficient is calling PyTorch from Julia?
In most cases, the overhead of calling PyTorch from Julia is quite low. The main bottleneck is GPU memory management, which currently has to be released manually.
I suppose we cannot slice-index PyTorch tensors from Julia yet. May I know when, or whether, this option will become available?
Yes, that's correct, so for now we have to resort to APIs like index_select.
Could you please give an example of that?
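For context, here is a minimal Python sketch of what index_select does in plain PyTorch (the variable names are illustrative, not from the thread). It picks entries along a single dimension at a time, given a tensor of indices:

```python
import torch

a = torch.rand(1, 2, 28, 28)

# index_select picks entries along ONE dimension, given an index tensor.
# Here we pick the single row with index 3 along dimension 2:
row = a.index_select(2, torch.tensor([3]))

# The selected dimension keeps size 1; all other dimensions are unchanged.
assert row.shape == (1, 2, 1, 28)
```

Note that the result has the same number of dimensions as the input; the selected dimension simply shrinks to the number of indices supplied.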
I think these examples only show selecting a single value from a tensor, but I would like to select a slice of a tensor.
For example:
a = torch.rand(1,2,28,28)
a[0,1, 3:5, 3:5]
This gives me a small patch of the 28x28 image, say.
It seems we cannot yet slice a tensor along multiple dimensions simultaneously.
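As a sketch of the workaround mentioned above (written in plain PyTorch; the same calls would go through the Julia wrapper), the multi-dimensional slice can be rebuilt from index_select calls applied one dimension at a time:

```python
import torch

a = torch.rand(1, 2, 28, 28)

# Direct multi-dimensional slice, as in the question:
direct = a[0, 1, 3:5, 3:5]

# The same region built only from index_select, one dimension at a time:
out = a.index_select(0, torch.tensor([0]))       # keep batch 0
out = out.index_select(1, torch.tensor([1]))     # keep channel 1
out = out.index_select(2, torch.tensor([3, 4]))  # rows 3:5
out = out.index_select(3, torch.tensor([3, 4]))  # cols 3:5

# index_select keeps the selected dimensions (with size 1 here),
# so drop the two leading singleton dimensions to match the slice:
out = out.squeeze(0).squeeze(0)

assert out.shape == (2, 2)
assert torch.equal(out, direct)
```

The price of this workaround is an index tensor and a fresh copy per dimension, whereas the direct slice in Python is a single view, so it is more verbose and less efficient, but it selects the same elements.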