Ivan Oseledets

3 issues by Ivan Oseledets

It is time to use `@` for matrix-by-matrix and matrix-by-vector products, and leave `*` for the elementwise product.
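A minimal NumPy illustration of the distinction the issue asks for: `@` (PEP 465) performs matrix and matrix-vector products, while `*` stays elementwise.

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
v = np.array([1, 1])

print(A @ B)  # matrix-by-matrix product: [[19, 22], [43, 50]]
print(A @ v)  # matrix-by-vector product: [3, 7]
print(A * B)  # elementwise (Hadamard) product: [[5, 12], [21, 32]]
```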

enhancement

Implement an (approximate) tensor-by-tensor product over some indices; tentative syntax is

```matlab
c = tenmul(a, b, ind1, ind2, 1e-8);
```

where `ind1`, `ind2` are the indices over which the summation is done.
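A hedged sketch of what such a `tenmul` could compute, written in Python with `np.tensordot` (the function name and signature mirror the issue's tentative syntax; the tolerance argument is a placeholder for the proposed approximate variant, while this sketch performs the exact contraction):

```python
import numpy as np

def tenmul(a, b, ind1, ind2, tol=1e-8):
    """Contract tensor `a` with tensor `b`, summing over axes `ind1`
    of `a` paired with axes `ind2` of `b`.

    `tol` is a placeholder for the approximate (low-rank) variant
    suggested in the issue; this sketch contracts exactly.
    """
    return np.tensordot(a, b, axes=(ind1, ind2))

# Example: sum over the (4, 5)-sized axes shared by a and b.
a = np.random.rand(3, 4, 5)
b = np.random.rand(4, 5, 6)
c = tenmul(a, b, [1, 2], [0, 1])
print(c.shape)  # (3, 6)
```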

Thanks for the library! Currently the code takes an `nn.Module` and visualizes the forward pass; internally, it explicitly uses `torch.no_grad`. However, if the forward pass of a module calls `autograd.grad`...
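A small PyTorch reproduction of the conflict the issue points at (the `forward` function here is a hypothetical stand-in for such a module): inside `torch.no_grad()` no autograd graph is recorded, so a forward pass that itself calls `torch.autograd.grad` raises a `RuntimeError`.

```python
import torch

x = torch.ones(3, requires_grad=True)

def forward(x):
    # A forward pass that uses autograd.grad internally,
    # e.g. to compute a gradient-based feature.
    y = (x ** 2).sum()
    (g,) = torch.autograd.grad(y, x)
    return g

# Works normally: d/dx sum(x^2) = 2x
print(forward(x))  # tensor([2., 2., 2.])

# Fails under no_grad, since y carries no graph to differentiate:
with torch.no_grad():
    try:
        forward(x)
    except RuntimeError as e:
        print("autograd.grad failed under no_grad:", e)
```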