Vikash
@Shinyzenith not starting an argument here, but since you are the author of both, I find it interesting that you chose to write swhkd in Rust but NextWM in Zig....
@davidhalter I wonder what's the difference between something like https://github.com/tree-sitter/tree-sitter-python and parso? Can't you leverage tree-sitter somehow? It seems to be written in C and has fairly decent...
In fact, torch goes one step further and allows reduction across multiple dimensions simultaneously. This is the PR where they first added this feature: https://github.com/pytorch/pytorch/pull/6152 (only for sum); others were...
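To illustrate what a multi-dimension reduction looks like in practice, here is a small sketch using numpy, whose `axis` tuple has the same semantics as torch's `dim` tuple (torch itself may not be installed everywhere, so numpy stands in):

```python
import numpy as np

# A 3-D array; reduce over two axes at once, analogous to
# torch.sum(x, dim=(0, 1)) after the linked PR.
x = np.arange(24).reshape(2, 3, 4)

multi = x.sum(axis=(0, 1))  # collapse axes 0 and 1 together
print(multi.shape)          # (4,)
```

The key point is that a tuple of axes is reduced in one call instead of chaining one reduction per axis.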
Now that we have reshape implemented, what is the right way of tackling this? I don't know if it is possible to write a generic trait like `ReduceOverDims` similar to...
Nice, looking forward to it. Reducing across multiple dimensions is not that big of a deal and is just a nice-to-have. You could always apply reduce_over_dim multiple times....
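The "apply it multiple times" workaround can be sketched as follows, again using numpy in place of the library's own single-dimension reduce (the equivalence is the point, not the API names):

```python
import numpy as np

x = np.arange(24).reshape(2, 3, 4)

# Reduce one dimension at a time, highest axis first so the
# remaining axis numbers don't shift underneath us.
step = x.sum(axis=1)     # (2, 3, 4) -> (2, 4)
step = step.sum(axis=0)  # (2, 4)    -> (4,)

# Same result as a single multi-axis reduction.
assert (step == x.sum(axis=(0, 1))).all()
```

So a single-dimension `reduce_over_dim` is enough to express any multi-dimension reduction; the tuple form is purely a convenience.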
+1 for storing strides and dimensions alongside a flat array. This should lead to a clean API for reshaping (views) and reducing over arbitrary dimensions.
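For concreteness, here is a minimal sketch of the flat-buffer-plus-strides idea, with hypothetical helper names and assuming row-major layout; a reshape is then just new shape/strides over the same buffer:

```python
def row_major_strides(shape):
    # Stride of axis i = product of the sizes of all later axes.
    strides = [1] * len(shape)
    for i in reversed(range(len(shape) - 1)):
        strides[i] = strides[i + 1] * shape[i + 1]
    return strides

def flat_index(strides, idx):
    # Dot product of strides and a multi-dimensional index.
    return sum(s * i for s, i in zip(strides, idx))

shape = (2, 3, 4)
data = list(range(24))                # the flat backing array
strides = row_major_strides(shape)    # [12, 4, 1]

# Element [1, 2, 3] lives at flat offset 1*12 + 2*4 + 3*1 = 23.
assert data[flat_index(strides, (1, 2, 3))] == 23

# A reshape to (6, 4) only swaps in new shape/strides; the
# buffer is untouched, so it is a zero-copy view.
view_strides = row_major_strides((6, 4))  # [4, 1]
assert data[flat_index(view_strides, (5, 3))] == 23
```

Reducing over an arbitrary dimension then amounts to iterating the buffer while skipping by that dimension's stride, which is why this representation composes so well with the reduce discussion above.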
@jafioti @coreylowman +1 on prioritizing the slices change, because now is the time to make such massive changes; once we have more users (which we will once we add...
Interesting. Although PyTorch itself silently updates only the parameters involved in the computation, I know of frameworks built on top of it that produce a warning about unused parameters in your model....
@epwalsh any further progress on this? It seems that, to be able to use TPUs, we need to use XLA devices from the torch_xla package: https://pytorch.org/xla/release/1.9/index.html. This would be a...
@mychele Thanks for the response and suggestion. This was just a simple example I devised to isolate the issue. I probably do need the tags for other information. I don't...