LosCrossos

24 comments by LosCrossos

Windows or Linux? On Windows it takes 2h for me.

Try mine: https://github.com/Dao-AILab/flash-attention/issues/1683

CUDA is forwards and backwards compatible :) just try it, it works. You might just get a warning that the CUDA minor version differs.
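If you want to see the version pair behind that warning, you can compare the CUDA version your torch build was compiled against with what the installed driver can actually run; a minimal sketch:

```python
import torch

# CUDA toolkit version this torch build was compiled against
print(torch.version.cuda)        # e.g. '12.6'

# Whether the installed driver can run it; minor-version mismatches
# generally still work, usually emitting just a warning.
print(torch.cuda.is_available())
```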

@cmp-nct you are using FA3, which is not the topic of this thread and is also not yet confirmed to work on Blackwell. FA2.x does work.

Hey @d8ahazard, thanks for your Windows fork! Would you mind updating your code to the latest SSM, and also adding Blackwell support to it? Here is the code you can...

50xx is not yet supported in official torch; you need nightly torch. Mamba is compiled against its torch dependency, therefore if you compile Mamba yourself using the latest nightly torch code...
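A quick way to check whether a given torch build already ships Blackwell kernels is to inspect its compiled CUDA architectures. A minimal sketch; treating `sm_120` as the consumer-Blackwell (50xx) compute capability is my assumption here, so verify it against your GPU:

```python
import torch

# Nightly builds are the first to include Blackwell (sm_120) kernels;
# stable wheels may not list that architecture yet.
print(torch.__version__)           # a nightly looks like '2.x.0.dev...'
print(torch.cuda.get_arch_list())  # should include 'sm_120' for 50xx GPUs
```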

I had this and got it working, though not with the latest version of Spleeter or Flask.. it seems the problem is deeply rooted. I also had to downgrade Python. I...

I have the same error.. the first error was solved by installing the Visual compiler. The Visual compiler has a C compiler but no Fortran compiler. I also installed a Fortran...

Same problem. **Solved** by using Python 3.10 and putting this in the requirements file: `numpy
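The exact numpy pin is cut off in the snippet above, so as a hedged sketch you can only verify the parts that are stated, i.e. the Python 3.10 interpreter and whichever numpy your requirements file ended up installing:

```python
import sys
import numpy

# The comment above is truncated before the exact numpy pin; this only
# checks the parts that are stated explicitly.
print(sys.version_info[:2])  # the reported fix used (3, 10)
print(numpy.__version__)     # compare against the pin in your requirements file
```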

> [@tridao](https://github.com/tridao) as advised by [@malfet](https://github.com/malfet), the issue is that PyTorch updated its C++ ABI in 2.6.0cu126, and it stayed this way in 2.7.0cu128:
>
> * [[CXX11ABI] torch 2.6.0-cu126 and...
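Since the quote is about the CXX11 ABI flip, a quick way to check which C++ ABI a given torch wheel was built with, before compiling an extension such as flash-attention against it; a minimal sketch:

```python
import torch

# True means the wheel uses the new CXX11 ABI (the case since 2.6.0cu126);
# any C++ extension must be compiled with the same ABI setting to link.
print(torch.compiled_with_cxx11_abi())
print(torch.__version__, torch.version.cuda)
```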