jman0815
3 comments
Same problem here!
Can we please get Flash Attention 3 support for RTX6000 Blackwell GPUs? I'd really like to try the native FP4 compute support.
I'm also unable to run GPT-OSS with FA3 on my RTX6000 Blackwell SE with CUDA 12.9. Will there ever be support for this?
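For reference, here is a minimal diagnostic sketch I use to confirm what the GPU reports and whether the FA3 interface even imports (it assumes PyTorch is installed and that the FA3 build exposes the module name `flash_attn_interface` per its install docs; your build may differ):

```python
# Minimal sketch: print the GPU's compute capability and try importing FA3.
# "flash_attn_interface" is the module name from the FA3 install instructions;
# adjust if your build exposes a different name (assumption, not verified here).
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"GPU: {torch.cuda.get_device_name(0)} (sm_{major}{minor})")
    # FA3 currently targets Hopper (sm_90); Blackwell parts report a
    # different compute capability, which is where the failure shows up.
else:
    print("No CUDA device visible.")

try:
    import flash_attn_interface  # FA3 package name per its docs (assumption)
    print("FlashAttention 3 interface imported OK.")
except ImportError as exc:
    print(f"FlashAttention 3 not available: {exc}")
```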