Issue opened by liran

PyTorch now supports FlashAttention-2, which is about 2x faster than the original FlashAttention: https://pytorch.org/blog/pytorch2-2/ So I'm wondering whether TensorRT 9.2 already supports FlashAttention-2, or whether I have...

triaged