Mansu Kim
The same issue occurs for me. If the server is not running, I would appreciate it if you could start it!
If a package-scoped fixture is used with `autouse=True`, it behaves like a session-scoped fixture. Note that the fixture function must be defined in (or imported into) the conftest file so pytest can discover it.
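A minimal sketch of the setup described above, assuming a conftest.py placed inside the package's test directory; the fixture name `pkg_resource` is hypothetical and used only for illustration:

```python
# conftest.py inside the package directory (e.g. tests/mypkg/conftest.py).
# With scope="package" and autouse=True, this fixture runs once before the
# first test in the package and is torn down after the last one, so for the
# tests in this package it effectively behaves like a session-scoped fixture.
# It must live in (or be imported into) conftest.py for pytest to discover it.
import pytest


@pytest.fixture(scope="package", autouse=True)
def pkg_resource():
    resource = {"connected": True}   # set up once per package
    yield resource
    resource["connected"] = False    # torn down after the package's tests
```

Any test module in the same package then receives the setup/teardown automatically, without requesting `pkg_resource` as an argument.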
@bhack I tried to install exactly as written in the README, but used `pip install .` instead of `pip install -e .`, because in my case `pip install -e .`...
Thank you for pointing out what I missed.
Compared to Flash Attention, the Triton implementation initializes tensors with `torch.zeros` instead of `torch.empty`, and `torch.zeros` takes a lot of time on large tensors. Even aside from that, the backward pass itself is a bit slower.
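A small sketch of why this matters, assuming an allocation-heavy workload: `torch.empty` only reserves memory, while `torch.zeros` additionally writes zeros over the whole buffer, which is extra O(n) work on large tensors (the tensor size below is an arbitrary choice for illustration):

```python
import time

import torch

# A reasonably large buffer; the exact size is an assumption for illustration.
n = 4096 * 4096

t0 = time.perf_counter()
a = torch.empty(n)   # allocation only; contents are uninitialized
t_empty = time.perf_counter() - t0

t0 = time.perf_counter()
b = torch.zeros(n)   # allocation plus a full memset to zero
t_zeros = time.perf_counter() - t0

print(f"empty: {t_empty:.6f}s, zeros: {t_zeros:.6f}s")
```

On large sizes the `zeros` timing is typically noticeably larger than the `empty` timing, since it includes initializing every element.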