TheWingAg90

4 comments by TheWingAg90

> Which version of PyTorch do you have? Could you try using the [nightly version](https://pytorch.org/get-started/locally/)?

Wow, wonderful. I fixed it by installing a fresh ComfyUI; in my old ComfyUI, my PyTorch was 2.3....

Thanks. In my country it is 12:41, so I will try tomorrow. Good night :D and have a nice day.

> Does Torch 2.8 work with CUDA 12.6?

In my case:

pytorch version: 2.6.0+cu126
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : native
Using sage attention...
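To double-check which PyTorch build and CUDA toolkit a ComfyUI environment is actually using, a minimal sketch like the one below can be run inside that environment's Python. This is just a generic PyTorch check, not anything specific to ComfyUI:

```python
# Quick environment check: prints the installed PyTorch build, the CUDA
# toolkit it was compiled against, and the detected GPU.
import torch

print("PyTorch version:", torch.__version__)       # e.g. 2.6.0+cu126
print("Built with CUDA:", torch.version.cuda)      # e.g. 12.6
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # e.g. NVIDIA GeForce RTX 3090
    print("Device:", torch.cuda.get_device_name(0))
```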

> > [@deadman3000](https://github.com/deadman3000) can you try what [@TheWingAg90](https://github.com/TheWingAg90) suggested?
>
> > And I think the "SM89 error" happens when you choose "..fp8_cuda". You can choose fp16 cuda or fp16 triton instead.

...
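For context: fp8 kernels generally require an Ada-class GPU (compute capability 8.9, i.e. "SM89") or newer, while the RTX 3090 reports compute capability 8.6, which would explain the error with the fp8_cuda option. A minimal sketch to check what a card supports, assuming a standard PyTorch install (the 8.9 threshold is the usual hardware requirement for fp8, not something specific to this project):

```python
# Minimal sketch: check whether the GPU meets the compute capability that
# fp8 kernels typically require (SM 8.9 / Ada or newer). An RTX 3090
# reports (8, 6), so fp16 cuda / fp16 triton is the usual workaround.
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"Compute capability: sm_{major}{minor}")
    if (major, minor) >= (8, 9):
        print("Native fp8 should be supported on this GPU.")
    else:
        print("No native fp8 support; use an fp16 attention mode instead.")
else:
    print("No CUDA device detected.")
```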