
How do I clean install ComfyUI with xFormers (that work) and prevent anything from changing?

Open Vektor369 opened this issue 1 year ago • 16 comments

Your question

How do I completely uninstall all remnants of ComfyUI, and reinstall it with xFormers (that work with it) and then prevent ComfyUI, PyTorch, CUDA, xFormers, or anything else from updating or changing at all?

Whatever happened with the whole PyTorch 2.4.1/CUDA 12.4/xFormers/"Torch not compiled with CUDA enabled" thing has broken ComfyUI. I've tried deleting it and reinstalling it, but no matter what I do, it just freezes after listing my RAM and the PyTorch version.

I'd really just like to get it running, like I had before, and then stop absolutely anything from changing so I can actually use it for a little while without worrying every time I close it that everything's going to break again.

Thanks in advance for any help.

Logs

No response

Other

No response

Vektor369 avatar Sep 10 '24 07:09 Vektor369

me too, this trash update broke everything for me also

BenDes21 avatar Sep 10 '24 08:09 BenDes21

Do you need xformers? To my knowledge it doesn't provide any speed benefit anymore since Torch has a native implementation, though possibly that's only true for my hardware (a 3060). You can just uninstall it if it's causing problems.

As for keeping the environment stable... Just don't update the virtualenv? Instead of running the updater script, you can use git pull to update ComfyUI without touching the virtualenv.
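The "don't touch the virtualenv" approach can be made concrete by recording the exact package versions once everything works. A minimal sketch, assuming the venv is active (the file name known-good.txt is just an example):

```shell
# Record the exact working package versions (--all includes pip itself) so
# the environment can be restored if something later changes it.
python3 -m pip freeze --all > known-good.txt

# Later, if anything drifts, reinstall exactly those versions:
# python3 -m pip install -r known-good.txt
```

Updating ComfyUI with git pull then never touches these pinned packages.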

asagi4 avatar Sep 10 '24 15:09 asagi4

All I know is that I was running a simple ToonCrafter workflow just fine. The next time I started ComfyUI, a bunch of stuff updated, then I started getting the message, "Torch not compiled with CUDA enabled", and everything just went to crap. I didn't update anything, mind you. It just happened the next time I started ComfyUI.

Now, well...now I can't even get it to start, but before it broke completely, it kept running out of memory. (I have a 3070.) Same exact workflow. Same exact files. Same exact everything. That's when I noticed that it was saying xFormers wasn't being used. I'm assuming because xFormers wasn't working with the CUDA 12.4 and PyTorch 2.4.1 that had been updated. So, I just want to go back to where it was before everything updated, and prevent anything from updating again.

Vektor369 avatar Sep 10 '24 16:09 Vektor369

Just uninstall xformers.

ltdrdata avatar Sep 10 '24 16:09 ltdrdata

Pretty sure the lack of xFormers is the reason I started running out of memory, but I tried uninstalling it anyway. Still nothing. It gets this far:

F:\AI\ComfyUI>python main.py
Total VRAM 8192 MB, total RAM 32648 MB
pytorch version: 2.3.1+cu121
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3070 : cudaMallocAsync
Using pytorch cross attention

And just sits there. Mocking me. lol
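When startup hangs like this, a quick check of what is actually installed (without launching ComfyUI at all) can narrow things down. A sketch using pip's own listing; nothing here is ComfyUI-specific:

```shell
# List the installed versions of the packages this thread keeps circling
# around, without importing them (importing a broken torch can itself hang).
python3 -m pip list 2>/dev/null | grep -Ei '^(torch|torchvision|torchaudio|xformers) ' \
  || echo "none of torch/torchvision/torchaudio/xformers installed"
```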

Vektor369 avatar Sep 10 '24 17:09 Vektor369

You can try installing multiple versions of ComfyUI, such as the latest one and an xformers-compatible one

ye-pei-sheng avatar Sep 10 '24 22:09 ye-pei-sheng

I don't think it's the version of ComfyUI so much as it's the idea that PyTorch/CUDA keeps updating so xFormers no longer works. I need to know how to prevent that from happening. I only have 8GB of VRAM, so ToonCrafter needs xFormers in order to work.

Also, I've now had to resort to using the portable install because the regular install just freezes at the terminal. Also, I'm apparently on Python 3.11.9 now! (Last I checked, I was on 3.10.8.) I'm just surprised it hasn't updated me to Windows 11 yet! lol Waitaminute-- It hasn't...right...?

Vektor369 avatar Sep 10 '24 22:09 Vektor369

Try "pip install torch torchvision torchaudio xformers==0.0.28.dev895"; it is a development version compatible with PyTorch 2.4.1. Outside of certain extensions that require xFormers, I haven't noticed a significant performance difference over native SDP.

mmx31 avatar Sep 10 '24 23:09 mmx31

Unfortunately, ToonCrafter is one such extension. ToonCrafter Decode is apparently a monster, and requires xFormers.

I have xFormers installed, and it appears to be playing nicely with PyTorch. I just keep running out of memory. So either I accidentally changed something somewhere along the way and just can't figure out what it is, or maybe xFormers just isn't working with the newer PyTorch as well or something.

So, there's seriously no way to prevent stuff from updating? I feel like I'm working with an unstable time-bomb. It's already cost me the ability to use ToonCrafter, to run ComfyUI natively, and so much time.

[screenshot: tooncrafter-xformers-required]

Vektor369 avatar Sep 10 '24 23:09 Vektor369

This is indeed a tricky issue. While ComfyUI-Manager itself has measures in place to prevent torch from being reinstalled, there are occasional cases where packages with indirect dependencies attempt to change the torch version.

As a last resort, I am considering an option to always provide a frozen version of torch as an additional dependency when installing dependencies. (In this case, it might lead to some installation failures due to dependency resolution issues.)
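For anyone who wants this behavior today, pip's constraints files approximate the idea manually. A sketch (the torch version is taken from the log earlier in this thread; adjust it to your own known-good version):

```shell
# Pin torch so no later "pip install" (direct or as an indirect
# dependency) can resolve to a different version.
printf 'torch==2.3.1+cu121\n' > constraints.txt

# Pass the constraints file to every install; pip will then refuse to
# change torch away from the pinned version:
# pip install -r some_node/requirements.txt -c constraints.txt
```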

ltdrdata avatar Sep 11 '24 02:09 ltdrdata

Well, if I had to guess, ToonCrafter is your culprit. AFAIK Comfy does not update anything unless you run an update or... an extension does so on its own. If I were you, I'd drop a message on the TC git, or check its issues, and see what the author has to say about it.

Also, on an 8GB card you can use either xformers or torch.

Even before the move to PyTorch 2.4.x the differences were negligible (xformers was slightly faster in some environments), and 2.4.x torch is now generally faster than xformers (version 0.0.28.dev895). If you're getting OOMs, it is more likely due to changes in ComfyUI, TC, or some other node you're using.

You can try adding "--disable-smart-memory --fast --reserve-vram 0.5", or even "--cache-classic", to your command line to see if the memory gets managed differently.

Also, when you're in a venv, Python and the other packages live in their own container, separate from whatever is on your computer, as long as you're executing the command properly. So the really confusing part here is why you're having problems running a fresh install. Is it just core Comfy? Or are you reinstalling it, adding nodes, then running it? Have you tried running it with the "--disable-all-custom-nodes" flag?

Looking at the ToonCrafter GitHub, it hasn't been updated (besides a logo and some links) in 4 months. Not that this is a bad thing, but Comfy has undergone quite a few MAJOR updates over that time, and maybe there is a conflict with memory management. It also mentions that users have gotten VRAM usage as low as 10GB, which implies to me that if you have 8GB and were using it before, you were lucky, and unless you can re-create that exact environment as it was, you may have trouble replicating it. Since the author did update the readme 2 days ago, I'm sure he would have changed that to 8GB if that were indeed the case.

Another thought: is your RAM good? I have had unexplainable things happen to me when RAM or NVMe storage was going bad. Just figured I'd mention it.

You could look at the date of the last change for TC (b1d9cf9), then find a version of Comfy from that time (71ec5b1), install that, and see how it goes.

Lastly, if you don't have @ltdrdata's Manager, you should get it. Then get in the habit of making snapshots in it before making any updates. Had you done that, you could have reverted to your last known working version(s). Plus, as was mentioned, it will block many nodes from running updates for things such as pytorch and xformers. Last I knew, Reactor was (I think) another node that would pull updates/downgrades and totally break the environment if you didn't know what you were doing.

Edit: Also, 32GB of RAM is pretty low; shoot for 48GB minimum, ideally 64GB+.
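The commit-pinning idea above can be sketched as follows. It is demonstrated on a throwaway repo, since the real hashes (b1d9cf9, 71ec5b1) belong to those specific projects; substitute your actual ComfyUI clone and commit:

```shell
# Demonstrate rolling a checkout back to a known-good commit on a toy repo.
rm -rf /tmp/pin-demo && mkdir -p /tmp/pin-demo && cd /tmp/pin-demo
git init -q .
echo "working state" > file.txt
git add file.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm "known good"
GOOD=$(git rev-parse HEAD)            # record the known-good commit
echo "broken update" > file.txt
git -c user.email=demo@example.com -c user.name=demo commit -qam "update"
git checkout -q "$GOOD"               # roll back to the known-good state
cat file.txt                          # prints: working state
```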

MDMAchine avatar Sep 11 '24 03:09 MDMAchine

Thank you so much for the detailed response!

Well, I added the memory commands, and it worked! Took them back out, and it still worked. lol In fact, it’s running everything now. Even the ‘not-low-vram’ version of the workflow. So, I honestly have no idea what’s going on. I know it’s not that my computer needed restarting because I tried that. And flushed memory. Also, I haven’t restarted it today.

I have no ideas about why the fresh install isn’t working. I’ve tried a few times, and followed the installation instructions exactly. It’s just core comfy. No custom nodes. I didn’t even try to add xFormers or anything like that. Just literally followed the instructions and ran it, but it just freezes. I didn’t try it with the disable custom nodes command, though, so I’ll try that.

Yeah, to be honest, ToonCrafter is like the one thing I follow, and I’m kinda bummed it hasn’t been updated since it came out. And I know I’m fortunate to be able to run it at all on my lowly 8GB of VRAM. I do keep a very clean system though, actively go through and get rid of everything but the essentials in my startup, etc. If I had only gotten into AI like, six months before I bought this machine, I would have gotten less processor, more VRAM or something like that.

Yeah, I’ve heard before how RAM can do wacky things. I’m running Patriot’s Viper Steel, so it’s pretty good, and it checks-out okay. I do know 32GB is a little on the lower end though. I’ve been meaning to get some more.

It’s funny you should mention Snapshot in ComfyUI Manager. I had Manager installed, but didn’t know about Snapshot until after all of this went down. But how do I get it to block nodes from updating things? Like I said, I had it installed before, but everything just seemed to be updating at will, including Python! You just can’t have that kind of thing happening in any sort of working environment.

Thanks again for the response. That’s really cool.

Vektor369 avatar Sep 11 '24 21:09 Vektor369

The current snapshot feature does not yet handle pip-related issues. I am approaching this aspect rather cautiously: restoring a completely identical package dependency state can cause significant installation changes when switching snapshots, which could be very time-consuming.

Therefore, I am considering making that level of restoration optional when performing a restore.

ltdrdata avatar Sep 12 '24 05:09 ltdrdata

The torch repo has a release wheel of xformers that works with torch 2.4.1: "pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu124" should pull 0.0.28.post1.

The last time I tried to just use PyTorch's memory-efficient attention, it wasn't working right on Windows and it would OOM on my regular upscales. The speed wasn't all that different, but that doesn't matter if I'm running out of memory on a 24GB card. If I wanted to do that, I'd have just kept using my 7900XTX with Comfy :P

NeedsMoar avatar Sep 16 '24 20:09 NeedsMoar

It is best to install two versions of ComfyUI: one with Python 3.10 and at most PyTorch 2.2, and one with Python 3.11, current PyTorch 2.4, and WITHOUT xformers, plus diffusers 0.3.3, e.g. for CogVideo-X. AnimateDiff can cope with at most diffusers 0.2.7. I hope this helps.
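Keeping two side-by-side installs is straightforward with separate virtualenvs. A sketch of the mechanics (directory names are just examples):

```shell
# Two independent venvs so the two package stacks can never interfere.
# Each venv inherits the interpreter that creates it, so in practice you
# would run this once with python3.10 and once with python3.11; here we
# use whatever python3 is available just to show the mechanics.
python3 -m venv /tmp/comfy-legacy-venv
python3 -m venv /tmp/comfy-current-venv
/tmp/comfy-current-venv/bin/python --version
```

Each ComfyUI clone is then launched with its own venv's python, and updating one stack cannot break the other.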

aimpowerment avatar Oct 07 '24 21:10 aimpowerment

This issue is being marked stale because it has not had any activity for 30 days. Reply below within 7 days if your issue still isn't solved, and it will be left open. Otherwise, the issue will be closed automatically.

github-actions[bot] avatar Nov 10 '24 11:11 github-actions[bot]

Thank you so much, it worked.

ialhabbal avatar Nov 25 '24 13:11 ialhabbal

Hello, I have an RTX 2080, 64GB RAM, and a Core i9, and it won't let me install xFormers. Can someone help me? I've tried several options without getting good results.

Manueldavoin-On avatar Jul 23 '25 23:07 Manueldavoin-On