Liu Sida
@Capo7 When sending a POST request to the "/prompt" API, please include the attribute `"front": true` in your request payload, as shown in the example below: 
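A minimal sketch of such a request in Python. Only the `"front": true` attribute comes from the comment above; the server address and the shape of the `prompt` field are assumptions, so adapt them to your setup:

```python
import json
import urllib.request

def build_prompt_payload(workflow):
    """Build a /prompt request body that jumps the queue."""
    return {
        "prompt": workflow,  # the workflow graph (node-id -> node dict); assumed shape
        "front": True,       # insert this prompt at the front of the queue
    }

def queue_prompt(workflow, server="http://127.0.0.1:8188"):
    """POST the workflow to ComfyUI's /prompt endpoint."""
    data = json.dumps(build_prompt_payload(workflow)).encode("utf-8")
    req = urllib.request.Request(
        server + "/prompt",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```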
After a bit of digging, I found this function is all you need to release all the resources: https://github.com/comfyanonymous/ComfyUI/blob/30abc324c2f73e6b648093ccd4741dece20be1e5/comfy/model_management.py#L842 But it's still a question how to call this function in...
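One possible way to wire it up, just a sketch and not taken from the linked code: an idle timer that invokes whatever cleanup function you choose after a period of inactivity. Here `cleanup_fn` is a stand-in for the function linked above:

```python
import threading

def start_idle_unloader(cleanup_fn, idle_seconds=300):
    """Call cleanup_fn once no activity has been seen for idle_seconds.

    cleanup_fn is a stand-in for whatever resource-release function you
    want to run (e.g. the one linked in the comment above).
    """
    state = {"timer": None}

    def poke():
        # Call this on every queued prompt / ws message to reset the idle timer.
        if state["timer"] is not None:
            state["timer"].cancel()
        t = threading.Timer(idle_seconds, cleanup_fn)
        t.daemon = True  # don't keep the process alive just for the timer
        t.start()
        state["timer"] = t

    poke()  # start counting idle time immediately
    return poke
```

The returned `poke` callable would need to be hooked into the server's request handlers, which is exactly the open question in the comment.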
When a user closes the window, the websocket connection is closed, and this line is reached: https://github.com/comfyanonymous/ComfyUI/blob/30abc324c2f73e6b648093ccd4741dece20be1e5/server.py#L117 But I don't see any cleanup for that user, so...
I opened a PR that tries to free up VRAM when idle. @Lucas-BLP. But I don't anticipate it'll get reviewed soon, given there are so many PRs waiting in the...
> Actually, your solution is nice because, if I understand correctly, when the queue is empty the VRAM will free up (anyway, the models still remain in RAM so the load...
> This worked perfectly for us: #3229 > > Simply call the route `/free`, to unload models and/or free memory. Pretty sure this issue should be closed as it is...
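A hedged sketch of calling such a `/free` route from Python. The `unload_models` / `free_memory` field names are my assumption about the route's request body, so check the PR for the actual fields:

```python
import json
import urllib.request

def build_free_payload(unload_models=True, free_memory=True):
    """Request body for the /free route (field names assumed; see the PR)."""
    return {"unload_models": unload_models, "free_memory": free_memory}

def free_vram(server="http://127.0.0.1:8188", **flags):
    """POST to /free, asking the server to unload models and/or free memory."""
    data = json.dumps(build_free_payload(**flags)).encode("utf-8")
    req = urllib.request.Request(
        server + "/free",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

You could call `free_vram()` from a cron job or from a client when it disconnects, which is roughly what this thread is after.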
@pixelass, a cool extension! And a pure JS solution is very attractive. Is there a way to automatically call this when the user simply closes the browser and leaves? People...
Hi @Mohsyn, I just did a `git pull`, and my ComfyUI doesn't have this problem. Maybe try updating your ComfyUI? Also, consider using a different browser or Incognito mode...
> Using your modified code, I found that after the operation was completed, the running memory was not effectively released. It also checks whether the ws connections are all...
Or, at least as an alternative, we could add callback functions that fire after the queue is processed and after a ws connection is closed, so that custom nodes can properly do their cleanup...