SaturMars

5 comments by SaturMars

> > I added `--gpu-device-id`: [74ff4a9](https://github.com/lllyasviel/stable-diffusion-webui-forge/commit/74ff4a9ba95ad1d92d48f7bac5f0de8c0c15e398)
> >
> > Great, how can I use it so it can use two graphics cards? `--gpu-device-id:0,1`?

--gpu-device-id 0 or --gpu-device-id...
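A minimal sketch of the single-id-per-process reading of that flag (the `launch.py` entry point, the `--port` flag, and the port numbers below are assumptions, not confirmed by the thread): run one Forge instance per GPU rather than passing a list of ids.

```python
# Hedged sketch: if --gpu-device-id takes a single id, using two GPUs means
# starting two separate Forge processes, one bound to each device.
import subprocess

procs = [
    subprocess.Popen(
        ["python", "launch.py",
         "--gpu-device-id", str(gpu_id),   # one GPU per instance
         "--port", str(7860 + gpu_id)]     # distinct port per instance (assumed flag)
    )
    for gpu_id in (0, 1)
]

for proc in procs:
    proc.wait()
```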

If llama_cpp is not installed, this error is raised even when the large language model is turned off in the settings. A simple fix is to change line 172 of old_six_prompt.py to the following: `transObj['llmName'] = postData.get('llmName', '')`. Too lazy to open a PR...
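A runnable sketch of what that one-line fix does (the wrapper function and the sample data are hypothetical; only the assignment itself comes from the comment): `.get()` with a default keeps a missing `llmName` from raising a `KeyError` when llama_cpp is absent or the LLM is disabled.

```python
def apply_llm_name(transObj: dict, postData: dict) -> None:
    # The replacement line from the comment: fall back to '' instead of
    # indexing postData['llmName'] directly, which would raise KeyError.
    transObj['llmName'] = postData.get('llmName', '')

# Hypothetical usage: a request payload with no llmName field at all.
transObj, postData = {}, {}
apply_llm_name(transObj, postData)
assert transObj['llmName'] == ''
```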

I have created a branch with initial support for version 1.8 and submitted a PR. If you need it urgently, you can use it at the following URL, or you can...

> > I have created a branch with initial support for version 1.8 and submitted a PR. If you need it urgently, you can use it at the following URL, or...

> workflow:
>
> [ComfyUI_00626_ (2).json](https://github.com/user-attachments/files/24322978/ComfyUI_00626_.2.json)

SEE HERE: https://github.com/kohya-ss/musubi-tuner/blob/main/docs/zimage.md#converting-lora-weights-to-comfyui-format--lora%E9%87%8D%E3%81%BF%E3%82%92comfyui%E5%BD%A2%E5%BC%8F%E3%81%AB%E5%A4%89%E6%8F%9B%E3%81%99%E3%82%8B You need to convert the LoRA to ComfyUI format first...
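If it helps to verify the conversion, here is a small, hedged sketch (the file name is just an example) that lists a LoRA's tensor key names with safetensors, so you can compare them against the ComfyUI naming described in the linked zimage.md before loading the file in ComfyUI:

```python
# Inspect a LoRA's tensor keys (example path) to check whether the file has
# already been converted to the ComfyUI naming described in the linked doc.
from safetensors.torch import load_file

state_dict = load_file("zimage_lora.safetensors")  # hypothetical file name
for key in sorted(state_dict)[:20]:                # print a sample of the keys
    print(key)
```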