Install the custom node "ComfyUI_IPAdapter_plus" or "ComfyUI-InstantID" (8GB+ VRAM).
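If you're installing manually rather than through ComfyUI-Manager, a minimal sketch (the portable path and repo URL below are assumptions; adjust to your own setup):

```
:: Sketch: clone a custom node into ComfyUI's custom_nodes folder
:: (path and repo URL are assumptions -- adjust to your install)
cd D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes
git clone https://github.com/cubiq/ComfyUI_IPAdapter_plus
:: restart ComfyUI afterwards so the new nodes get loaded
```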
> That is the last version of Transformers that the Transformers BLIP code works on, which is why it's pinned. A lot of people still use BLIP, and most can't run...
> The code I am using is not compatible. If you know how to PR an upgrade of the code to use a newer Transformers, that would be fine. But I...
> #393
>
> Refactored BLIP!
>
> FYI, the reason Art Venture's BLIP works is that it's literally the original BLIP code copy-pasted. It's not Transformers-based.

**Thanks a...
Indeed, it plays fine in mpv, but not in PotPlayer, not even when using LAV.
> > Indeed, it plays fine in mpv, but not in PotPlayer, not even when using LAV.
>
> Switching to LAV decoding should make it play normally. Try forcing LAV decoding in the filter settings.

I did my best; it still doesn't work.
Install **llama_cpp_python** using a whl file from **https://github.com/abetlen/llama-cpp-python/releases**. Choose the prebuilt whl matching your ComfyUI environment's CUDA version.
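For example, after downloading a matching wheel, the install with the portable build's embedded Python looks roughly like this (the path and wheel filename are placeholders, not exact asset names; pick the release asset matching your Python and CUDA versions):

```
:: Sketch: install a downloaded prebuilt wheel into ComfyUI's
:: embedded Python (path and filename are placeholders)
D:\AI\ComfyUI_windows_portable\python_embeded\python.exe -m pip install C:\Downloads\llama_cpp_python-<version>-cp311-cp311-win_amd64.whl
```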
Check the console log when you start ComfyUI.
**Check this folder.** My ComfyUI env:
- torch 2.2.2 + CUDA 12.1
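The wheel's CUDA build has to match the torch build, so it's worth printing what the embedded Python actually has (a quick check; the path is an assumption based on the standard portable layout):

```
:: Print torch version, the CUDA version it was built against,
:: and whether a GPU is visible (adjust the path to your install)
D:\AI\ComfyUI_windows_portable\python_embeded\python.exe -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
```

For an env like the one above, this should print something like `2.2.2+cu121 12.1 True`, which points at the cu121 wheels.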
> that's the error I keep getting

## Or try installing with this whl file (without CUDA):
- https://github.com/abetlen/llama-cpp-python/releases/download/v0.2.61/llama_cpp_python-0.2.61-cp311-cp311-win_amd64.whl
- Command (on my laptop; you need to modify the path): D:\AI\ComfyUI_windows_portable\python_embeded\python.exe -m pip...
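Whichever wheel you install, a quick sanity check that it actually landed in ComfyUI's embedded Python (same assumed portable path as above):

```
:: Confirm llama_cpp imports and print its version
:: (adjust the path to your own portable install)
D:\AI\ComfyUI_windows_portable\python_embeded\python.exe -c "import llama_cpp; print(llama_cpp.__version__)"
```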