Import failed: Searge LLM for ComfyUI v1.0 - Custom Node ID 93
Your question
Hello, great work and thanks a lot for that. Searge LLM runs without problems in my portable version, but when I use the ComfyUI setup file (desktop app) and then try to install it, I get errors.
Thank you for all your help
Logs
Checkpoint files will always be loaded safely.
Total VRAM 12282 MB, total RAM 32478 MB
pytorch version: 2.6.0+cu124
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4070 Ti : cudaMallocAsync
Using pytorch attention
ComfyUI version: 0.3.13
[Prompt Server] web root: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\web_custom_versions\desktop_app
Traceback (most recent call last):
File "C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM\Searge_LLM_Node.py", line 13, in <module>
Llama = importlib.import_module("llama_cpp_cuda").Llama
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\apris\miniconda3\Lib\importlib\__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'llama_cpp_cuda'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\nodes.py", line 2110, in load_custom_node
module_spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 995, in exec_module
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
File "C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM\__init__.py", line 1, in <module>
from .Searge_LLM_Node import *
File "C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM\Searge_LLM_Node.py", line 15, in <module>
Llama = importlib.import_module("llama_cpp").Llama
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\apris\miniconda3\Lib\importlib\__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'llama_cpp'
Cannot import C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM module for custom nodes: No module named 'llama_cpp'
### Loading: ComfyUI-Manager (V2.55.3)
### ComfyUI Revision: UNKNOWN (The currently installed ComfyUI is not a Git repository)
Import times for custom nodes:
0.0 seconds: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\websocket_image_save.py
0.0 seconds (IMPORT FAILED): C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM
0.0 seconds: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager
Starting server
To see the GUI go to: http://127.0.0.1:8000
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1514988643_custom-node-list.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\746607195_github-stats.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1514988643_custom-node-list.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\746607195_github-stats.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]
Other
No response
@aprisma2008 You are missing llama_cpp_python, a dependency that Searge LLM needs to run the models.
The link provided by the Searge node installs it for Python 3.11, but the desktop app uses Python 3.12, so that wheel is not compatible.
You can install it manually, though. Check the ComfyUI desktop user guide for more information.
Open the integrated terminal and use this command to install the pre-compiled wheel (it's for Python3.12+Windows+CUDA only):
python -m pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl
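Not from the thread, but a quick way to sanity-check the result: after the wheel installs, you can verify from the same integrated terminal that the interpreter ComfyUI actually uses can see the module (no model or GPU needed for this check).

```shell
# Check whether the interpreter can locate llama_cpp at all.
# Prints "found" if the module is importable, "missing" otherwise.
python -c "import importlib.util; spec = importlib.util.find_spec('llama_cpp'); print('found' if spec else 'missing')"
```

If it prints "missing", the wheel went into a different Python than the one the desktop app runs; if it prints "found", restart the app and the Searge LLM node should import.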
Have you managed to install llama-cpp-python?
Since he didn't answer, I will: the command line worked, and Searge LLM now works. Thanks!
Thank you! Worked for me as well.
Thanks for this! Where do we run this command? In the cmd for the python_embedded folder? Or in the custom_nodes/comfyui_searge_llm folder?
Sorry, I didn't see that. I'm now running two installations of ComfyUI as a temporary solution.
Thanks for this! Where do we run this command? In the cmd for the python_embedded folder? Or in the custom_nodes/comfyui_searge_llm folder?
Use this command in the python_embedded folder
.\python.exe -m pip https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl
Must use .\python.exe not python.exe
Could you please share a link or wheels for Cuda128 with Python 3.12 or 3.13? We 50 Series (Blackwell owners) are suffering.
Unfortunately, they do not provide a prebuilt wheel for cu128 yet. For now, the best option is to request it on that repository. https://github.com/abetlen/llama-cpp-python/releases
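Until a cu128 wheel is published, building from source is one possible workaround. llama-cpp-python supports CUDA builds via the CMAKE_ARGS environment variable; this is a sketch for Windows cmd, assuming the CUDA toolkit and Visual Studio build tools are already installed and that you run it from the python_embedded folder as in the posts above.

```shell
REM Sketch only: compile llama-cpp-python against your locally installed CUDA toolkit
REM instead of downloading a prebuilt wheel. The build can take a long time.
set CMAKE_ARGS=-DGGML_CUDA=on
.\python.exe -m pip install llama-cpp-python --no-cache-dir --force-reinstall
```

I haven't verified this on a 50-series card; whether the resulting binary supports Blackwell depends on the CUDA toolkit version you build against.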
Running the above command gives this:
.\python.exe -m pip https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl
ERROR: unknown command "https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl"
There is no install command in what you ran; pip needs its install subcommand before the URL. Do this:
.\python.exe -m pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl
Thank you very much. It worked with the recommended command line.
Hello, it works. Thank you, and please keep up the great work.
This worked for me
I have a 5070 running desktop ComfyUI. The command works and the custom node loads, but ComfyUI crashes immediately when I use the Searge LLM node.