
Import failed: Searge LLM for ComfyUI v1.0 - Custom Node ID 93

Open aprisma2008 opened this issue 1 year ago • 15 comments

Your question

Hello, great work and thanks a lot for it. Searge LLM runs without problems in my portable version, but when I use the ComfyUI setup file (desktop app) and then try to install it, I get errors.

Thank you for all your help

Logs

Checkpoint files will always be loaded safely.
Total VRAM 12282 MB, total RAM 32478 MB
pytorch version: 2.6.0+cu124
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4070 Ti : cudaMallocAsync
Using pytorch attention
ComfyUI version: 0.3.13
[Prompt Server] web root: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\web_custom_versions\desktop_app
Traceback (most recent call last):
  File "C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM\Searge_LLM_Node.py", line 13, in <module>
    Llama = importlib.import_module("llama_cpp_cuda").Llama
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\apris\miniconda3\Lib\importlib\__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'llama_cpp_cuda'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\nodes.py", line 2110, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM\__init__.py", line 1, in <module>
    from .Searge_LLM_Node import *
  File "C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM\Searge_LLM_Node.py", line 15, in <module>
    Llama = importlib.import_module("llama_cpp").Llama
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\apris\miniconda3\Lib\importlib\__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'llama_cpp'

Cannot import C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM module for custom nodes: No module named 'llama_cpp'
### Loading: ComfyUI-Manager (V2.55.3)
### ComfyUI Revision: UNKNOWN (The currently installed ComfyUI is not a Git repository)

Import times for custom nodes:
   0.0 seconds: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\websocket_image_save.py
   0.0 seconds (IMPORT FAILED): C:\Users\apris\Documents\ComfyUI\custom_nodes/ComfyUI_Searge_LLM
   0.0 seconds: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager

Starting server

To see the GUI go to: http://127.0.0.1:8000
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1514988643_custom-node-list.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\746607195_github-stats.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1514988643_custom-node-list.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\746607195_github-stats.json [DONE]
FETCH DATA from: C:\Users\apris\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\.cache\1742899825_extension-node-map.json [DONE]

Other

No response

aprisma2008 · Feb 06 '25 16:02
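The two tracebacks in the log above come from the node's import strategy: it tries llama_cpp_cuda first and falls back to plain llama_cpp, so both failures are reported when neither package is installed. A minimal sketch of that pattern (import_first is an illustrative name, not the node's actual helper):

```python
import importlib

def import_first(*names):
    """Return the first module from `names` that imports successfully.

    Mirrors the two-step lookup in Searge_LLM_Node.py: try
    "llama_cpp_cuda", then fall back to "llama_cpp".
    """
    for name in names:
        try:
            return importlib.import_module(name)
        except ModuleNotFoundError:
            continue
    raise ModuleNotFoundError(f"none of {names} is installed")

# The node effectively does:
#   Llama = import_first("llama_cpp_cuda", "llama_cpp").Llama
# which is why the log shows a traceback for each name when neither
# variant of llama-cpp-python is present.
```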

@aprisma2008 You are missing llama_cpp_python, a dependency that Searge LLM needs to run the models.

The link provided by Searge node can install it for Python 3.11, but the desktop app is using Python 3.12, so it's not compatible.

But you can manually install it. Check the ComfyUI desktop user guide for more information.

Open the integrated terminal and use this command to install the pre-compiled wheel (it's for Python3.12+Windows+CUDA only):

python -m pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl

LukeG89 · Feb 06 '25 16:02
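Before installing, it can help to confirm that the wheel's tags match the running interpreter, since a cp312/win_amd64 wheel only fits Python 3.12 on 64-bit Windows. This hypothetical helper parses the tags from the wheel filename (WHEEL and wheel_tags are illustrative names, not part of Searge LLM or pip):

```python
# Hypothetical check: parse the python/abi/platform tags out of a wheel
# filename and compare them with the running interpreter.
import sys
import sysconfig

WHEEL = "llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl"

def wheel_tags(wheel_name):
    # Wheel filenames follow: name-version-python_tag-abi_tag-platform_tag.whl,
    # so the last three dash-separated fields are the tags.
    parts = wheel_name[: -len(".whl")].split("-")
    return parts[-3], parts[-2], parts[-1]

py_tag, abi_tag, plat_tag = wheel_tags(WHEEL)
interpreter_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
platform_tag = sysconfig.get_platform().replace("-", "_").replace(".", "_")

# The wheel only installs cleanly when both pairs line up:
print(py_tag, "vs", interpreter_tag)   # cp312 vs your interpreter's tag
print(plat_tag, "vs", platform_tag)    # win_amd64 vs your OS/arch
```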

Have you managed to install llama-cpp-python?

LukeG89 · Feb 09 '25 17:02

Have you managed to install llama-cpp-python? Since he didn't answer, I will: the command line worked and now Searge LLM works - thanks!

Fiveosix · Feb 18 '25 18:02

Thank you! Worked for me as well.

AiInspiration · Apr 12 '25 14:04

Thanks for this! Where do we run this command? In the cmd for the python_embedded folder? Or in the custom_nodes/comfyui_searge_llm folder?

bluetimejt · Apr 16 '25 11:04

Sorry, I didn't see this. I now use two installations of ComfyUI as a temporary solution.

aprisma2008 · Apr 17 '25 22:04

Thanks for this! Where do we run this command? In the cmd for the python_embedded folder? Or in the custom_nodes/comfyui_searge_llm folder?

Use this command in the python_embedded folder

.\python.exe -m pip https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl

Must use .\python.exe not python.exe

ltdrdata · Apr 18 '25 00:04

Thanks for this! Where do we run this command? In the cmd for the python_embedded folder? Or in the custom_nodes/comfyui_searge_llm folder?

Use this command in the python_embedded folder

.\python.exe -m pip https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl

Must use .\python.exe not python.exe

Could you please share a link or wheels for CUDA 12.8 with Python 3.12 or 3.13? We 50-series (Blackwell) owners are suffering.

5olito · May 01 '25 12:05

Thanks for this! Where do we run this command? In the cmd for the python_embedded folder? Or in the custom_nodes/comfyui_searge_llm folder?

Use this command in the python_embedded folder

.\python.exe -m pip https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl

Must use .\python.exe not python.exe

Could you please share a link or wheels for CUDA 12.8 with Python 3.12 or 3.13? We 50-series (Blackwell) owners are suffering.

Unfortunately, they do not provide a prebuilt wheel for cu128 yet. For now, the best option is to request it on that repository. https://github.com/abetlen/llama-cpp-python/releases

ltdrdata · May 02 '25 01:05
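Until a cu128 wheel is published, one fallback described in the llama-cpp-python README is building from source against a locally installed CUDA toolkit. This assumes the Visual Studio build tools and the CUDA 12.8 toolkit are installed, and the compile can take a long while; a sketch for the desktop app's embedded Python on Windows:

```shell
rem Run inside the python_embedded folder (Windows cmd).
rem Assumes MSVC build tools + CUDA 12.8 toolkit are installed locally.
set CMAKE_ARGS=-DGGML_CUDA=on
.\python.exe -m pip install llama-cpp-python --no-cache-dir
```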

Running the above command gives this:

.\python.exe -m pip https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl
ERROR: unknown command "https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl"

amarssadal · May 03 '25 11:05

Running the above command gives this:

.\python.exe -m pip https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl
ERROR: unknown command "https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl"

The install command is missing there.

Do this: .\python.exe -m pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl

doubletwisted · May 13 '25 09:05

Thank you very much. It worked with the recommended command line.

adrians78 · May 25 '25 17:05

Hello, it works. Thank you. Please keep up the great work.

aprisma2008 · May 25 '25 22:05

@aprisma2008 You are missing llama_cpp_python, a dependency that Searge LLM needs to run the models.

The link provided by Searge node can install it for Python 3.11, but the desktop app is using Python 3.12, so it's not compatible.

But you can manually install it. Check the ComfyUI desktop user guide for more information.

Open the integrated terminal and use this command to install the pre-compiled wheel (it's for Python3.12+Windows+CUDA only):

python -m pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.4-cu124/llama_cpp_python-0.3.4-cp312-cp312-win_amd64.whl

This worked for me

jimijimi5009 · Jun 14 '25 11:06

I have a 5070 running desktop ComfyUI. The command works and the custom node loads, but ComfyUI crashes immediately when using the Searge LLM node.

hugodu21 · Jun 18 '25 06:06