
Error 'NoneType' object is not callable when loading models with LLamaCpp

Open Maro1F431 opened this issue 2 years ago • 7 comments

The bug: When I try to load a model with the LlamaCpp loader, I get the following error:

Exception ignored in: <function _LlamaContext.__del__ at 0x14f643600>
Traceback (most recent call last):
  File "/Users/mathieu.tammaro/Work/Perso/AIssistant/.env/lib/python3.11/site-packages/llama_cpp/llama.py", line 422, in __del__
TypeError: 'NoneType' object is not callable
Exception ignored in: <function _LlamaModel.__del__ at 0x14f6425c0>
Traceback (most recent call last):
  File "/Users/mathieu.tammaro/Work/Perso/AIssistant/.env/lib/python3.11/site-packages/llama_cpp/llama.py", line 240, in __del__
TypeError: 'NoneType' object is not callable
Exception ignored in: <function _LlamaBatch.__del__ at 0x14f64cae0>
Traceback (most recent call last):
  File "/Users/mathieu.tammaro/Work/Perso/AIssistant/.env/lib/python3.11/site-packages/llama_cpp/llama.py", line 670, in __del__
TypeError: 'NoneType' object is not callable

To Reproduce Give a full working code snippet that can be pasted into a notebook cell or python file. Make sure to include the LLM load step so we know which model you are using.

from guidance import models, gen

# I have the error with chat and non chat models
llama_2 = models.LlamaCppChat('/models/llama-2-7b-chat.Q6_K.gguf', n_gpu_layers=-1)

llama_2 + "The smallest cats are" + gen(stop=".")

System info (please complete the following information):

  • OS: macOS
  • Guidance Version (guidance.__version__): 0.1.2

Maro1F431 avatar Nov 20 '23 00:11 Maro1F431

Seems to be linked to this llama-cpp-python issue: https://github.com/abetlen/llama-cpp-python/issues/891
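For context, `'NoneType' object is not callable` raised inside `__del__` is a classic interpreter-shutdown symptom: Python may clear module-level globals (such as the bindings that `llama_cpp`'s destructors call to free native handles) to `None` before object finalizers run, which the interpreter then reports as "Exception ignored in: ...__del__". A minimal, self-contained sketch of the failure mode (all names here are illustrative, not the actual `llama_cpp` code):

```python
# During interpreter shutdown, Python may set module globals to None
# before finalizers run. A __del__ body that calls a module-level free
# function then raises "'NoneType' object is not callable".

def simulate_del(free_fn, handle):
    """Mimics a __del__ body that frees a native handle via a module global."""
    try:
        free_fn(handle)          # free_fn is None once the module is torn down
        return None
    except TypeError as exc:     # the exact error seen in the tracebacks above
        return str(exc)

# Normal operation: the free function is still bound.
assert simulate_del(lambda h: None, object()) is None

# Shutdown scenario: the module global has already been cleared to None.
assert simulate_del(None, object()) == "'NoneType' object is not callable"
```

The errors are therefore harmless noise at exit (the process is already shutting down), but they do clutter the output.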

Maro1F431 avatar Nov 20 '23 00:11 Maro1F431

I get the same error on Windows and WSL/Ubuntu.

Banbury avatar Nov 20 '23 18:11 Banbury

I get the same error on macOS with guidance 0.1.2.

ndavidson19 avatar Nov 22 '23 18:11 ndavidson19

This should be fixed by the next release of llama-cpp-python per https://github.com/abetlen/llama-cpp-python/pull/952
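The standard mitigation for this class of bug is to bind the cleanup function to the instance at construction time, so the reference survives module teardown and `__del__` never has to look up a possibly-cleared global. A hedged sketch of that pattern (illustrative names only; this is the general technique, not the actual code from that PR):

```python
def _free_handle(handle):
    """Stand-in for a native free function (e.g. something like llama_free)."""
    handle["freed"] = True

class SafeResource:
    def __init__(self, handle):
        self.handle = handle
        # Capture the function now: self._free stays valid even if the
        # module-level name is cleared to None during interpreter shutdown.
        self._free = _free_handle

    def __del__(self):
        # Guard against a partially constructed object and double frees.
        free = getattr(self, "_free", None)
        if free is not None and self.handle is not None:
            free(self.handle)
            self.handle = None

# Usage: the handle is freed when the object is destroyed.
h = {"freed": False}
res = SafeResource(h)
del res
assert h["freed"] is True
```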

kddubey avatar Nov 29 '23 18:11 kddubey

It works for me now with llama-cpp-python 0.2.22.

Banbury avatar Dec 14 '23 12:12 Banbury

I have a similar issue on macOS with llama_cpp_python 0.2.89.

renewooller avatar Aug 23 '24 04:08 renewooller

Same here

PierreCarceller avatar Sep 04 '24 07:09 PierreCarceller