
Mistral not supported

testercell opened this issue 2 years ago • 2 comments

I'm trying to use the following as the model ID and basename:

MODEL_ID = "TheBloke/Mistral-7B-Instruct-v0.1-GPTQ"
MODEL_BASENAME = "wizardLM-7B-GPTQ-4bit.compat.no-act-order.safetensors"

But when running run_localGPT.py I get the following error:

\miniconda3\Lib\site-packages\auto_gptq\modeling_utils.py", line 147, in check_and_get_model_type
    raise TypeError(f"{config.model_type} isn't supported yet.")
TypeError: mistral isn't supported yet.

Any help is super appreciated!!
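For context, the traceback shows auto_gptq rejecting the model type it reads from the model's config.json before loading anything. A minimal sketch of roughly what that check does is below; the set of supported types here is an illustrative made-up subset, not auto_gptq's real table:

```python
# Illustrative sketch of the kind of lookup auto_gptq's
# check_and_get_model_type performs: the "model_type" field from the
# model's config.json must be in a hard-coded allow-list, otherwise a
# TypeError is raised. This set is a made-up subset for illustration.
SUPPORTED_MODEL_TYPES = {"llama", "gptj", "gpt_neox", "opt"}

def check_and_get_model_type(model_type: str) -> str:
    if model_type not in SUPPORTED_MODEL_TYPES:
        raise TypeError(f"{model_type} isn't supported yet.")
    return model_type

check_and_get_model_type("llama")  # passes
try:
    check_and_get_model_type("mistral")
except TypeError as e:
    print(e)  # mistral isn't supported yet.
```

Since newer auto-gptq releases added Mistral support, upgrading auto-gptq (and possibly transformers) is the usual fix for this particular error. Note also that the MODEL_BASENAME above names a WizardLM file while MODEL_ID points at the Mistral repo, so the two settings don't match.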

testercell avatar Mar 29 '24 21:03 testercell

What is your OS? I set the following

MODEL_ID = "TheBloke/Mistral-7B-Instruct-v0.1-GPTQ"
MODEL_BASENAME = "model.safetensors"

and got this

logging.INFO("GPTQ models will NOT work on Mac devices. Please choose a different model.")
TypeError: 'int' object is not callable
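That second error is a separate bug in the quoted line itself: `logging.INFO` is the integer level constant (20), not a function, so "calling" it raises `TypeError: 'int' object is not callable`. The intended call is the lowercase `logging.info()`. A minimal sketch:

```python
import logging

# logging.INFO is the numeric level constant 20, not a callable,
# so logging.INFO("...") raises TypeError: 'int' object is not callable.
print(logging.INFO)  # 20

# The lowercase logging.info() is the actual logging function:
logging.basicConfig(level=logging.INFO)
logging.info("GPTQ models will NOT work on Mac devices. Please choose a different model.")
```

So on a Mac you hit the intended "GPTQ won't work" warning, but the miscapitalized call crashes before it can be logged.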

FinlandBreakfast avatar Apr 04 '24 04:04 FinlandBreakfast

If you are using CUDA, use a GPTQ model; if you are on a Mac, use a GGUF model. https://youtu.be/ASpageg8nPw?t=74
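That rule of thumb could be sketched as a small helper. `pick_model_format` is hypothetical (not part of localGPT), and the CUDA flag is passed in rather than detected, to keep the sketch self-contained:

```python
import platform

def pick_model_format(os_name: str, has_cuda: bool) -> str:
    """Hypothetical helper illustrating the advice above:
    GPTQ needs CUDA kernels, so Macs and CPU-only boxes use GGUF."""
    if os_name == "Darwin":   # macOS: no CUDA, GPTQ won't work
        return "GGUF"
    if has_cuda:              # NVIDIA GPU available
        return "GPTQ"
    return "GGUF"             # CPU-only fallback

# In practice you'd detect CUDA (e.g. via torch.cuda.is_available());
# here we just pass a flag explicitly.
print(pick_model_format(platform.system(), has_cuda=False))
```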

Bhavya031 avatar Apr 15 '24 19:04 Bhavya031