
CTransformers doesn't store model on right location

Open Yanni8 opened this issue 2 years ago • 2 comments

You can modify the location where the Hugging Face model should be stored (when using the transformers library) by setting the environment variable TRANSFORMERS_CACHE. The default location is the directory ~/.cache/huggingface/
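For example, with the transformers library the variable just needs to be set before the library resolves its cache (a minimal sketch; the `/data/hf-cache` path is a placeholder):

```python
import os

# Set before importing transformers, otherwise the default is used.
os.environ["TRANSFORMERS_CACHE"] = "/data/hf-cache"

# Default cache location when the variable is unset:
default_cache = os.path.expanduser("~/.cache/huggingface")
print(os.environ["TRANSFORMERS_CACHE"])
```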

If I use CTransformers with the TRANSFORMERS_CACHE environment variable set, it doesn't change the download location of the files.

It would be nice if CTransformers also used TRANSFORMERS_CACHE, or a similar environment variable, to define the download location of the LLM.

I would create a PR if this feature doesn't already exist.

Yanni8 avatar Sep 21 '23 08:09 Yanni8

Same problem here.

The Hugging Face version of transformers also supports a cache_dir parameter in several methods, like .from_pretrained.
It seems there is no way to set the model cache for ctransformers. This is a problem/show-stopper, especially if you use it inside container systems, where the main storage is not allocated to the home directories.
Any idea for a workaround?

B0rner avatar Apr 05 '24 07:04 B0rner
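One possible workaround is to bypass ctransformers' own download logic: fetch the model file into an explicit cache with huggingface_hub's hf_hub_download (which does accept cache_dir), then hand ctransformers the resulting local path. A sketch, assuming huggingface_hub and ctransformers are installed; the helper names, repo_id, and model_file values are placeholders:

```python
import os

def resolve_cache_dir(default="~/.cache/huggingface"):
    """Return the cache directory, honouring TRANSFORMERS_CACHE if set."""
    return os.path.expanduser(os.environ.get("TRANSFORMERS_CACHE", default))

def download_then_load(repo_id, model_file):
    """Hypothetical helper: download the model file into an explicit cache,
    then load it from the local path so ctransformers never downloads itself."""
    from huggingface_hub import hf_hub_download      # third-party
    from ctransformers import AutoModelForCausalLM   # third-party

    local_path = hf_hub_download(
        repo_id=repo_id,
        filename=model_file,
        cache_dir=resolve_cache_dir(),  # explicit location instead of the default
    )
    # Passing a local file path sidesteps the missing cache_dir support.
    return AutoModelForCausalLM.from_pretrained(local_path)
```

Whether from_pretrained accepts every local-path layout may depend on the ctransformers version, so treat this as a starting point rather than a guaranteed fix.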

When I pass cache_dir in the config parameter when calling CTransformers(), it throws the error 'cache_dir' is an invalid keyword argument for from_pretrained(), which is misleading. The cache_dir parameter could be accepted on the Config class here: llm.py

filipegl avatar Dec 16 '24 19:12 filipegl
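The suggestion above amounts to adding one field to the library's Config dataclass. A minimal sketch of the idea; the existing field names shown are only an illustrative subset, and the cache_dir field is the hypothetical addition proposed in this thread:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Config:
    # Illustrative subset of fields mirroring ctransformers' Config.
    context_length: int = -1
    gpu_layers: int = 0
    # Hypothetical addition: carry cache_dir so from_pretrained()
    # could forward it to the download step instead of rejecting it.
    cache_dir: Optional[str] = None
```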