CTransformers doesn't store the model in the right location
You can change where Hugging Face models are stored (when using the transformers library) by setting the environment variable `TRANSFORMERS_CACHE`. The default location is the directory `~/.cache/huggingface/`.
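For illustration, a minimal sketch of that approach; `/data/hf-cache` is an example path, and the variable generally needs to be set before transformers is first imported, since the library reads it when it loads:

```python
import os

# Set the cache location before importing transformers; the library
# typically reads TRANSFORMERS_CACHE when it is first imported.
os.environ["TRANSFORMERS_CACHE"] = "/data/hf-cache"  # example path

# import transformers  # would now cache downloads under /data/hf-cache
```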
If I use CTransformers with the `TRANSFORMERS_CACHE` environment variable set, won't it change the download location of the files?
It would be nice if CTransformers also used `TRANSFORMERS_CACHE` or a similar environment variable to define the download location of the LLM.
I would create a PR if this feature doesn't already exist.
Same problem here.
The Hugging Face version of transformers also supports a `cache_dir` parameter in several methods, like `.from_pretrained`.
It seems there is no way to set the model cache for ctransformers. This is a show-stopper, especially when running inside container systems where the main storage is not allocated to the home directories.
Any idea for a workaround?
When I pass `cache_dir` in the config parameter when calling `CTransformers()`, it throws the error:

`'cache_dir' is an invalid keyword argument for from_pretrained()`

which is misleading, since `from_pretrained` does accept that argument in the Hugging Face transformers library.
The `cache_dir` parameter could be accepted by the `Config` class here: llm.py
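A hypothetical sketch of what that change might look like (not the actual ctransformers code; field names other than `cache_dir` are illustrative): extend the `Config` dataclass with a `cache_dir` field that falls back to `TRANSFORMERS_CACHE` and then to the usual Hugging Face default.

```python
import os
from dataclasses import dataclass, field


def default_cache_dir() -> str:
    """Fall back to TRANSFORMERS_CACHE, then to ~/.cache/huggingface."""
    return os.environ.get(
        "TRANSFORMERS_CACHE",
        os.path.join(os.path.expanduser("~"), ".cache", "huggingface"),
    )


@dataclass
class Config:
    # ...the existing generation fields (top_k, temperature, ...) would stay...
    cache_dir: str = field(default_factory=default_cache_dir)
```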