
Feat: cache_dir

Open wheynelau opened this issue 2 years ago • 3 comments

Small QOL change: adds a `cache_dir` argument to `from_pretrained`.

In addition, the model is removed after `test_model` runs. This reduces clutter but may slow down frequent testing. (This behavior can be removed or modified.)

Fixes #132
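For illustration, the argument could plausibly resolve the cache location as follows. This is a hypothetical sketch, not the PR's actual code; the helper name `resolve_cache_dir` and the fallback order (argument, then `TRANSFORMERS_CACHE`, then the HF default path) are assumptions:

```python
import os

def resolve_cache_dir(cache_dir=None):
    """Hypothetical helper: choose where downloaded models are cached.

    Assumed resolution order, mirroring HF conventions:
    explicit argument > TRANSFORMERS_CACHE env var > HF default.
    """
    if cache_dir is not None:
        return os.path.expanduser(cache_dir)
    env_dir = os.environ.get("TRANSFORMERS_CACHE")
    if env_dir:
        return os.path.expanduser(env_dir)
    return os.path.expanduser("~/.cache/huggingface/hub")
```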

wheynelau avatar Sep 13 '23 11:09 wheynelau

This is much needed, as ctransformers doesn't seem to respect the TRANSFORMERS_CACHE environment variable right now.
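For reference, this is how the variable redirects the cache for the HF transformers library; today it has no effect on ctransformers, which is the point of this issue (the path below is a placeholder):

```shell
# Placeholder path; currently ignored by ctransformers.
export TRANSFORMERS_CACHE=/mnt/big-disk/hf-cache
```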

ZeroCool2u avatar Oct 11 '23 18:10 ZeroCool2u

Noting that one can do this with the HF transformers library:

```python
from transformers import AutoModelForCausalLM

# Set the cache directory to the location where you've copied the model
cache_dir = "/path/to/default_cache_directory/"

# Load the model without local_files_only=True
llm = AutoModelForCausalLM.from_pretrained(
    "model-name",  # Provide the correct model name or path
    cache_dir=cache_dir,
)
```

It would make sense to match the HF parameter names here as well.

no-skynet avatar Oct 11 '23 22:10 no-skynet

Any reason why this hasn't been merged?

As a temporary workaround, I created a symbolic link from ~/.cache to the directory where I wanted my cache to be stored (on a different disk).

ln -s /path/to/new/.cache ~/.cache
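Note that the one-liner assumes `~/.cache` does not already exist; if it does, `ln -s` creates the link inside it rather than replacing it. A safer sequence is sketched below, using temporary directories as stand-ins for the real paths so it can be run verbatim:

```shell
NEW_CACHE=$(mktemp -d)   # stand-in for /path/to/new/.cache
FAKE_HOME=$(mktemp -d)   # stand-in for $HOME
mkdir -p "$FAKE_HOME/.cache"

# Move any existing cache contents to the new location,
# then replace the .cache directory with a symlink to it.
cp -a "$FAKE_HOME/.cache/." "$NEW_CACHE/"
rm -rf "$FAKE_HOME/.cache"
ln -s "$NEW_CACHE" "$FAKE_HOME/.cache"
```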

abaveja313 avatar Dec 05 '23 06:12 abaveja313