Feat: cache_dir
Small quality-of-life change: adds a `cache_dir` argument to `from_pretrained` (see the sketch below).
It also deletes the downloaded model after `test_model` runs; this reduces clutter but may slow down frequent testing. (This can be removed or modified.)
Fixes #132
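For illustration, here is a minimal sketch of how the new argument would be used; the repo id, model type, and cache path are placeholders, not part of the change itself:

```python
from ctransformers import AutoModelForCausalLM

# With this change, the download lands in cache_dir instead of the
# default Hugging Face cache location.
llm = AutoModelForCausalLM.from_pretrained(
    "marella/gpt-2-ggml",           # placeholder repo id
    model_type="gpt2",
    cache_dir="/mnt/models/cache",  # placeholder path, e.g. on another disk
)
```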
This is much needed, as `ctransformers` doesn't seem to respect the `TRANSFORMERS_CACHE` environment variable right now.
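For reference, honoring the environment variable would amount to a fallback chain like the one below. This is only a sketch of the intended resolution order; the helper name and default path are illustrative, not existing ctransformers API:

```python
import os

def resolve_cache_dir(cache_dir=None):
    # Explicit argument wins, then the environment variable,
    # then the conventional Hugging Face default.
    return (
        cache_dir
        or os.environ.get("TRANSFORMERS_CACHE")
        or os.path.expanduser("~/.cache/huggingface/hub")
    )
```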
Note that one can already do this with the HF transformers library:

```python
from transformers import AutoModelForCausalLM

# Set the cache directory to the location where you've copied the model
cache_dir = "/path/to/default_cache_directory/"

# Load the model without local_files_only=True
llm = AutoModelForCausalLM.from_pretrained(
    "model-name",  # provide the correct model name or path
    cache_dir=cache_dir,
)
```
It would make sense to match the HF `from_pretrained` parameters (in particular `cache_dir`) so the two libraries behave the same way.
Any reason why this hasn't been merged?
As a temporary workaround, I created a symbolic link from ~/.cache to the directory where I wanted my cache to be stored (on a different disk).
```bash
mv ~/.cache /path/to/new/.cache    # move the existing cache out of the way first
ln -s /path/to/new/.cache ~/.cache
```