How to hold / cache a SemanticCache instance?
Hi, is there a way to hold a predefined Redis cache instance somewhere? I currently define a new one every time a request comes in, which takes way too long. So there must be an approach to hold it somehow? All I can think of is pickling it and saving it, but that doesn't sound right. What are your approaches?
Hi @wired87, one of these guides might help you out:
- https://github.com/redis-developer/redis-ai-resources/blob/main/python-recipes/semantic-cache/semantic_caching_gemini.ipynb
- https://github.com/redis/redis-vl-python/blob/main/docs/user_guide/llmcache_03.ipynb
If you want to pre-populate the cache, you can do this one time with an initialization script, and the records will live on in the Redis instance. When you consume the cache, you don't need to redefine it; just connect to it and call cache.check() as you normally would.
It would also be helpful to understand the runtime environment you are in here... is this a FastAPI service? Or a notebook? This will influence how you should load the cache instance.
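If this is a long-lived service (e.g. FastAPI), one common pattern is to construct the cache a single time at startup and reuse that instance across requests, for example via a memoized factory. Here's a minimal sketch of that pattern; the `SemanticCache` construction in the comments is an assumption (it presumes redisvl is installed and a Redis instance is reachable at the given URL), so the stand-in object below just demonstrates the construct-once behavior:

```python
from functools import lru_cache


@lru_cache(maxsize=1)
def get_cache():
    """Build the expensive cache object exactly once; later calls reuse it."""
    # In a real service this is where you'd connect to Redis, e.g. (assumed API):
    #
    #   from redisvl.extensions.llmcache import SemanticCache
    #   return SemanticCache(name="llmcache", redis_url="redis://localhost:6379")
    #
    # The records themselves live in Redis, so reconnecting like this does not
    # rebuild the cache contents -- it just reattaches to the existing index.
    return object()  # stand-in for the expensive-to-build cache instance


# Every caller gets the same instance; no per-request construction cost.
a = get_cache()
b = get_cache()
assert a is b
```

In a FastAPI app you'd typically call such a factory from a startup/lifespan hook (or a dependency) so the connection is established once when the process boots, not on each request.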