
Support Prompt Caching for Anthropic Provider

Open · Tafkas opened this issue 1 year ago · 1 comment

Describe the need of your request

Prompt Caching is a powerful feature that optimizes API usage by allowing requests to resume from specific prefixes in your prompts. This approach significantly reduces processing time and cost for repetitive tasks or prompts with consistent elements.

While cache write tokens are 25% more expensive than base input tokens, cache read tokens are 90% cheaper than base input tokens.

Proposed solution

Allow optional prompt caching for models from the Anthropic provider, and possibly also for the Anthropic models served through the CodeGPT provider.

Additional context

Source: https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching
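For context, a rough sketch of how caching is expressed at the API level, based on the docs linked above: a system or content block is tagged with `cache_control`, so subsequent requests that reuse the same prefix are billed at the cheaper cache-read rate. The model name, prompt text, and helper function here are placeholders, not CodeGPT code.

```python
def build_cached_request(system_prompt: str, user_message: str) -> dict:
    """Build a Messages API payload with a cacheable system prompt (sketch)."""
    return {
        "model": "claude-3-5-sonnet-20240620",  # placeholder model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_prompt,
                # Marks this prefix as cacheable; later requests reusing the
                # same prefix read from the cache instead of reprocessing it.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_cached_request("You are a coding assistant...", "Explain this diff.")
print(payload["system"][0]["cache_control"]["type"])  # -> ephemeral
```

A plugin setting could simply toggle whether the `cache_control` field is attached to the (typically large and stable) system prompt.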

Tafkas avatar Aug 21 '24 09:08 Tafkas

It would be essential to have this feature, since we deal with a large number of input tokens.

cantalupo555 avatar Aug 25 '24 14:08 cantalupo555