prr
prr - command-line LLM prompt runner
Anthropic changed their Python SDK, making this code line outdated: https://github.com/Forward-Operators/prr/blob/3dcceb91e3b0daf93775bbdefe4f7b9a6e6fde53/prr/services/providers/anthropic/complete.py#L22 --- Would love to know if this might help: https://github.com/BerriAI/litellm ~Simple I/O library, that standardizes all the...
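A minimal sketch of what the migration might look like, assuming the provider moves from the old completions-style call to the newer Anthropic Messages API. The helper only builds the request payload (no network call); the model name and max_tokens value are illustrative, not taken from prr's code.

```python
# Build a Messages-API-shaped request body from a plain prompt string.
# Model name and max_tokens below are illustrative defaults.
def build_messages_payload(prompt: str,
                           model: str = "claude-3-haiku-20240307",
                           max_tokens: int = 1024) -> dict:
    """Convert a plain prompt into a Messages API request body."""
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

# With the current SDK the actual call would look roughly like:
#   from anthropic import Anthropic
#   client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment
#   response = client.messages.create(**build_messages_payload("Hello"))
```

Centralizing the payload construction like this would also make it easier to swap in an abstraction layer such as LiteLLM later, since only one function shapes the request.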
Implement a universal way to use inference endpoints from Hugging Face. Potential issue: different models need different prompt formats (instruct vs chat, etc).
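One way the prompt-format problem could be handled is a small dispatch on model kind before the request is sent to the endpoint. This is a hypothetical sketch: the template strings below are illustrative placeholders, not tied to any specific Hugging Face model card.

```python
# Hypothetical per-model-kind prompt formatting for HF inference endpoints.
# Instruct-style models expect a bare instruction/response scaffold, while
# chat-style models expect turn markers; templates here are illustrative.
def format_prompt(prompt: str, model_kind: str) -> str:
    if model_kind == "instruct":
        return f"### Instruction:\n{prompt}\n### Response:\n"
    if model_kind == "chat":
        return f"<|user|>\n{prompt}\n<|assistant|>\n"
    # Fall back to the raw prompt for base/completion models.
    return prompt
```

The mapping from model name to `model_kind` would still have to be maintained somewhere (config or a lookup table), which is the core of the "different prompts for different models" issue.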
Google PaLM: the current implementation uses chat-bison@001 but doesn't take our instruction into consideration.
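A possible fix, sketched under the assumption that the PaLM chat API's `context` field is the right place for a system-style instruction: pass prr's instruction there instead of dropping it. The helper only shapes the request body; field names mirror the PaLM chat request format and the function name is hypothetical.

```python
# Hedged sketch: route the prr instruction into chat-bison's `context`
# field so the model actually sees it, rather than discarding it.
def build_chat_request(instruction: str, user_message: str) -> dict:
    return {
        "context": instruction,  # system-style instruction, previously ignored
        "messages": [{"author": "user", "content": user_message}],
    }
```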