How do I set up opencode with LiteLLM?
Question
How do I set up opencode with LiteLLM?
This issue might be a duplicate of existing issues. Please check:
- #665: Improve documentation for self-hosted LLMs (specifically mentions documentation unclear for LiteLLM Proxy configuration)
Feel free to ignore if none of these address your specific case.
@harshpreet931 I've got a couple of working setups with LiteLLM, as well as Open WebUI (I typically prefer to expose Open WebUI with LiteLLM as a backend router). This was a bit tricky, as it required explicit config settings on both ends.
Could you share which provider(s) you intend to use? I may be able to share working examples.
Also, could you share any non-sensitive parts of your LiteLLM config? There were some keys I needed to add, e.g. for GitHub Copilot and AWS Bedrock.
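For reference, here's a minimal sketch of the kind of provider-specific keys I mean, in a LiteLLM proxy `config.yaml`. The model names, hosts, and region below are placeholders, and which extra fields you actually need depends on the provider:

```yaml
# Sketch of a LiteLLM proxy config.yaml -- names/hosts are placeholders
model_list:
  # A plain OpenAI-compatible backend usually only needs model/api_base/api_key.
  - model_name: my-local-model
    litellm_params:
      model: openai/my-local-model        # "openai/" prefix = OpenAI-compatible endpoint
      api_base: http://localhost:8000/v1  # your backend's base URL
      api_key: dummy                      # some local servers ignore this, but LiteLLM expects a value

  # Bedrock was one case where I had to add extra AWS keys.
  - model_name: claude-bedrock
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0  # example model ID
      aws_region_name: us-east-1          # the kind of extra key I meant above
```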
So for opencode to work, I would need to change the config on the LiteLLM side as well? Why is that the case? Is it not possible to do it without that?
My understanding is that changes would be necessary on a per-model basis, so you might not need to change anything on the LiteLLM side as long as the models are configured properly. My apologies if that was misleading; I set up opencode and Open WebUI around the same time, so I likely mixed up what was needed for each. A sketch of the opencode side is below.
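Here's a rough sketch of an `opencode.json` that points a custom provider at a LiteLLM proxy. This assumes LiteLLM is listening on `http://localhost:4000` and exposes a model named `my-local-model`; the provider id, model name, and key are placeholders, and the field names follow opencode's custom-provider config as I understand it, so double-check against the current schema:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "litellm": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LiteLLM",
      "options": {
        "baseURL": "http://localhost:4000/v1",
        "apiKey": "sk-your-litellm-key"
      },
      "models": {
        "my-local-model": {}
      }
    }
  }
}
```

The model name under `models` has to match a `model_name` entry in your LiteLLM `model_list`, which is why I said the per-model wiring matters more than proxy-wide changes.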
I have a self-hosted GLM 4.7 and I'm using it through LiteLLM. How do I connect to that?
I've not self-hosted this model. Do you have it working with other frontends through LiteLLM already? If so, could you share the relevant parts of your config?
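In the meantime, here's a rough sketch of how I'd expect the LiteLLM entry to look for a self-hosted model, assuming your GLM server exposes an OpenAI-compatible endpoint (e.g., via vLLM or a similar inference server); the host, port, and model name are placeholders:

```yaml
# Hypothetical LiteLLM entry for a self-hosted GLM behind an OpenAI-compatible server
model_list:
  - model_name: glm-4.7                    # the name opencode will request
    litellm_params:
      model: openai/glm-4.7                # route via LiteLLM's OpenAI-compatible handler
      api_base: http://your-glm-host:8000/v1
      api_key: dummy                       # placeholder; many local servers ignore it
```

With that in place, the opencode config sketched earlier would list `glm-4.7` under `models` and point `baseURL` at the LiteLLM proxy.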