feat: add custom openai host endpoint
This is a first pass at adding a custom OpenAI-compatible endpoint. I have it working with my LiteLLM proxy.
Some open issues I've caught so far:

- Custom models do work, but the "available models" list could use work. One option is to make this list dynamic from the `/models` endpoint that I believe most OpenAI-compatible APIs should have.
- Consequently, the "browse models" link doesn't do anything.
- Could also add the ability to pass in the model(s) while configuring it.
- Could also add the ability to provide multiple custom OpenAI-compatible models, but I imagine that would increase the scope a lot. Could be done in a follow-up if there's enough demand.
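As a sketch of the first bullet: the "available models" list could be populated by parsing the `/models` response that OpenAI-compatible servers typically return. The response shape below is an assumption based on the standard OpenAI list format, and the sample payload is illustrative, not real data:

```python
import json

def parse_model_ids(models_response: str) -> list[str]:
    """Extract model ids from an OpenAI-compatible /models response.

    Assumes the standard list shape: {"object": "list", "data": [{"id": ...}, ...]}.
    """
    payload = json.loads(models_response)
    return [entry["id"] for entry in payload.get("data", [])]

# Sample response such as a LiteLLM proxy might return (hypothetical models):
sample = '{"object": "list", "data": [{"id": "gpt-4o"}, {"id": "claude-3-sonnet"}]}'
print(parse_model_ids(sample))  # ['gpt-4o', 'claude-3-sonnet']
```

A dynamic list like this would also give the "browse models" link something meaningful to show.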
I'm open to any opinions here for how to handle these issues or postpone for now.
Closes: https://github.com/block/goose/issues/885
Can we directly use the OpenAI provider? You can set the host and API key on it.
> Can we directly use the OpenAI provider? You can set the host and API key on it.
Definitely possible, and it could be a lot simpler. Some of the open questions about handling the models would remain, though. It's still functional without answering those questions yet; it will just require a few manual steps that might not be obvious to the user right off the bat.
Hey @AnthonyRonning & @yingjiehe-xyz ! Wondering if it makes sense to generalize the card copy a bit more. Love the functionality! Something like:
H1: Custom Body: Configure custom models using additional API settings
I was immediately slightly confused and about to go down the path of custom settings within the OpenAI provider card, but I understand the functionality is a bit more nuanced and would need a separate card.
Sorry for the delay on this. I kinda like the approach of just merging it with the existing OpenAI provider. Basically, just put in an optional host URL and have it read from that whenever it's making API calls? I can incorporate some of the copy suggestions above too.
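A minimal sketch of that idea, assuming a hypothetical `OPENAI_HOST` environment variable (the actual setting name goose would use may differ), falling back to the official endpoint when unset:

```python
import os

DEFAULT_OPENAI_HOST = "https://api.openai.com"

def resolve_openai_host() -> str:
    """Return the base URL for OpenAI API calls.

    Uses a custom host (e.g. a LiteLLM proxy) when OPENAI_HOST is set;
    otherwise falls back to the official endpoint. OPENAI_HOST is a
    hypothetical name used here for illustration.
    """
    return os.environ.get("OPENAI_HOST", DEFAULT_OPENAI_HOST).rstrip("/")

def completions_url() -> str:
    # Every API call builds its URL from the resolved host.
    return f"{resolve_openai_host()}/v1/chat/completions"

os.environ["OPENAI_HOST"] = "http://localhost:4000"
print(completions_url())  # http://localhost:4000/v1/chat/completions
```

The nice property of this design is that the API-key handling and request code stay identical for both the default and custom hosts.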
For my designer brain @AnthonyRonning, would the api key remain the same and still be available for non custom models?
> For my designer brain @AnthonyRonning, would the API key remain the same and still be available for non-custom models?
Yes and for the custom host option as well
My gut tells me that's the right move.
Configure OpenAI provider: the key is the same, with optional additions for custom hosts etc. within the provider card. Would you add multiple optional additions, such that they would need to be added the same way you'd add an environment variable in extensions?
> Would you add multiple optional additions, such that they would need to be added the same way you'd add an environment variable in extensions?
There could be potential for that. I'll double-check which environment variables are currently there that might apply explicitly to the OpenAI provider (with a custom URL). I think for the most part nothing more is needed to cover this use case, but there could be something advanced I'm missing.
Oh, looks like it's in master already! Looks good, closing this one.