
feat: add custom openai host endpoint

AnthonyRonning opened this pull request 11 months ago

This is a first start at adding a custom openai-compatible endpoint. I have this working with my LiteLLM proxy.

[Screenshot 2025-02-22 3:38 PM] [Screenshot 2025-02-22 3:37 PM]

Some open issues I've caught so far:

  1. Custom models do work, but the "available models" list could use some work. One option here is to make this list dynamic from the /models endpoint that I believe most OpenAI-compatible APIs expose (see the sketch at the end of this comment).
[Screenshot 2025-02-22 3:42 PM]
  2. Consequently, the "browse models" link doesn't do anything.

  3. Could also add the ability to pass in the model(s) while configuring it.

  4. Could also add the ability to provide multiple custom OpenAI-compatible models, but I'd imagine that would increase the scope a lot. Could be done in a follow-up if there's enough demand.

I'm open to any opinions on how to handle these issues, or whether to postpone them for now.
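For illustration, here is a minimal sketch of the dynamic model list, assuming the endpoint follows the standard OpenAI `GET /v1/models` response shape. The function name and the `reqwest`/`serde_json` usage are just an example, not goose's actual provider code:

```rust
use reqwest::blocking::Client;
use serde_json::Value;

/// Fetch model IDs from an OpenAI-compatible endpoint (e.g. a LiteLLM proxy).
/// `host` is the custom base URL, `api_key` the bearer token for that host.
/// Needs `reqwest` with the "blocking" and "json" features, plus `serde_json`.
fn list_models(host: &str, api_key: &str) -> Result<Vec<String>, Box<dyn std::error::Error>> {
    let url = format!("{}/v1/models", host.trim_end_matches('/'));
    let body: Value = Client::new()
        .get(url)
        .bearer_auth(api_key)
        .send()?
        .error_for_status()?
        .json()?;

    // Standard response shape: {"object": "list", "data": [{"id": "...", ...}, ...]}
    let models = body["data"]
        .as_array()
        .map(|arr| {
            arr.iter()
                .filter_map(|m| m["id"].as_str().map(String::from))
                .collect()
        })
        .unwrap_or_default();
    Ok(models)
}
```

If the endpoint doesn't serve /models, the UI could simply fall back to manual model entry as it does today.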

Closes: https://github.com/block/goose/issues/885

AnthonyRonning commented Feb 22 '25

Can we directly use the OpenAI provider? You can set the host and API key on it.

yingjiehe-xyz commented Feb 24 '25

> Can we directly use the OpenAI provider? You can set the host and API key on it.

Definitely possible, and it could be a lot simpler, though some of the open questions about handling the models would remain. It's still functional without answering those questions yet; it would just require a few manual steps that might not be obvious to the user right off the bat.

AnthonyRonning commented Feb 24 '25

Hey @AnthonyRonning & @yingjiehe-xyz ! Wondering if it makes sense to generalize the card copy a bit more. Love the functionality! Something like:

H1: Custom
Body: Configure custom models using additional API settings

Immediately I was slightly confused and about to go down the path of custom settings within the OpenAI provider card, but I understand the functionality is a bit more nuanced and would need a separate card.

spencrmartin commented Mar 10 '25

Sorry for the delay on this. I kind of like the approach of just merging it with the existing OpenAI provider. Basically just put in an optional host URL and have the provider read from it whenever it's making API calls? I can incorporate some of the copy suggestions above too.
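For illustration, a minimal sketch of the optional-host idea, assuming an OPENAI_HOST override that falls back to the default OpenAI base URL (the variable name, default, and URL joining here are assumptions, not the final config keys or implementation):

```rust
use std::env;

/// Default base URL when no custom host is configured.
const DEFAULT_OPENAI_HOST: &str = "https://api.openai.com";

/// Resolve the base URL for OpenAI-compatible API calls. If the user set a
/// custom host (e.g. a LiteLLM proxy), use it; otherwise fall back to the
/// official OpenAI endpoint. The API key is handled exactly as before.
fn openai_base_url() -> String {
    env::var("OPENAI_HOST")
        .ok()
        .filter(|h| !h.trim().is_empty())
        .map(|h| h.trim_end_matches('/').to_string())
        .unwrap_or_else(|| DEFAULT_OPENAI_HOST.to_string())
}

/// Every request then joins the resolved host with the usual path.
fn chat_completions_url() -> String {
    format!("{}/v1/chat/completions", openai_base_url())
}
```

With something like this, the API key handling stays untouched; only the base URL changes when a custom host is set.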

AnthonyRonning commented Mar 11 '25

For my designer brain, @AnthonyRonning: would the API key remain the same and still be available for non-custom models?

spencrmartin commented Mar 11 '25

> For my designer brain, @AnthonyRonning: would the API key remain the same and still be available for non-custom models?

Yes, and for the custom host option as well.

AnthonyRonning commented Mar 12 '25

My gut tells me that's the right move.

Configuring the OpenAI provider key stays the same, with optional additions for custom hosts etc. within the provider card. Would you add multiple optional additions, so that they'd need to be added the same way you'd add environment variables in extensions?

spencrmartin commented Mar 12 '25

> Would you add multiple optional additions, so that they'd need to be added the same way you'd add environment variables in extensions?

That's a possibility. I'll double-check which environment variables currently apply specifically to the OpenAI provider (with a custom URL). I think for the most part nothing more is needed to cover this use case, but there could be something advanced I'm missing.

AnthonyRonning commented Mar 12 '25

Oh, it looks like this is in master already! Looks good, closing this one.

[Screenshot 2025-03-12 1:14 PM]

AnthonyRonning commented Mar 12 '25