
Support model selection for custom LLM endpoint

Open yujonglee opened this issue 9 months ago • 4 comments

We should use the /models endpoint for listing and selecting models.

Any OpenAI-compatible server should support this.
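A minimal sketch of what consuming that endpoint could look like. This assumes the standard OpenAI `/v1/models` list shape (`{"object": "list", "data": [{"id": ...}, ...]}`); the sample model ids below are hypothetical, and a real implementation would fetch the JSON from the user's `api_base` instead of a hardcoded string:

```python
import json

# Hypothetical sample response in the OpenAI "/v1/models" list format.
# A real client would GET this from "{api_base}/models".
sample = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "llama3", "object": "model", "owned_by": "library"},
    {"id": "mistral", "object": "model", "owned_by": "library"}
  ]
}
""")

def list_model_ids(models_response: dict) -> list[str]:
    """Extract selectable model ids from an OpenAI-compatible /models response."""
    return [m["id"] for m in models_response.get("data", [])]

print(list_model_ids(sample))  # → ['llama3', 'mistral']
```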

yujonglee avatar Apr 24 '25 05:04 yujonglee

@sammcj please add more details or proposal if you have some!

yujonglee avatar Apr 24 '25 05:04 yujonglee

For LM Studio, specifying the model param is not required, but it can lazy-load a model if we provide one.

yujonglee avatar Apr 24 '25 05:04 yujonglee

Hey, is it possible to include a custom API key field in this setup, specifically for use cases like OpenRouter or OpenAI?

amirrezasalimi avatar Apr 25 '25 14:04 amirrezasalimi

@amirrezasalimi Currently, we only have api_base. Please open a separate issue for api_key support! (Also, provide more context on your use case.)

The initial reason for the current state is that we use local STT with Whisper, so supporting a local AI endpoint like Ollama without an API key aligns with that.

yujonglee avatar Apr 25 '25 14:04 yujonglee

Should be available from v0.0.25.

yujonglee avatar Apr 26 '25 14:04 yujonglee