
Use a custom LLM

Open webdevatn opened this issue 1 year ago • 3 comments

I have an LLM ready and deployed on a server. Can I use my own APIs to get the response instead of openai.LLM()?

webdevatn avatar Sep 03 '24 14:09 webdevatn

The OpenAI LLM plugin allows you to set a custom host and HTTP session if your LLM is API-compatible with OpenAI.
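Since the endpoint only needs to be OpenAI-compatible, the essential contract is the request shape: a POST to `<base_url>/chat/completions` with a JSON body containing `model` and `messages`. A minimal stdlib sketch of building such a request follows; the base URL, model name, and API key are placeholders for your own deployment, not real values:

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str,
                       messages: list) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completions request for a custom host."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")


# Example: point at a self-hosted server (placeholder URL).
req = build_chat_request(
    "http://localhost:8000/v1",  # your server's OpenAI-compatible base URL
    "not-needed",                # many self-hosted servers ignore the key
    "my-model",
    [{"role": "user", "content": "Hello"}],
)
```

Any OpenAI-style client configured with that base URL will produce the same request, which is why swapping the host is usually all that's needed.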

keepingitneil avatar Sep 03 '24 17:09 keepingitneil

Please see minimal_assistant.py as an example.

lenage avatar Sep 06 '24 16:09 lenage

Yes you can! I had the same problem and solved it through developing my own additional LLM module, and I also passed through the language. It's not the best way, but it worked for me. (screenshot attached: SCR-20240909-aj)

+1 for livekit devs to implement this

ChrisFeldmeier avatar Sep 08 '24 20:09 ChrisFeldmeier
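The "own additional LLM module" workaround above can be sketched as a thin adapter class. The interface below is illustrative only, not the actual livekit-agents llm.LLM base class; the transport is injected so the HTTP call can be swapped out, and the language is passed through as a system message, mirroring the approach described in the comment:

```python
import json
import urllib.request
from typing import Callable, Optional


class CustomLLM:
    """Minimal adapter for a self-hosted, OpenAI-compatible LLM.

    `transport` takes (url, request_body_bytes) and returns response bytes;
    by default it POSTs JSON to `<base_url>/chat/completions`.
    """

    def __init__(self, base_url: str, model: str, language: str = "en",
                 transport: Optional[Callable[[str, bytes], bytes]] = None):
        self.base_url = base_url.rstrip("/")
        self.model = model
        self.language = language  # passed through, as in the workaround above
        self._transport = transport or self._http_post

    def _http_post(self, url: str, body: bytes) -> bytes:
        req = urllib.request.Request(
            url, data=body,
            headers={"Content-Type": "application/json"}, method="POST")
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    def chat(self, user_text: str) -> str:
        """Send one user turn and return the assistant's reply text."""
        payload = json.dumps({
            "model": self.model,
            "messages": [
                {"role": "system",
                 "content": f"Respond in language: {self.language}"},
                {"role": "user", "content": user_text},
            ],
        }).encode("utf-8")
        raw = self._transport(self.base_url + "/chat/completions", payload)
        data = json.loads(raw)
        return data["choices"][0]["message"]["content"]
```

Injecting the transport keeps the adapter testable without a live server, and the same class can later be wired into an agent wherever an LLM callable is expected.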