Atanu Sarkar


I have an LLM already deployed on my own server. Can I use my own API to get responses instead of calling openAI.llm()?
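
For example, a rough sketch of what I have in mind (the endpoint URL and request/response fields below are placeholders for my own service, not a real API):

```python
import requests

# Placeholder URL for my self-hosted LLM endpoint; the actual path and
# payload schema depend on how the model server is deployed.
LLM_ENDPOINT = "http://my-llm-server:8000/generate"

def query_my_llm(prompt: str) -> str:
    """Send a prompt to the self-hosted LLM and return its completion."""
    response = requests.post(
        LLM_ENDPOINT,
        json={"prompt": prompt, "max_tokens": 256},  # assumed request schema
        timeout=60,
    )
    response.raise_for_status()
    # Assumed response shape: {"text": "..."}
    return response.json()["text"]

if __name__ == "__main__":
    print(query_my_llm("Hello, what can you do?"))
```

Is there a supported way to plug something like this in as the backend instead of the OpenAI client?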