Ajay Chintala

10 comments by Ajay Chintala

@krrishdholakia thanks for the PR. Would you mind adding litellm as a provider? We already support LocalAI, and litellm can sit alongside it. See https://github.com/trypromptly/LLMStack/pull/11 for more.

@shubhamofbce yes, I believe this is still an issue

This should now be fixed with the latest release. Please reopen if you still see this issue.

This should now be fixed with the latest release.

Please use the provider config with the openai provider to work with OpenAI-compatible endpoints. See https://docs.trypromptly.com/providers for more.

There is now a `llmstack --quiet` option to limit terminal output, as well as `llmstack --detach` to run in detached mode.
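A minimal sketch of the two invocations named above; the flags come from this thread, and the exact behavior may vary by llmstack release:

```shell
# Start llmstack with reduced terminal output
llmstack --quiet

# Or start llmstack in detached (background) mode
llmstack --detach
```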

@googlier325 can you update your Python version and try installing again? llmstack requires Python > 3.9. You can check the version installed locally by running `python3 --version`
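For anyone hitting the same install failure, a quick sketch of the version check described above (the `> 3.9` requirement is as stated in this thread; the assertion message is illustrative):

```shell
# Print the locally installed Python version
python3 --version

# Fail early with a clear message if the interpreter is too old for llmstack
python3 -c 'import sys; assert sys.version_info > (3, 9), "Python too old, need > 3.9"'
```

If the assertion fails, upgrade Python (or point `python3` at a newer interpreter) before reinstalling llmstack.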

Thanks @HyperUpscale. We will update this in the docs. @googlier325 can you try this on Ubuntu 22.04 instead?

Can you try reinstalling llmstack? We have changed how the setup is packaged, and it should work better now. Please reopen if you continue to see issues.

Please use the openai provider config for LocalAI going forward. See https://docs.trypromptly.com/providers#provider-configuration for details on configuring an OpenAI-compatible API provider.