Mahesh Sinha
@npow After updating the code and model, I'm stuck with the error below:

Error: `Model.__init__() missing 1 required positional argument: 'ggml_model' (type=type_error)`

Code:
```
from langchain import PromptTemplate, LLMChain
from langchain.llms...
```
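For reference, here is a minimal sketch of loading a local GGML model through LangChain's `CTransformers` wrapper, assuming that wrapper is what's in play here; the model path and `model_type` are placeholders, not the original poster's values:

```python
# Hedged sketch: load a local GGML model via LangChain's CTransformers wrapper.
# The file path and model_type below are placeholders for illustration only.
from langchain import PromptTemplate, LLMChain
from langchain.llms import CTransformers

llm = CTransformers(
    model="models/llama-2-7b.ggmlv3.q4_0.bin",  # placeholder path to a local GGML file
    model_type="llama",                          # architecture hint for ctransformers
)

prompt = PromptTemplate(
    template="Question: {question}\nAnswer:",
    input_variables=["question"],
)
chain = LLMChain(prompt=prompt, llm=llm)
print(chain.run("What is Ray?"))
```

If the wrapper in use really is a custom `Model` class, the same idea applies: pass the GGML file path explicitly when constructing the LLM rather than relying on a default.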
Hello all, is there any possibility of using an inference service (e.g. a URL endpoint) instead of loading the model locally and then using it as the LLM?
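One way to do this is to wrap the remote service in a custom LLM class. A hedged sketch follows; the endpoint URL and the request/response JSON shape (`"prompt"` in, `"text"` out) are assumptions about what the service expects:

```python
# Hedged sketch: expose a remote HTTP inference service through LangChain's LLM interface.
# The endpoint URL and JSON schema are hypothetical; adjust them to your service.
from typing import Any, List, Optional

import requests
from langchain.llms.base import LLM


class RemoteInferenceLLM(LLM):
    endpoint_url: str = "http://localhost:8000/generate"  # hypothetical inference endpoint

    @property
    def _llm_type(self) -> str:
        return "remote-inference"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        # Send the prompt to the remote service and return the generated text.
        response = requests.post(self.endpoint_url, json={"prompt": prompt}, timeout=60)
        response.raise_for_status()
        return response.json()["text"]  # adjust key to your service's response schema


llm = RemoteInferenceLLM(endpoint_url="http://my-inference-host:8080/generate")
print(llm("Hello, world"))
```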
> @traverse-in-reverse any idea why I am getting the error below with the same code block you provided?
>
> Error: `NoIndexException: Index not found, please create an instance before querying`
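That error typically comes from querying a Chroma vector store before its index has been built or loaded from the persist directory. A hedged sketch of building the index first and then reloading it (the paths, loader, and embedding model are placeholders):

```python
# Hedged sketch: persist a Chroma index once, then load it before querying.
# File paths and the embedding model are placeholders.
from langchain.document_loaders import TextLoader
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import Chroma

embeddings = HuggingFaceEmbeddings()  # or whichever embedding model you already use

# 1) Build the index once and persist it to disk.
docs = CharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(
    TextLoader("data/example.txt").load()  # placeholder document
)
db = Chroma.from_documents(docs, embeddings, persist_directory="chroma_db")
db.persist()

# 2) Later (or in another process), load the persisted index before querying.
db = Chroma(persist_directory="chroma_db", embedding_function=embeddings)
print(db.similarity_search("What does the document say?", k=2))
```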
ray-llm is only for deploying and running inference with LLMs over Ray. @qingqiuhe I don't think it supports LLM training.
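For context, serving a model over Ray generally looks like the plain Ray Serve sketch below; this is not ray-llm's own API, just an illustration of the deploy/inference side, with the model loading left as a placeholder:

```python
# Hedged sketch: a minimal Ray Serve deployment (not ray-llm's API).
# Model loading and inference are placeholders.
from ray import serve


@serve.deployment
class LLMDeployment:
    def __init__(self, model_path: str):
        # Load the model here; the path is a placeholder.
        self.model_path = model_path

    async def __call__(self, request):
        payload = await request.json()
        prompt = payload["prompt"]
        # Replace this echo with real model inference.
        return {"completion": f"echo: {prompt}"}


serve.run(LLMDeployment.bind("models/placeholder.bin"))
```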
Can someone please advise? It's still the same issue.