FBR_65

Results: 3 issues by FBR_65

It's not possible to fetch the embedding model because of a proxy/firewall. Where can the model be placed manually? Or how can the configured proxy be used?
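A sketch of two common workarounds for this situation. The proxy URL and local model path below are placeholders, not values from the issue; Hugging Face downloads go through `requests`/`huggingface_hub`, which honor the standard proxy environment variables, and txtai accepts a local directory as `path` instead of a hub id.

```python
import os

# Option 1: route model downloads through the corporate proxy.
# These are the standard environment variables honored by requests/huggingface_hub.
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:8080"  # placeholder proxy URL
os.environ["HTTP_PROXY"] = "http://proxy.example.com:8080"

# Option 2: download the model on an unrestricted machine, copy it over,
# and point txtai at the local directory instead of a hub id.
local_model = "/models/all-MiniLM-L6-v2"  # placeholder local path
os.environ["HF_HUB_OFFLINE"] = "1"  # prevent any network lookups

# Hypothetical usage (requires txtai to be installed):
# from txtai.embeddings import Embeddings
# embeddings = Embeddings(path=local_model)
```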

Hi, the error is thrown at line 61 of litellm.py: `results.append(result["choices"][0]["message"]["content"])` raises `'CustomStreamWrapper' object is not subscriptable`. The call is `from txtai.pipeline import LLM`, `MODEL_NAME = "huggingface/TheBloke/leo-hessianai-70B-chat-GPTQ"`, `llm = LLM(path=MODEL_NAME, method="litellm", api_base=api_base, stream=True)`...
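The error suggests that with `stream=True` litellm returns a stream wrapper (an iterator of incremental chunks) rather than a plain response dict, so indexing `result["choices"][0]["message"]` fails. A minimal sketch of the accumulation pattern, assuming the OpenAI-style streaming chunk shape where each chunk carries a `"delta"` instead of a full `"message"`; the fake chunks and `collect` helper are illustrative, not txtai/litellm code:

```python
# Simulated chunks shaped like OpenAI-style streaming output (assumption).
fake_stream = iter([
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": " world"}}]},
    {"choices": [{"delta": {}}]},  # a final chunk may carry an empty delta
])

def collect(stream):
    """Accumulate streamed deltas instead of subscripting the wrapper directly."""
    parts = []
    for chunk in stream:
        delta = chunk["choices"][0].get("delta", {})
        content = delta.get("content")
        if content:
            parts.append(content)
    return "".join(parts)

print(collect(fake_stream))  # -> Hello world
```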

### Describe the bug The solution offered in "how to delete or hidden the page of use gradio default footer container #6696" no longer works. ### Have you searched...
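For context, the widely shared workaround from that thread hides the footer with custom CSS passed to `gr.Blocks(css=...)`; whether it still works depends on the Gradio release, since the footer's markup can change between versions. A minimal sketch (the Gradio usage is shown commented out and is a hypothetical example, not a confirmed fix for current versions):

```python
# CSS that targets Gradio's default footer element (selector may need
# updating for newer Gradio releases, which is likely what this bug is about).
HIDE_FOOTER_CSS = "footer {visibility: hidden}"

# Hypothetical usage (requires gradio to be installed):
# import gradio as gr
# with gr.Blocks(css=HIDE_FOOTER_CSS) as demo:
#     ...
# demo.launch()

print(HIDE_FOOTER_CSS)
```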

bug