shanurrahman
@WKaiH123 please share server logs here
Run `docker compose logs -f llm-server` from the root directory. You can do the same for the other containers too.
OK, two things to verify: 1. Is your OpenAI API key correct? 2. Is OpenAI available in the region where your server is hosted?
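You can check both points with a raw request to the OpenAI models endpoint (e.g. `curl -H "Authorization: Bearer $OPENAI_API_KEY" https://api.openai.com/v1/models`) and then read the status code. A minimal sketch of interpreting the result, assuming the usual OpenAI status-code semantics (the helper name is illustrative, not part of OpenCopilot):

```python
def diagnose_openai_status(status_code: int) -> str:
    """Map an HTTP status from GET https://api.openai.com/v1/models
    to a likely cause. Illustrative helper, not OpenCopilot code."""
    if status_code == 200:
        return "key ok"
    if status_code == 401:
        # Authentication failed: the API key is missing or wrong
        return "invalid API key"
    if status_code == 403:
        # OpenAI commonly rejects unsupported regions this way
        return "region not supported"
    return f"unexpected status {status_code}"
```

If you get 401, regenerate the key in the OpenAI dashboard; if 403, check whether your server's region is on OpenAI's supported list.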
How can I provide code examples in the chat box of OpenCopilot; what format does it accept as input?
The issue at hand relates to the context limit, currently set to 255 characters. This limit can be adjusted through the dashboard folder and the llm-server `/chat/send` endpoint. Currently,...
Also try restarting nginx to see if that resolves the issue.
@ah7255703 is this PR required, or should we close it?
The error originates here, and it only happens when I try to run the app with Gunicorn and Eventlet.
I want to make sure I give you the right information. Could you explain a bit more about what you need help with? Some additional details would help me better...
@zhongchongan We can use a different model to get around this, but I need more details on how to reproduce it.
Adding the correct credentials to `.env.docker` will work. It is compatible with Docker too. You will find two Docker files, one for the Mac environment and the other for Linux. Let me...
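A minimal sketch of what the relevant `.env.docker` entries might look like, assuming the standard OpenAI variable name (`OPENAI_API_KEY`); check the repository's example env file for the exact keys it expects:

```dotenv
# Illustrative values only -- replace with your real credentials.
OPENAI_API_KEY=sk-...your-key-here...
```

After editing the file, rebuild/restart the containers (e.g. `docker compose up -d --build`) so the new environment variables are picked up.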