Nikita23526

Results: 11 comments by Nikita23526

Hey, when I click on deploy nothing happens, and when asking a question it shows "failed to create task". ![Image](https://github.com/user-attachments/assets/e2e76c4b-6081-449b-a4c7-797a1c051a7c)

@ravenizzed I have used Ollama Mistral, but I was unable to work out whether I need to supply a Mistral API key. ...2025-03-19 11:27:32 wren-ai-service-1 | File "/src/globals.py", line 49, in...

@ravenizzed As Mistral is supported, I think we can use it, and yes, I am using the most up-to-date config file yet still facing the issue.

I have provided it to you ... Yesterday I connected it to MySQL with a database containing a single table and asked a few questions, and it responded. But today I made a database with...

I am using gemini-flash-2.0 and have also provided the API key in the .env file.

Can you help me set up the pipeline? I think I have followed the instructions correctly, but I still get the same issue: "Failed to create task".

![Image](https://github.com/user-attachments/assets/0d238c9a-9822-4471-adfc-fc530b4bc9cd) Why am I getting this?

```
2025-03-19 11:27:32 wren-ai-service-1 | File "/src/globals.py", line 49, in create_service_container
2025-03-19 11:27:32 wren-ai-service-1 |     **pipe_components["semantics_description"],
2025-03-19 11:27:32 wren-ai-service-1 |       ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-19 11:27:32 wren-ai-service-1 | KeyError: 'semantics_description'
2025-03-19 11:27:32 wren-ai-service-1 | ...
```
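For context, a minimal sketch of what that traceback implies (the function and dict names come from the log; the internal shape of `pipe_components` is an assumption): the service builds its container by indexing `pipe_components` with fixed pipeline names, so if `config.yaml` never defines a `semantics_description` pipeline, the dict lookup raises exactly this `KeyError`.

```python
# Hypothetical reproduction of the failure mode seen in the log above.
# pipe_components is assumed to be a dict built from the pipelines declared
# in config.yaml; the real wren-ai-service code unpacks
# **pipe_components["semantics_description"] at /src/globals.py line 49.

def create_service_container(pipe_components: dict) -> dict:
    required = ["semantics_description"]  # names the service hard-codes (assumed)
    missing = [name for name in required if name not in pipe_components]
    if missing:
        # Mirrors the KeyError: the config defined no such pipeline section.
        raise KeyError(f"config.yaml defines no pipeline(s): {missing}")
    return {name: pipe_components[name] for name in required}
```

In other words, the error points at the config file rather than the model itself: the fix is usually to make sure the pipeline list in `config.yaml` matches what the service version expects.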

@cyyeh

```yaml
# -------------------------------
# LLM Configuration (Ollama Mistral)
# -------------------------------
type: llm
provider: ollama
timeout: 120
models:
- model: mistral
  api_base: http://host.docker.internal:11434  # Use http://localhost:11434 if running outside Docker
  kwargs: ...
```