Completely Locally
Will there be a way to run it completely locally, without the Claude or OpenAI API, soon?
For example, Ollama instead of OpenAI or Claude?
We can add support through LM Studio and Ollama.
Ollama is already supported.
Yeah, that would be good. I personally only use Ollama!
I haven't installed it because nowhere on the page was it mentioned that it is supported, so I had no use for it. Is there anything in the docs that mentions it?
That only gives the setup information; it doesn't say you can also use Ollama instead, so I thought it wasn't supported!:
Devika requires certain configuration settings and API keys to function properly. Update the config.toml file with the following information:
OPENAI_API_KEY: Your OpenAI API key for accessing GPT models.
CLAUDE_API_KEY: Your Anthropic API key for accessing Claude models.
BING_API_KEY: Your Bing Search API key for web searching capabilities.
DATABASE_URL: The URL for your database connection.
LOG_DIRECTORY: The directory where Devika's logs will be stored.
PROJECT_DIRECTORY: The directory where Devika's projects will be stored.
Make sure to keep your API keys secure and do not share them publicly.
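For anyone else reading this later, here is a minimal sketch of what that config.toml could look like. The key names mirror the docs text quoted above; the values and the flat, section-less layout are my assumptions, so compare against the sample config shipped in the repo before copying anything.

```toml
# Sketch of a config.toml based on the keys described above.
# Values are placeholders and the flat layout is an assumption --
# check the repo's sample config for the real structure.
OPENAI_API_KEY = "sk-..."              # OpenAI key for GPT models
CLAUDE_API_KEY = "sk-ant-..."          # Anthropic key for Claude models
BING_API_KEY = "..."                   # Bing Search key for web searching
DATABASE_URL = "sqlite:///devika.db"   # placeholder; point at your own database
LOG_DIRECTORY = "logs"                 # where Devika's logs will be stored
PROJECT_DIRECTORY = "projects"         # where Devika's projects will be stored
```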
It is supported, but the docs haven't mentioned it anywhere.
Doesn't Ollama have OpenAI compatibility?
https://ollama.com/blog/openai-compatibility
Yup, but for that you would need to edit the OpenAI API domain (the base URL) in the project!
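For context, the "edit the domain" part boils down to pointing the OpenAI client at the local Ollama server instead of api.openai.com. A rough sketch with the official openai Python package, assuming a default Ollama install on localhost:11434 and a model that has already been pulled (the model name below is just an example):

```python
# Sketch: talk to Ollama through its OpenAI-compatible endpoint.
# Assumes Ollama is running locally on its default port and that the model
# named below has already been pulled with `ollama pull`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama2",  # example model name; use whatever you have pulled locally
    messages=[{"role": "user", "content": "Hello from a fully local setup!"}],
)
print(response.choices[0].message.content)
```

So in principle the existing OpenAI code path could be reused for local models just by making that base URL configurable.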
Send a PR, it's only a few lines!
I think BING_API_KEY is not necessary because we have a built-in Python web browser (link). I am not that experienced and I don't have access to the free tier on Bing 😅, so that's why I am looking for alternatives.
LLMs can be run locally via Ollama as of now, but the browser interaction and search would be impossible to do locally; if search could be eliminated from that step, this could be run fully locally.
But search is important for most tasks, so we cannot eliminate it.
Can some of these fields be kept empty? Let's say I don't want to use Claude.