
ollama model in docker compose

Open · yongjer opened this issue 1 year ago · 5 comments

Describe the bug: Cannot find the pulled Ollama model.

To Reproduce: Steps to reproduce the behavior:

  1. docker compose up
  2. Attach a shell to the ollama container in the compose stack
  3. Run ollama pull gemma:2b (equivalent one-liners are sketched after these steps)
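
For reference, steps 2–3 can be done as one-liners; the service name ollama here is an assumption, so check docker-compose.yaml for the actual name:

docker compose up -d
docker compose exec ollama ollama pull gemma:2b   # pull inside the running ollama service
docker compose exec ollama ollama list            # confirm the model shows up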

Expected behavior: Able to choose the pulled Ollama model in the UI.

Desktop (please complete the following information):

  • OS: Ubuntu 22.04
  • Browser: not specified
  • Version: pulled from source on April 2nd

yongjer · Apr 01 '24 17:04

I am seeing the same issue: nothing shows up under Ollama after running the docker compose up command in the devika root directory.

Initially there were no models loaded in the Ollama server when I first ran docker compose up. So I manually loaded the phi model by running ollama pull phi in the ollama container, followed by docker compose restart in the devika root directory.

After this I am able to confirm the presence of the phi model in the Ollama server, as shown in the first screenshot.

Still, I am not seeing the option in the Devika UI, as shown in the second screenshot.
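
One thing worth checking at this point is whether the backend container, not just the ollama container, can see the model. A minimal sketch, assuming the service names devika-backend-engine and ollama from devika's docker-compose.yaml, and that curl exists in the backend image:

docker compose exec devika-backend-engine \
  curl -s http://ollama:11434/api/tags   # should list phi if the backend can reach the server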

Has anyone been able to run devika using docker?

cpAtor · Apr 05 '24 05:04

Same issue on Linux with DeepSeek Coder installed in Ollama via ollama run deepseek-coder.

I even tried restarting the container and running Ollama on the host machine; the models still don't show up.
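
A note on the host-machine attempt: inside a container, localhost is the container itself, so a Dockerized backend cannot reach an Ollama instance bound to the host's loopback. A sketch of what to check, using the usual defaults (not guaranteed for every setup):

# on the host: make Ollama listen beyond loopback
OLLAMA_HOST=0.0.0.0 ollama serve

# from inside the backend container, try the host alias or the bridge IP:
curl http://host.docker.internal:11434/api/tags   # Docker Desktop; on Linux this alias
                                                  # needs host-gateway in extra_hosts
curl http://172.17.0.1:11434/api/tags             # typical docker0 bridge IP on Linux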

Volko61 · Apr 07 '24 00:04

Same for me, using either Docker or Python.

ADD: In my case, no models show in the frontend.

config.toml

❯ cat config.toml
[STORAGE]
SQLITE_DB = "db/devika.db"
SCREENSHOTS_DIR = "screenshots"
PDFS_DIR = "pdfs"
PROJECTS_DIR = "projects"
LOGS_DIR = "logs"
REPOS_DIR = "repos"

[API_KEYS]
BING = "39b..."
OPENAI = "sk-..."


[API_ENDPOINTS]
BING = "https://api.bing.microsoft.com/v7.0/search"
OLLAMA = "http://myollama:11434/v1"
OPENAI = "https://myendpoint/v1"

[LOGGING]
LOG_REST_API = "true"
LOG_PROMPTS = "false"
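
One detail in this config that may matter: the /v1 suffix on the OLLAMA endpoint is Ollama's OpenAI-compatible path, while model listing normally goes through the native API at the root. A quick sanity check against the host from the config above (whether Devika strips the suffix is an assumption worth verifying):

curl http://myollama:11434/api/tags    # native API: lists pulled models
curl http://myollama:11434/v1/models   # OpenAI-compatible listing lives under /v1
# if the backend appends /api/tags to the configured base URL, the trailing
# /v1 would produce /v1/api/tags, which 404s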

logs

Python logs:

❯ python devika.py
24.04.09 23:39:18: root: INFO   : Initializing Devika...
24.04.09 23:39:18: root: INFO   : Initializing Prerequisites Jobs...
24.04.09 23:39:20: root: INFO   : Loading sentence-transformer BERT models...
No sentence-transformers model found with name sentence-transformers/all-MiniLM-L6-v2. Creating a new one with MEAN pooling.
24.04.09 23:39:28: root: INFO   : BERT model loaded successfully.
24.04.09 23:39:29: root: INFO   : Ollama available
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
        - Avoid using `tokenizers` before the fork if possible
        - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
        - Avoid using `tokenizers` before the fork if possible
        - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
24.04.09 23:39:30: root: INFO   : Devika is up and running!

Docker logs:

❯ dlog
devika-backend-engine  | 24.04.09 15:40:57: root: ERROR  : Failed to list Ollama models:
devika-backend-engine  | 24.04.09 15:41:04: root: INFO   : Booting up... This may take a few seconds
devika-backend-engine  | 24.04.09 15:41:04: root: INFO   : Initializing Devika...
devika-backend-engine  | 24.04.09 15:41:04: root: INFO   : Initializing Prerequisites Jobs...
devika-backend-engine  | 24.04.09 15:41:04: root: INFO   : Loading sentence-transformer BERT models...
devika-backend-engine  | 24.04.09 15:41:44: root: INFO   : BERT model loaded successfully.
devika-frontend-app    | $ vite dev --host
devika-frontend-app    | Forced re-optimization of dependencies
devika-frontend-app    |
devika-frontend-app    |   VITE v5.2.8  ready in 612 ms
devika-frontend-app    |
devika-backend-engine  |  * Serving Flask app 'devika'
devika-frontend-app    |   ➜  Local:   http://localhost:3000/
devika-backend-engine  |  * Debug mode: off
devika-frontend-app    |   ➜  Network: http://10.10.69.3:3000/
devika-frontend-app    | 3:38:40 PM [vite] ✨ new dependencies optimized: socket.io-client, xterm, xterm-addon-fit, tiktoken/lite
devika-frontend-app    | 3:38:40 PM [vite] ✨ optimized dependencies changed. reloading
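
The "Failed to list Ollama models:" line above has an empty reason, so the quickest next step is to replay the request by hand from inside the backend container (the service name is taken from the log prefix; adjust if yours differs):

docker compose exec devika-backend-engine sh -c \
  'curl -sS http://myollama:11434/api/tags || echo "Ollama unreachable from backend"'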

screenshot

(screenshot: the frontend's model selector shows no entries under Ollama)

RexWzh · Apr 09 '24 15:04

So the Docker setup is not fully functional yet. You can try without Docker: if Ollama is running on your system, Devika should pick it up.
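
A minimal sketch of that non-Docker path, assuming config.toml points OLLAMA at the default local daemon (http://127.0.0.1:11434):

ollama serve &            # or use the already-running Ollama system service
ollama pull gemma:2b      # pull at least one model so the UI has something to list
python devika.py          # the startup log should show "Ollama available"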

ARajgor · Apr 16 '24 03:04

I think the primary challenge is with the backend services. Connecting locally via localhost works, but going through a domain name can run into cross-origin problems. Generally, the backend services should be reachable through URIs on the same origin, for example by routing backend connections through endpoints like /api.
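
A quick way to test the cross-origin theory from the command line (the domain, port, and endpoint below are placeholders, not Devika's confirmed routes):

curl -i http://mydomain:1337/api/data -H "Origin: http://mydomain:3000"
# if the response carries no Access-Control-Allow-Origin matching the
# frontend's origin, the browser blocks the call even though curl succeeds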

RexWzh · Apr 16 '24 09:04