
pulling ollama model llama2 using http://host.docker.internal:11434 pull-model-1 panic: $HOME is not defined

Open mishagavron opened this issue 1 year ago • 10 comments

I am running genai-stack on my Mac and getting this error when I do: docker-compose up --build

pull-model-1 | pulling ollama model llama2 using http://host.docker.internal:11434
pull-model-1 | panic: $HOME is not defined
pull-model-1 |
pull-model-1 | goroutine 1 [running]:
pull-model-1 | github.com/ollama/ollama/envconfig.Models()
pull-model-1 |     github.com/ollama/ollama/envconfig/config.go:93 +0xa9
pull-model-1 | github.com/ollama/ollama/envconfig.AsMap()
pull-model-1 |     github.com/ollama/ollama/envconfig/config.go:253 +0x699
pull-model-1 | github.com/ollama/ollama/cmd.NewCLI()
pull-model-1 |     github.com/ollama/ollama/cmd/cmd.go:1329 +0xb68
pull-model-1 | main.main()
pull-model-1 |     github.com/ollama/ollama/main.go:12 +0x13
pull-model-1 | panic: $HOME is not defined
pull-model-1 |
pull-model-1 | goroutine 1 [running]:
pull-model-1 | github.com/ollama/ollama/envconfig.Models()
pull-model-1 |     github.com/ollama/ollama/envconfig/config.go:93 +0xa9
pull-model-1 | github.com/ollama/ollama/envconfig.AsMap()
pull-model-1 |     github.com/ollama/ollama/envconfig/config.go:253 +0x699
pull-model-1 | github.com/ollama/ollama/cmd.NewCLI()
pull-model-1 |     github.com/ollama/ollama/cmd/cmd.go:1329 +0xb68
pull-model-1 | main.main()
pull-model-1 |     github.com/ollama/ollama/main.go:12 +0x13
pull-model-1 exited with code 1

mishagavron avatar Aug 15 '24 12:08 mishagavron

Same here

aubrey-ford-nutrien avatar Aug 21 '24 23:08 aubrey-ford-nutrien

I removed the inline comments in my .env and it works now

(screenshot of the edited .env attached)
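For illustration (the variable below is just an example): depending on how the .env file is parsed, a trailing comment on a value line can end up included in the value, while comments on their own lines are harmless.

    # can break: the trailing comment may become part of the value
    OLLAMA_BASE_URL=http://host.docker.internal:11434  # local Ollama

    # safe: keep the comment on its own line
    # local Ollama
    OLLAMA_BASE_URL=http://host.docker.internal:11434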

aubrey-ford-nutrien avatar Aug 21 '24 23:08 aubrey-ford-nutrien

Same here, but removing the inline comments from my .env didn't fix it.

gingaramo avatar Aug 22 '24 06:08 gingaramo

Followed https://github.com/docker/genai-stack/issues/175; the fix was to add

(process/shell {:env {"OLLAMA_HOST" url
                      "HOME" (System/getProperty "user.home")}
                :out :inherit :err :inherit}
               (format "bash -c './bin/ollama show %s --modelfile > /dev/null || ./bin/ollama pull %s'" llm llm))

in pull_model.Dockerfile, then run two commands:

docker compose down

docker compose up --build
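For context, that line lives in the small babashka (Clojure) script embedded in pull_model.Dockerfile. The stock call (paraphrased sketch, not the verbatim file) only passes OLLAMA_HOST through, so the ollama client bundled into the pull-model container sees no $HOME and envconfig.Models() panics:

    ;; sketch of the original call - no HOME in the :env map
    (process/shell {:env {"OLLAMA_HOST" url} :out :inherit :err :inherit}
                   (format "bash -c './bin/ollama show %s --modelfile > /dev/null || ./bin/ollama pull %s'" llm llm))

Adding "HOME" (System/getProperty "user.home") to that :env map, as in the snippet above, gives the binary a home directory from which to resolve its config and model paths.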

gingaramo avatar Aug 23 '24 14:08 gingaramo

same issue

flyboyer avatar Aug 28 '24 07:08 flyboyer

same issue here.

popalex avatar Sep 22 '24 17:09 popalex

Followed #175; the fix was to add

(process/shell {:env {"OLLAMA_HOST" url
                      "HOME" (System/getProperty "user.home")}
                :out :inherit :err :inherit}
               (format "bash -c './bin/ollama show %s --modelfile > /dev/null || ./bin/ollama pull %s'" llm llm))

in pull_model.Dockerfile, then run two commands:

docker compose down

docker compose up --build

Same issue here. It didn't resolve after changing pull_model.Dockerfile. My .env just includes OLLAMA_BASE_URL=http://llm:11434

Is there any update?

hamedmiir avatar Oct 14 '24 09:10 hamedmiir

I got it running on Ubuntu by:

  1. updating OLLAMA_BASE_URL to the correct container name

Run docker ps and note the NAME, then set it in the .env file: OLLAMA_BASE_URL=http://<container name>:11434 (e.g. OLLAMA_BASE_URL=http://genai-stack-llm-1:11434)

  2. explicitly setting HOME in pull_model.Dockerfile

#syntax = docker/dockerfile:1.4

# Stage 1: Use ollama as the base to get the ollama binary
FROM ollama/ollama:latest AS ollama

# Stage 2: Use babashka as the main image
FROM babashka/babashka:latest

# Set the working directory to /root
WORKDIR /root

# Set the HOME environment variable explicitly
ENV HOME /root

[ continue with existing file ... ]
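If the Dockerfile change doesn't seem to take effect, forcing a no-cache rebuild of just that service may help (pull-model is the service name, judging by the pull-model-1 prefix in the log above):

    docker compose build --no-cache pull-model
    docker compose up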

a8mark avatar Oct 25 '24 18:10 a8mark

should be fixed here: https://github.com/docker/genai-stack/pull/184

tomasonjo avatar Oct 30 '24 15:10 tomasonjo

I am getting the above error when installing ollama on an EC2 instance.

  • Setting OLLAMA_HOST and OLLAMA_MODELS via a systemd drop-in
  • Installing Ollama using the provided install script
  • Restarting the Ollama service, which comes up just fine
  • Changing the directory permissions
  • And finally downloading the model, which fails with the above error.
        // AWS CDK snippet (InitConfig / InitCommand / InitService from aws-cdk-lib/aws-ec2)
        const ollamaMountPath = `/mnt/efs/ollama`;
        const environmentConfig = new InitConfig([
            InitCommand.shellCommand(`echo 'Setting environment variables for Ollama service'`),
            InitCommand.shellCommand(`mkdir -p ${ollamaMountPath}/models`),
            InitCommand.shellCommand(`sudo mkdir -p /etc/systemd/system/ollama.service.d`),
            InitCommand.shellCommand(`echo '[Service]' | sudo tee -a /etc/systemd/system/ollama.service.d/env.conf`),
            InitCommand.shellCommand(`echo 'Environment="OLLAMA_HOST=0.0.0.0"' | sudo tee -a /etc/systemd/system/ollama.service.d/env.conf`),
            InitCommand.shellCommand(`echo 'Environment="OLLAMA_MODELS=${ollamaMountPath}/models"' | sudo tee -a /etc/systemd/system/ollama.service.d/env.conf`),
        ]);

        const ollamaInstallationConfig = new InitConfig([
            InitCommand.shellCommand(`echo 'Starting Ollama Installation'`),
            InitCommand.shellCommand(`sudo curl -fsSL https://ollama.com/install.sh | sh > /tmp/ollama-install.log 2>&1`),
            InitCommand.shellCommand(`chown -R ollama:ollama ${ollamaMountPath}`),
            InitCommand.shellCommand(`chmod 775 ${ollamaMountPath}`),
        ]);

        const ollamaServiceConfig = new InitConfig([
            InitCommand.shellCommand(`echo 'Starting Ollama Service'`),
            InitCommand.shellCommand(`systemctl daemon-reload`),
            InitCommand.shellCommand(`systemctl restart ollama`),
            InitService.enable('ollama', {
                enabled: true,
                ensureRunning: true
            }),
        ]);

        const modelToDownload = props.ollamaModel.model;
        const modelDownloadConfig = new InitConfig([
            InitCommand.shellCommand(`echo 'Downloading model ${modelToDownload}'`),
            InitCommand.shellCommand(`ollama pull ${modelToDownload}`),
        ]);
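Since this is the same missing-$HOME panic, a possible tweak (untested; it just mirrors the HOME workaround used earlier in this thread) is to give the pull command an explicit HOME, because cfn-init commands don't necessarily run with a full login environment:

            // hypothetical: set HOME explicitly for the ollama CLI during the pull
            InitCommand.shellCommand(`HOME=/root ollama pull ${modelToDownload}`),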

rhlarora84 avatar Nov 07 '24 17:11 rhlarora84