pulling ollama model llama2 using http://host.docker.internal:11434 pull-model-1 panic: $HOME is not defined
I am running genai-stack on my Mac and getting this error when I do: docker-compose up --build
pull-model-1  | pulling ollama model llama2 using http://host.docker.internal:11434
pull-model-1  | panic: $HOME is not defined
pull-model-1  |
pull-model-1  | goroutine 1 [running]:
pull-model-1  | github.com/ollama/ollama/envconfig.Models()
pull-model-1  |     github.com/ollama/ollama/envconfig/config.go:93 +0xa9
pull-model-1  | github.com/ollama/ollama/envconfig.AsMap()
pull-model-1  |     github.com/ollama/ollama/envconfig/config.go:253 +0x699
pull-model-1  | github.com/ollama/ollama/cmd.NewCLI()
pull-model-1  |     github.com/ollama/ollama/cmd/cmd.go:1329 +0xb68
pull-model-1  | main.main()
pull-model-1  |     github.com/ollama/ollama/main.go:12 +0x13
pull-model-1 exited with code 1
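Until this is fixed upstream, one possible workaround is to set HOME for the service yourself via a compose override; a sketch (the service name pull-model is assumed from the log prefix above, and /root is an arbitrary choice of home directory):

```yaml
# docker-compose.override.yml (sketch; service name assumed from the log prefix)
services:
  pull-model:
    environment:
      - HOME=/root
```

Then re-run docker compose up --build.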
Same here
I removed the inline comments in my .env and it works now
Same, but removing the inline comments from the non-comment lines in .env didn't fix it for me.
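For context on why inline comments can break things: some .env parsers keep everything after the first `=` as the value, so an inline comment travels into the variable, which is consistent with the report above (exact behavior depends on the compose version). A quick sketch, with illustrative file names:

```shell
# An inline comment becomes part of the value when the parser takes
# everything after the first '=' verbatim:
printf 'LLM=llama2 # pick your model\n' > /tmp/with-comment.env
printf 'LLM=llama2\n' > /tmp/clean.env

cut -d= -f2- /tmp/with-comment.env   # -> "llama2 # pick your model"
cut -d= -f2- /tmp/clean.env          # -> "llama2"
```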
Followed https://github.com/docker/genai-stack/issues/175; the fix was to add

(process/shell {:env {"OLLAMA_HOST" url
                      "HOME" (System/getProperty "user.home")}
                :out :inherit :err :inherit}
               (format "bash -c './bin/ollama show %s --modelfile > /dev/null || ./bin/ollama pull %s'" llm llm))

in pull_model.Dockerfile, then run two commands:
docker compose down
docker compose up --build
same issue
same issue here.
Followed #175; the fix was to add

(process/shell {:env {"OLLAMA_HOST" url
                      "HOME" (System/getProperty "user.home")}
                :out :inherit :err :inherit}
               (format "bash -c './bin/ollama show %s --modelfile > /dev/null || ./bin/ollama pull %s'" llm llm))

in pull_model.Dockerfile, then run two commands:
docker compose down
docker compose up --build
Same issue here. Changing pull_model.Dockerfile didn't resolve it. My .env just includes OLLAMA_BASE_URL=http://llm:11434
Is there any update?
I got it running on Ubuntu by:
- updating OLLAMA_BASE_URL to the correct container name: run docker ps, note the NAME, and set it in the .env file as OLLAMA_BASE_URL=http://<container name>:11434 (e.g. OLLAMA_BASE_URL=http://genai-stack-llm-1:11434)
- explicitly setting HOME in pull_model.Dockerfile
# syntax=docker/dockerfile:1.4
# Stage 1: Use ollama as the base to get the ollama binary
FROM ollama/ollama:latest AS ollama
# Stage 2: Use babashka as the main image
FROM babashka/babashka:latest
# Set the working directory to /root
WORKDIR /root
# Set the HOME environment variable explicitly
ENV HOME=/root
[ continue with existing file ... ]
This should be fixed by https://github.com/docker/genai-stack/pull/184
I am getting the above error when installing Ollama on an EC2 instance. The steps are:
- setting OLLAMA_HOST and OLLAMA_MODELS
- installing Ollama using the provided script
- restarting the Ollama service, which comes up just fine
- changing the directory permissions
- and finally downloading the model, which fails with the above error.
import { InitCommand, InitConfig, InitService } from 'aws-cdk-lib/aws-ec2';

const ollamaMountPath = `/mnt/efs/ollama`;
const environmentConfig = new InitConfig([
InitCommand.shellCommand(`echo 'Setting environment variables for Ollama service'`),
InitCommand.shellCommand(`mkdir -p ${ollamaMountPath}/models`),
InitCommand.shellCommand(`sudo mkdir -p /etc/systemd/system/ollama.service.d`),
InitCommand.shellCommand(`echo '[Service]' | sudo tee -a /etc/systemd/system/ollama.service.d/env.conf`),
InitCommand.shellCommand(`echo 'Environment="OLLAMA_HOST=0.0.0.0"' | sudo tee -a /etc/systemd/system/ollama.service.d/env.conf`),
InitCommand.shellCommand(`echo 'Environment="OLLAMA_MODELS=${ollamaMountPath}/models"' | sudo tee -a /etc/systemd/system/ollama.service.d/env.conf`),
]);
const ollamaInstallationConfig = new InitConfig([
InitCommand.shellCommand(`echo 'Starting Ollama Installation'`),
InitCommand.shellCommand(`sudo curl -fsSL https://ollama.com/install.sh | sh > /tmp/ollama-install.log 2>&1`),
InitCommand.shellCommand(`chown -R ollama:ollama ${ollamaMountPath}`),
InitCommand.shellCommand(`chmod 775 ${ollamaMountPath}`),
]);
const ollamaServiceConfig = new InitConfig([
InitCommand.shellCommand(`echo 'Starting Ollama Service'`),
InitCommand.shellCommand(`systemctl daemon-reload`),
InitCommand.shellCommand(`systemctl restart ollama`),
InitService.enable('ollama', {
enabled: true,
ensureRunning: true
}),
]);
const modelToDownload = props.ollamaModel.model;
const modelDownloadConfig = new InitConfig([
InitCommand.shellCommand(`echo 'Downloading model ${modelToDownload}'`),
InitCommand.shellCommand(`ollama pull ${modelToDownload}`),
]);
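A likely cause here: cfn-init runs these commands without a login shell, so HOME is never set when `ollama pull` executes, and ollama's envconfig panics exactly as in the trace above. A minimal sketch of the failure mode, plus a hypothetical fix (the /root value is an assumption):

```shell
# Reproduce the failure mode: with HOME unset, the child process sees nothing.
env -u HOME sh -c 'echo "${HOME:-unset}"'    # prints "unset" - what ollama sees under cfn-init

# Hypothetical fix: set HOME explicitly on the pull command, e.g. change the
# modelDownloadConfig line to:
#   InitCommand.shellCommand(`HOME=/root ollama pull ${modelToDownload}`)
HOME=/root sh -c 'echo "$HOME"'              # prints "/root"
```

Alternatively, adding a line like Environment="HOME=/root" to the same env.conf systemd drop-in would cover the service itself, though the pull command above runs outside the unit and still needs HOME in its own environment.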