
Feature: Support and/or documentation on rootless approach local-ai docker

DanielBoettner opened this issue 1 year ago

Is your feature request related to a problem? Please describe.

"The problem" I encounter is that generated files, such as images, are created by root. When mounting the output directory to the host, the files are therefore owned by root. This can be fixed manually with a simple "sudo chown".
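For reference, the manual fix looks roughly like this (the ./output path is just an example; adjust it to wherever the container volume is mounted):

```shell
# Reclaim ownership of root-owned files that the container wrote
# into the bind-mounted output directory.
sudo chown -R "$(id -u):$(id -g)" ./output
```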

I played around with "userns-remap" but have had no real success so far. I also tried writing a Dockerfile that uses "localai/localai:latest-aio-gpu-nvidia-cuda-12" (in my case) as a base image and adds another user, but I ran into permission issues.

I'm not familiar enough with LocalAI to estimate the effort needed to switch this to a non-root user that could be mapped to the host user.

Describe the solution you'd like

A common approach is to allow mapping USER_ID and GROUP_ID (typically 1000 and 1000) to the user that runs the Python scripts.
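One way to get such a mapping without modifying the image is Docker's --user flag. A sketch (untested against the LocalAI image, which may still assume it runs as root internally; paths and port are examples):

```shell
# Run the container with the host user's UID:GID so files written to
# the bind mount are owned by the invoking user instead of root.
docker run --rm \
  --user "$(id -u):$(id -g)" \
  -v "$PWD/models:/build/models" \
  -p 8080:8080 \
  localai/localai:latest-aio-gpu-nvidia-cuda-12
```

If the image writes to locations owned by root inside the container, this will fail with permission errors, which is exactly the problem described above; a proper fix would need the image itself to support a configurable runtime user.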

Describe alternatives you've considered

No other current idea on how to solve this.

Additional context

Native Ubuntu 24.04 and a WSL Ubuntu 24.04 (on Win11)

DanielBoettner avatar Sep 27 '24 08:09 DanielBoettner

I have this working with rootless podman on Linux Mint 22. The following systemd container file is how I start/stop the service, using systemctl --user start localai as the unprivileged user.

~/.config/containers/systemd/localai.container:

[Unit]
Description=localai container
After=local-fs.target network-online.target

[Container]
AddDevice=nvidia.com/gpu=all
AutoUpdate=registry
Environment=DEBUG=true
Environment=LOCALAI_SINGLE_ACTIVE_BACKEND=true
Environment=LOCALAI_F16=true
Environment=LOCALAI_THREADS=16
Image=quay.io/go-skynet/local-ai:v2.20.1-cublas-cuda12-ffmpeg
Label=app=localai
PublishPort=5000:8080
# not needed for the localai container
# everything is done as root inside the container which is the default mapping
#UserNS=keep-id
Volume=/ai/localai/models:/build/models:Z

[Service]
TimeoutStartSec=900

[Install]
WantedBy=multi-user.target default.target
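For completeness, assuming the file above is saved at ~/.config/containers/systemd/localai.container, the usual quadlet workflow is a sketch like this (podman generates the systemd unit from the .container file on daemon-reload):

```shell
# Regenerate user units from quadlet files, then manage the service
systemctl --user daemon-reload
systemctl --user start localai
systemctl --user status localai
journalctl --user -u localai -f   # follow the container logs
```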

michaelwhitford avatar Oct 11 '24 18:10 michaelwhitford

It would be great if the quickstart script were runtime agnostic out of the box (so that any Docker-compatible OCI container software would work).

Current state (08/06/2025)

❯ curl https://localai.io/install.sh | sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 33899    0 33899    0     0  51108      0 --:--:-- --:--:-- --:--:-- 51129
 [INFO]  Docker detected. 
 [INFO]  Docker detected and no installation method specified. Using Docker. 
 [INFO]  Installing LocalAI from container images 
 [INFO]  NVIDIA Container Toolkit already installed. 
 [INFO]  Starting Docker... 
Failed to start docker.service: Unit docker.service not found.

I have podman-docker and podman-compose on my Arch system for compatibility, so it should work; it mostly does for other things, apart from some edge cases (low-level stuff with security implications, where privileged=true or other flags usually fix it for me).

Anything else probably won't be LocalAI specific; podman supports the NVIDIA Container Toolkit and similar tooling. https://docs.nvidia.com/ai-enterprise/deployment/rhel-with-kvm/latest/podman.html

And I have gotten it running manually on podman before.
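A manual rootless invocation mirroring the quadlet file earlier in the thread might look like this (sketch only; image tag, port, and paths are taken from that file and should be adapted):

```shell
# Rootless podman run, equivalent to the [Container] section of
# the localai.container quadlet posted above.
podman run --rm \
  --device nvidia.com/gpu=all \
  -e DEBUG=true \
  -p 5000:8080 \
  -v /ai/localai/models:/build/models:Z \
  quay.io/go-skynet/local-ai:v2.20.1-cublas-cuda12-ffmpeg
```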

It would be awesome if it were tool agnostic (any "docker"-compatible shim could work that way).

JamesClarke7283 avatar Jun 08 '25 13:06 JamesClarke7283

This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Sep 10 '25 02:09 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar Sep 16 '25 02:09 github-actions[bot]