macOS Support
Steps to Reproduce
- clone the repo
- copy model.ckpt to models
- cd ./AUTOMATIC1111
- docker compose up --build
The earlier steps complete without problems, but the last step fails with this error:
Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
[+] Running 2/2
⠿ Network automatic1111_default Created 0.0s
⠿ Container automatic1111-model-1 Created 0.1s
Attaching to automatic1111-model-1
Error response from daemon: could not select device driver "nvidia" with capabilities: [[gpu]]
Hardware / Software:
- macOS 12.5, M1 Pro chip
- GPU: none (no NVIDIA GPU)
What do I need to do to fix this?
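For context: Docker Desktop on macOS does not provide an NVIDIA runtime, which is why the GPU device reservation in the compose file fails here. If you want to confirm that no nvidia runtime is available on your machine, something like this should work (it prints the runtimes the Docker daemon knows about):

docker info --format '{{json .Runtimes}}'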
@yuriten Mac support is a big subject. I know that the lstein fork has Mac support, but I cannot integrate it because I don't have a Mac.
If you or anyone would like to contribute and implement it I will gladly add it to the repo.
Related: #31
There's also SD in app form for Mac (Apple Silicon) now: https://github.com/divamgupta/diffusionbee-stable-diffusion-ui. That's probably the easiest way to get SD running there while Docker ARM support is lagging behind.
gm, checking in briefly. Has any progress been made on Apple Silicon (M1) support since Sept 2022?
@AdamGoyer Running the AUTOMATIC1111 UI without Docker, following the wiki docs, has been the easiest and most feature-rich way to run the SD Web UI on an M1 Mac out of what I have tried. It works quite well on the 16GB RAM model at least, and it uses Python machine learning packages customized for Mac hardware.
GPU passthrough for a (Docker) VM is not available there and might take a while to happen.
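For reference, here is a minimal sketch of that native (non-Docker) route, assuming Homebrew is installed; the dependency list follows the project's Apple Silicon wiki page and may have changed since:

# install build dependencies (per the Apple Silicon wiki; list may differ today)
brew install cmake protobuf rust python@3.10 git wget
# clone the web UI; webui.sh sets up a venv and pulls the Mac-specific PyTorch build on first run
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git
cd stable-diffusion-webui
./webui.sh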
Thank you for the follow-up @jasalt, I put up a $50 bounty on Replit for someone to work on it. Maybe we'll get lucky. :-) https://replit.com/bounties/@adamgoyer/automatic-1111-docke I'd like to test the Dockerized version if possible because, for my day job, we're building a distributed network that uses Docker containers as a key component. Hopefully this will be a fun challenge for someone.
@AdamGoyer Haha, nice. It would be cool for sure, but I wonder if it's even possible without Docker adding GPU passthrough like it has for WSL2, which only Parallels and virgl seem to be able to do to some extent (ref https://apple.stackexchange.com/a/453103/29042). Or maybe use some tricks to use PyTorch etc. from the Mac side.
@yuriten You need to remove the nvidia/GPU-related dependencies from https://github.com/AbdBarho/stable-diffusion-webui-docker/blob/master/docker-compose.yml, since your machine doesn't have an NVIDIA GPU, as shown below.
version: '3.9'

x-base_service: &base_service
  ports:
    - "${WEBUI_PORT:-7860}:7860"
  volumes:
    - &v1 ./data:/data
    - &v2 ./output:/output
  stop_signal: SIGKILL
  tty: true
  deploy:
    resources:
      reservations:
        devices: []

name: webui-docker

services:
  download:
    build: ./services/download/
    profiles: ["download"]
    volumes:
      - *v1

  invoke: &invoke
    <<: *base_service
    profiles: ["invoke"]
    build: ./services/invoke/
    image: sd-invoke:30
    environment:
      - PRELOAD=true
      - CLI_ARGS=--xformers

  comfy: &comfy
    <<: *base_service
    profiles: ["comfy"]
    build: ./services/comfy/
    image: sd-comfy:3
    environment:
      - CLI_ARGS=--cpu

  comfy-cpu:
    <<: *comfy
    profiles: ["comfy-cpu"]
    deploy: {}
    environment:
      - CLI_ARGS=--cpu
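With the GPU reservation emptied out, a CPU-only profile can then be started the usual way, for example (using the comfy-cpu profile from the file above):

docker compose --profile comfy-cpu up --build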
Docker Compose configuration: check your Docker Compose configuration to make sure that you have specified the correct runtime and options for NVIDIA GPUs. Here is an example snippet you might include in your docker-compose.yml file:

services:
  automatic1111-model-1:
    runtime: nvidia
    environment:
      - NVIDIA_VISIBLE_DEVICES=all
    # ... other configurations ...
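Note that runtime: nvidia only helps on hosts that actually have an NVIDIA GPU and the NVIDIA Container Toolkit installed; it will not do anything on an Apple Silicon Mac. On an NVIDIA host, container GPU access can be sanity-checked with something along these lines (the CUDA image tag here is only an example; any recent nvidia/cuda tag that includes nvidia-smi works):

docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi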
@jasalt I made a Docker image that spins up locally (Mac M3) and launches AUTOMATIC1111/stable-diffusion-webui on localhost:7860.
I got it to work using the VS Code method from that link, but that only worked inside VS Code because Gradio is so much fun. So I hacked together an image you can build and run locally without requiring VS Code; it just needs these simple commands:
docker build -t sdwebui:lite -f Dockerfile .
docker run -it -p 7860:7860 sdwebui:lite
This has been tested and works, but the Dockerfile is a mess haha. Definitely not something good enough to submit as a PR yet. Whenever I have the time I'll clean it up and put in a PR, or anyone else who wants to make it prettier / remove duplicate logic can do so. Until then, I'm currently in the process of pushing it to Docker Hub:
docker image tag sdwebui:lite letsstartdocking/sdwebui:lightonmodels
docker push letsstartdocking/sdwebui:lightonmodels
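Once that push goes through, it should be possible to try the image on an Apple Silicon Mac without building it locally, along these lines (tag taken from the commands above):

docker pull letsstartdocking/sdwebui:lightonmodels
docker run -it -p 7860:7860 letsstartdocking/sdwebui:lightonmodels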
All this to say -- it's possible, I have the code; I just need to find the time so the PR isn't embarrassingly ugly haha.
Anyone who wants access to the really bad Dockerfile can have it (just no shaming until it's cleaned up haha).
The PR is open; please see the notes/comments for discussion.
MacBook Pro (M1 chip): the build fails with the error: could not select device driver "nvidia" with capabilities: [[compute utility]]