memory problem
Hello. How much memory do you need to run?
RuntimeError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 6.00 GiB total capacity; 4.78 GiB already allocated; 0 bytes free; 4.82 GiB reserved in total by PyTorch)
Any solution to run with a small amount of memory?
some stuff that can help (from https://t.co/KctR4TMotT)
- in text2img.py, line 18, change map_location="cpu" for map_location="cuda"
- you can remove line 29 and add model.half()
- then, add the following between lines 119 and 120:
  with torch.cuda.amp.autocast():
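The half-precision change alone halves the weight footprint (the log below reports a DiffusionWrapper with ~872 M params, so fp32 -> fp16 saves roughly 1.7 GiB for weights). A minimal sketch of the idea with a stand-in torch module, not the actual LatentDiffusion model:

```python
import torch
import torch.nn as nn

# Stand-in model; any nn.Module works the same way.
model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))

def param_bytes(m):
    # total bytes occupied by the model's parameters
    return sum(p.numel() * p.element_size() for p in m.parameters())

fp32_bytes = param_bytes(model)
model.half()  # convert all parameters to float16 in place
fp16_bytes = param_bytes(model)

assert fp16_bytes * 2 == fp32_bytes  # exactly half the storage
```

The autocast() context then runs matmuls/convolutions in fp16 during sampling as well, cutting activation memory on top of the weight savings.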
@vipermu
The process has begun, although the result is green squares

I'm unable to run this on a 6 GB graphics card due to the same memory error, even with the proposed changes.
Is there anything else I could try, or a way to run a smaller model?
Unfortunately, reasonable results can only be achieved with 8 GB or more of VRAM.
Nevertheless, you can try reducing the number of pictures by lowering the --n_samples and --n_iter values.
You could also lower the output resolution of the images. For example, try this:
python scripts/txt2img.py --prompt "a dog eating a hamburger in space" --ddim_eta 0.0 --n_samples 3 --n_iter 3 --scale 5.0 --ddim_steps 100 --H 192 --W 192
Same here, the process gets killed right after the start.
python scripts/txt2img.py --prompt "a dog eating a hamburger in space" --ddim_eta 0.0 --n_samples 3 --n_iter 3 --scale 5.0 --ddim_steps 100 --H 192 --W 192
Loading model from models/ldm/text2img-large/model.ckpt
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 872.30 M params.
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
[1] 12177 killed python scripts/txt2img.py --prompt "a dog eating a hamburger in space" 0.0
The machine itself
CPU: 11th Gen Intel i9-11900H (16) @ 4.800GHz
GPU: Intel TigerLake-H GT1 [UHD Graphics]
GPU: NVIDIA GeForce RTX 3060 Mobile / Max-Q
Memory: 5752MiB / 15678MiB (36%)
I wonder if it's because the script is trying to use my Intel graphics card instead of the NVIDIA one.
@adelin-b I'm thinking the same thing, did you find a solution?
@adelin-b Looks like with os.environ["CUDA_VISIBLE_DEVICES"] = "<YOUR_GPU_NUMBER_HERE>" my laptop is using the correct GPU.
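For this to work, the environment variable has to be set before anything initializes CUDA. A minimal sketch (the index "0" is an assumption; the right value depends on your machine, check nvidia-smi for the device ordering):

```python
import os

# Hide all GPUs except the chosen one; must happen before torch
# (or any CUDA library) is imported and initializes the driver.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # assumed index, verify with nvidia-smi

import torch  # imported only after the env var is set

if torch.cuda.is_available():
    # With the variable set, device 0 is the selected card.
    print(torch.cuda.get_device_name(0))
```

Alternatively, set it on the shell line itself: CUDA_VISIBLE_DEVICES=0 python scripts/txt2img.py ...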
@hiop Hi, have you solved the problem of running out of memory?
Yes, I bought a 4070.
@hiop ok, thank you