
CUDA out of memory

Open pppwzj opened this issue 3 years ago • 5 comments

Your team has done an excellent job. I would like to ask: when I train with four NVIDIA RTX 2080 GPUs and the batch_size set to the minimum of 4, I always get 'CUDA out of memory'. Are there any parameters in the model that can be reduced to solve this problem? Thank you very much.

pppwzj avatar Nov 05 '22 06:11 pppwzj

During training, you can reduce the parameters here and here to save memory.

lkeab avatar Nov 05 '22 17:11 lkeab
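Transfiner is built on detectron2, so as a minimal sketch (not the specific lines the maintainer links to, which are values inside the transfiner model code) the usual detectron2 config knobs for lowering training memory are batch size, training resolution, and mixed precision. The config path below is a placeholder, the resolution values are illustrative, and `SOLVER.AMP` availability depends on the detectron2 version bundled with the repo.

```python
# Hedged sketch: generic detectron2-style memory-saving overrides for training.
# Paths and values are placeholders, not transfiner defaults.
from detectron2.config import get_cfg

cfg = get_cfg()
cfg.merge_from_file("configs/transfiner/mask_rcnn_R_50_FPN_3x.yaml")  # hypothetical config path

cfg.SOLVER.IMS_PER_BATCH = 4        # total batch size across all GPUs
cfg.INPUT.MIN_SIZE_TRAIN = (640,)   # train at a smaller resolution than the usual 800
cfg.INPUT.MAX_SIZE_TRAIN = 1066
cfg.SOLVER.AMP.ENABLED = True       # mixed-precision training, if the bundled detectron2 supports it

cfg.freeze()
```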

> During training, you can reduce the parameters here and here to save memory.

During testing, how can I save memory? Thank you.

pppwzj avatar Nov 10 '22 04:11 pppwzj

> During training, you can reduce the parameters here and here to save memory.

Could you tell me whether reducing the LIMIT parameter will have any effect on the model? Will it reduce performance? Thank you.

pppwzj avatar Nov 11 '22 02:11 pppwzj

The performance decrease is limited as long as you do not reduce it extremely; for inference, you can refer to here.

lkeab avatar Nov 15 '22 15:11 lkeab
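For inference, a rough sketch of generic detectron2/PyTorch memory-saving options (not transfiner-specific code) is shown below: lower the test resolution and the number of kept detections, and release cached GPU blocks between images. The config path, weights path, and image path are hypothetical placeholders.

```python
# Hedged sketch: reducing GPU memory at inference time with standard detectron2 settings.
import cv2
import torch
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
cfg.merge_from_file("configs/transfiner/mask_rcnn_R_50_FPN_3x.yaml")  # hypothetical config path
cfg.MODEL.WEIGHTS = "output/model_final.pth"                          # hypothetical checkpoint
cfg.INPUT.MIN_SIZE_TEST = 600        # smaller test resolution than the usual 800
cfg.INPUT.MAX_SIZE_TEST = 1000
cfg.TEST.DETECTIONS_PER_IMAGE = 50   # fewer kept detections means fewer masks to refine

predictor = DefaultPredictor(cfg)    # DefaultPredictor already runs inference under torch.no_grad()

image = cv2.imread("demo.jpg")       # hypothetical input image (BGR, as detectron2 expects)
outputs = predictor(image)
torch.cuda.empty_cache()             # release cached blocks between images if memory is tight
```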

I reduced both limits from 30 to 10, but I still get CUDA out of memory on a 2080 Ti with 12 GB.

perp avatar Nov 29 '22 04:11 perp