DriveLM
Is a 3090 GPU sufficient for inference?
Hello, I can fine-tune the model with bs=1 during training, but in the inference stage, even with bs=1, it runs out of memory, which is quite confusing. Are there any parameters that I forgot to set?
A possible reason is that during inference some questions are much longer than others, which requires more GPU memory. I would suggest finding out which cases cause the error.
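If it helps, here is a minimal sketch of how you could hunt down the offending samples by logging peak GPU memory per question. The names `model`, `tokenizer`, and `eval_samples` are placeholders (assuming a Hugging Face-style `generate` API), not the actual DriveLM inference script, so adapt them to your setup:

```python
import torch

def find_memory_heavy_samples(model, tokenizer, eval_samples, device="cuda"):
    """Run inference one sample at a time, recording peak GPU memory,
    so the questions that blow past 24 GB can be identified."""
    model.eval()
    stats = []
    for idx, question in enumerate(eval_samples):
        torch.cuda.reset_peak_memory_stats(device)
        inputs = tokenizer(question, return_tensors="pt").to(device)
        prompt_len = inputs["input_ids"].shape[1]
        try:
            with torch.no_grad():  # avoid storing activations for backward
                model.generate(**inputs, max_new_tokens=128)
            peak_gb = torch.cuda.max_memory_allocated(device) / 1024 ** 3
            stats.append((idx, prompt_len, peak_gb))
        except RuntimeError as e:
            if "out of memory" not in str(e):
                raise
            print(f"OOM on sample {idx} (prompt length {prompt_len} tokens)")
            torch.cuda.empty_cache()
    # The longest prompts / highest peaks are the likely culprits.
    top = sorted(stats, key=lambda s: s[2], reverse=True)[:10]
    for idx, prompt_len, peak_gb in top:
        print(f"sample {idx}: {prompt_len} tokens, peak {peak_gb:.2f} GB")
```

If a handful of long questions turn out to be responsible, truncating the prompt or lowering the maximum generation length should let the rest of the evaluation fit on a 3090.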