CaTTail
> Hi! @RickMeow
>
> You are using DDP, so you need to ensure that each GPU can fully load the Llama2-70b-QLoRA, but this is challenging for a 40GB GPU.
> ...
> Yes, 3*8*A100 (40G) is enough for fine-tuning llama-2-70B

Thank you for your efficient and enthusiastic answers! **I've used two different commands and I'm still getting OOM; is there something...
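A back-of-envelope estimate helps explain why 40GB per GPU is tight here: under DDP, every rank holds a full copy of the quantized base model, so the 4-bit weights alone consume most of the card before LoRA adapters, optimizer states, and activations are counted. This is an illustrative sketch (the parameter count and bit width are assumptions, not measurements):

```python
# Rough memory estimate for the quantized base weights of a large model
# under DDP, where each GPU rank keeps its own full copy.
# Illustrative numbers only; real usage also includes LoRA adapters,
# optimizer states, gradients, activations, and CUDA overhead.

def qlora_weight_memory_gb(n_params_billions: float, bits_per_param: float = 4.0) -> float:
    """Approximate memory (GB) needed for the quantized base weights alone."""
    return n_params_billions * 1e9 * bits_per_param / 8 / 1e9

base_gb = qlora_weight_memory_gb(70)  # 70B params at 4-bit
print(f"4-bit base weights: ~{base_gb:.0f} GB")
```

With roughly 35 GB of a 40GB A100 taken by the 4-bit weights alone, the remaining headroom is easily exhausted during fine-tuning, which is consistent with the OOM reported above.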