How to use drop conditioning during training?
Hi!
Most of the SD checkpoints mention "dropping of the text-conditioning to improve classifier-free guidance sampling." However, I couldn't find the config parameter or the code that does this. I would appreciate it if you could point me to it.
Also, do you drop conditioning for a whole batch in 10% of the cases or do you drop 10% of examples in the batch?
The easiest way to do this when training is inside your custom dataset class (note that this repo doesn't come with its own training code, so you'll have to write that yourself - perhaps this repo can help).
Basically, in your Dataset class's __getitem__ method, you should replace the prompt with an empty string 10% of the time. Since __getitem__ is called once per example, this drops conditioning per example, so each batch is a mix of conditional and unconditional samples (rather than dropping conditioning for a whole batch). It should look something like this:
prompt = impath[impath.rfind("/")+1:impath.rfind("-ID")] if random.random() > 0.1 else ""
(Careful with random.randint(0, 1) here: it only returns the integers 0 or 1, so comparing it to 0.1 would drop the prompt about half the time. random.random() returns a uniform float in [0, 1), giving the intended 10% drop rate.)
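Put together, a minimal sketch might look like the class below. This is a hypothetical example, not code from this repo: it assumes filenames of the form "some prompt-ID0001.png" (so the prompt is recovered from the path, as in the one-liner above), and it omits actual image loading. In a real setup the class would subclass torch.utils.data.Dataset and return the image tensor alongside the prompt.

```python
import random


class PromptDropDataset:
    """Hypothetical dataset that drops text conditioning per example.

    Filenames are assumed to look like "data/a photo of a cat-ID0042.png",
    so the prompt is everything between the last "/" and "-ID".
    """

    def __init__(self, image_paths, drop_prob=0.1):
        self.image_paths = image_paths
        self.drop_prob = drop_prob  # fraction of examples trained unconditionally

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, idx):
        impath = self.image_paths[idx]
        # Recover the prompt from the filename.
        prompt = impath[impath.rfind("/") + 1:impath.rfind("-ID")]
        # Drop the conditioning drop_prob of the time, independently per
        # example, so every batch mixes conditional and unconditional samples.
        if random.random() < self.drop_prob:
            prompt = ""
        return impath, prompt  # image loading omitted for brevity
```

Because the coin flip happens inside __getitem__, the 10% applies to individual examples rather than to whole batches.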