Kostya
I've faced the same problem of missing rfft/irfft ops and complex-number support in Core ML while trying to convert the Spleeter model from Deezer. I worked around it by...
You can store the `samples` returned by `samples = sampler.decode(z_enc, c, t_enc, unconditional_guidance_scale=opt.scale, unconditional_conditioning=uc,)` and later pass it instead of `init_latent` here: `z_enc = sampler.stochastic_encode(init_latent, torch.tensor([t_enc]*batch_size).to(device))` — see the sketch below.
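A minimal sketch of that idea, assuming it sits inside `scripts/img2img.py` where `sampler`, `c`, `uc`, `t_enc`, `opt`, `batch_size`, `device` and `init_latent` are already defined; `num_refine_passes` is a hypothetical name for how many times you want to re-encode, not a flag from the script:

```python
# Sketch only, not a drop-in patch for scripts/img2img.py.
prev_latent = init_latent
for _ in range(num_refine_passes):  # hypothetical loop count
    # re-noise the previous result up to step t_enc ...
    z_enc = sampler.stochastic_encode(
        prev_latent, torch.tensor([t_enc] * batch_size).to(device))
    # ... denoise it again with the same conditioning ...
    samples = sampler.decode(
        z_enc, c, t_enc,
        unconditional_guidance_scale=opt.scale,
        unconditional_conditioning=uc)
    # ... and feed the decoded latent back in instead of init_latent
    prev_latent = samples
```

Each pass re-noises the previous result and denoises it again, so the output drifts further from the original `init_latent` with every iteration.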
I'm not sure, but have you tried the models `semantic_synthesis512` and `semantic_synthesis256` from https://github.com/CompVis/stable-diffusion/blob/main/scripts/download_models.sh ?
`--n_samples` (a.k.a. batch size) is literally a batch size: increasing it means trying to fit more and more data onto your GPU in a single run, whereas increasing `n_iter` on...
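To illustrate the trade-off with made-up numbers (not measurements of the actual script): both plans below produce 8 images in total, but peak GPU memory grows only with `n_samples`, because the `n_iter` repetitions run one after another.

```python
# Hypothetical illustration; mem_per_image_gb is an arbitrary placeholder.
def total_and_peak(n_samples: int, n_iter: int, mem_per_image_gb: float = 1.5):
    total_images = n_samples * n_iter          # how many images you get overall
    peak_gpu_gb = n_samples * mem_per_image_gb  # only the current batch is resident
    return total_images, peak_gpu_gb

print(total_and_peak(n_samples=2, n_iter=4))   # (8, 3.0)
print(total_and_peak(n_samples=8, n_iter=1))   # (8, 12.0)
```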
No, it can't. I think you're confusing the training phase (where changing the batch size actually can drastically affect the resulting model quality) with the inference phase (where changing the batch size only affects...
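A quick sanity check of that point with a toy PyTorch model (hypothetical, not the Stable Diffusion code): in eval mode, running inputs in one batch or one at a time gives the same outputs, so batch size at inference changes memory and throughput, not the results themselves.

```python
import torch
import torch.nn as nn

# Toy model just for the demo; any deterministic eval-mode model behaves the same.
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4)).eval()
x = torch.randn(4, 8)

with torch.no_grad():
    batched = model(x)                                            # batch size 4
    one_by_one = torch.cat([model(x[i:i + 1]) for i in range(4)])  # batch size 1

print(torch.allclose(batched, one_by_one, atol=1e-6))  # True
```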
@jshaw Thank you for your answer! Evaluating from the checkpoint folder works for me too, but what I was actually asking is how to convert the mess in...