Julia
Hi, I am also trying to run Story-DALL-E on an NVIDIA V100, which has 32 GB. Even inference does not work; I get the following error: `RuntimeError: CUDA error: CUBLAS_STATUS_EXECUTION_FAILED when...
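In case it helps to narrow this down, here is a generic debugging sketch for this kind of CUDA error. It is plain PyTorch, not specific to Story-DALL-E, and only assumes the failure happens somewhere inside a forward pass on the GPU:

```python
import os
import torch

# Force synchronous kernel launches so the Python traceback points at the op
# that actually failed, rather than a later CUBLAS call (CUDA errors are
# normally reported asynchronously).
os.environ["CUDA_LAUNCH_BLOCKING"] = "1"

# Sanity checks: the installed torch/CUDA pair, the visible device, and how
# much memory is already allocated (an OOM can surface as a CUBLAS failure).
print(torch.__version__, torch.version.cuda)
print(torch.cuda.get_device_name(0))
print(torch.cuda.memory_allocated(0) / 2**30, "GiB allocated")
```

With `CUDA_LAUNCH_BLOCKING=1` the run is slower, but the stack trace usually names the exact layer that triggered the error.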
Could you please explain how this script works? I have put an image in Inference_Path and annotations from the COCO 2017 dataset in EDPOSE_COCO_PATH. I tried to run the script _[Virtualization...
I have already done that. I also downloaded the COCO annotations and put them in EDPOSE_COCO_PATH, but it doesn't work. Why does the inference code for a single image need the whole COCO...
Well, is there a simple way to just load the model and infer on a single image without using the dataloader? I tried to do this on a torch tensor,...
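To make the question concrete, this is roughly the pattern I have in mind, with a torchvision ResNet standing in for the actual model. The checkpoint handling, resize, and normalization values are placeholders, not the repo's real preprocessing:

```python
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models import resnet50  # stand-in; the real model would come from the repo

device = "cuda" if torch.cuda.is_available() else "cpu"

# Placeholder model; in practice this would be the repo's model with its checkpoint loaded.
model = resnet50(weights=None).to(device).eval()

# Assumed preprocessing: common ImageNet-style resize + normalization, not necessarily
# what the repo's dataloader actually does.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open("my_image.jpg").convert("RGB")     # hypothetical image path
batch = preprocess(img).unsqueeze(0).to(device)     # the batch dimension a dataloader would add

with torch.no_grad():
    out = model(batch)
print(out.shape)
```

The point is only the single-image path: preprocess, add a batch dimension, and call the model directly, without constructing a dataset or dataloader.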
It doesn't help, because modelscope has its own definition of UNet.