Results 8 comments of throttlekitty

There's an `os.chdir("..")` call in the first(?) cell; I believe the notebook assumes you launched it from inside the inference folder. Try commenting it out, or just setting your cascade folder...
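A minimal sketch of that workaround (the `cascade_root` path is just an example from my setup; point it at your own checkout):

```python
import os

cascade_root = r"H:\ai\StableCascade"  # example path; change to your checkout

# The notebook's first cell calls os.chdir("..") because it assumes you
# launched Jupyter from inside the inference folder. If you launched from
# the repo root instead, comment that line out, or replace it with an
# explicit chdir to the cascade folder:
if os.path.isdir(cascade_root):
    os.chdir(cascade_root)
```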

Likely the same problem I had at first. Check that the config filenames match the models you downloaded; it's set to use `models/stage_#_bf16.safetensors`.

Looks like it's the prompt reader node. https://github.com/receyuki/comfyui-prompt-reader-node/issues/21

The notebook uses `configs/stage_c_3b.yaml`, which expects the bf16 version of the model; the same goes for the Stage B config. I had already manually downloaded the full versions, so it...

The config file `stage_c_3b.yaml` is already set to `generator_checkpoint_path: models/stage_c_bf16.safetensors`, and this is where I have the models located: `H:\ai\StableCascade\models\`. Good to know about the CLIP models, thanks. I did...
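For reference, the relevant key looks like this (an excerpt paraphrased from the comment above; other keys in the file are omitted):

```yaml
# configs/stage_c_3b.yaml (excerpt)
generator_checkpoint_path: models/stage_c_bf16.safetensors
```

If you downloaded the full-precision checkpoints instead of the bf16 ones, edit this filename to match what is actually in your models folder.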

I was just able to build torchmcubes on Windows 11 against torch 2.2.1+cu121. I have MSVC 2019 installed and used its x64 Native Tools Command Prompt. Clone [torchmcubes](https://github.com/tatsy/torchmcubes/) somewhere else, then...

> I cannot reproduce this issue, everything is working fine on my end. So I guess there might be a conflict between the prompt reader and another extension. It might...

A bit of a noob here, but I have a workaround. I had built llama.cpp with VS2022 using CMake, which left a `llama.cpp\bin\Releases` folder with the resulting DLL and EXE files,...