[bug]: OSError: stable-diffusion-v1-5/stable-diffusion-v1-5 does not appear to have a file named config.json.
Is there an existing issue for this problem?
- [X] I have searched the existing issues
Operating system
Windows
GPU vendor
Nvidia (CUDA)
GPU model
RTX 3070
GPU VRAM
No response
Version number
5.3.1
Browser
Chrome 130.0.6723.92
Python dependencies
{ "accelerate": "1.0.1", "compel": "2.0.2", "cuda": "12.4", "diffusers": "0.31.0", "numpy": "1.26.4", "opencv": "4.9.0.80", "onnx": "1.16.1", "pillow": "11.0.0", "python": "3.10.9", "torch": "2.4.1+cu124", "torchvision": "0.19.1+cu124", "transformers": "4.41.1", "xformers": null }
What happened
When I attempt to generate with an SDXL ControlNet, I get the following in the console:
[2024-11-04 21:23:59,857]::[InvokeAI]::ERROR --> Error while invoking session 306f40a3-987c-410c-8e10-f3cf06d8c400, invocation 72f4d084-cf86-499d-b5d1-ec9905814de1 (denoise_latents): stable-diffusion-v1-5/stable-diffusion-v1-5 does not appear to have a file named config.json.
[2024-11-04 21:23:59,874]::[InvokeAI]::ERROR --> Traceback (most recent call last):
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\utils\_http.py", line 406, in hf_raise_for_status
response.raise_for_status()
File "F:\AI\InvokeAI\.venv\lib\site-packages\requests\models.py", line 1024, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/stable-diffusion-v1-5/stable-diffusion-v1-5/resolve/main/config.json
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "F:\AI\InvokeAI\.venv\lib\site-packages\diffusers\configuration_utils.py", line 379, in load_config
config_file = hf_hub_download(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\file_download.py", line 862, in hf_hub_download
return _hf_hub_download_to_cache_dir(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\file_download.py", line 925, in _hf_hub_download_to_cache_dir
(url_to_download, etag, commit_hash, expected_size, head_call_error) = _get_metadata_or_catch_error(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\file_download.py", line 1376, in _get_metadata_or_catch_error
metadata = get_hf_file_metadata(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\file_download.py", line 1296, in get_hf_file_metadata
r = _request_wrapper(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\file_download.py", line 277, in _request_wrapper
response = _request_wrapper(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\file_download.py", line 301, in _request_wrapper
hf_raise_for_status(response)
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\utils\_http.py", line 417, in hf_raise_for_status
raise _format(EntryNotFoundError, message, response) from e
huggingface_hub.errors.EntryNotFoundError: 404 Client Error. (Request ID: Root=1-672981c0-1e72c8e64f61a1ae2464aca9)
Entry Not Found for url: https://huggingface.co/stable-diffusion-v1-5/stable-diffusion-v1-5/resolve/main/config.json.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\app\services\session_processor\session_processor_default.py", line 129, in run_node
output = invocation.invoke_internal(context=context, services=self._services)
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\app\invocations\baseinvocation.py", line 298, in invoke_internal
output = self.invoke(context)
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\app\invocations\denoise_latents.py", line 812, in invoke
return self._old_invoke(context)
File "F:\AI\InvokeAI\.venv\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "C:\Users\jimmy\AppData\Local\Programs\Python\Python310\lib\contextlib.py", line 79, in inner
return func(*args, **kwds)
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\app\invocations\denoise_latents.py", line 1045, in _old_invoke
controlnet_data = self.prep_control_data(
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\app\invocations\denoise_latents.py", line 438, in prep_control_data
control_model = exit_stack.enter_context(context.models.load(control_info.control_model))
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\app\services\shared\invocation_context.py", line 375, in load
return self._services.model_manager.load.load_model(model, _submodel_type)
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\app\services\model_load\model_load_default.py", line 70, in load_model
).load_model(model_config, submodel_type)
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\backend\model_manager\load\load_default.py", line 56, in load_model
locker = self._load_and_cache(model_config, submodel_type)
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\backend\model_manager\load\load_default.py", line 77, in _load_and_cache
loaded_model = self._load_model(config, submodel_type)
File "F:\AI\InvokeAI\.venv\lib\site-packages\invokeai\backend\model_manager\load\model_loaders\controlnet.py", line 50, in _load_model
return ControlNetModel.from_single_file(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "F:\AI\InvokeAI\.venv\lib\site-packages\diffusers\loaders\single_file_model.py", line 268, in from_single_file
diffusers_model_config = cls.load_config(
File "F:\AI\InvokeAI\.venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "F:\AI\InvokeAI\.venv\lib\site-packages\diffusers\configuration_utils.py", line 406, in load_config
raise EnvironmentError(
OSError: stable-diffusion-v1-5/stable-diffusion-v1-5 does not appear to have a file named config.json.
What you expected to happen
I just expected it to generate using the SDXL ControlNet.
How to reproduce the problem
Start the app, select an SDXL model, add an SDXL ControlNet, click Invoke.
Additional context
No response
Discord username
No response
This same issue is popping up for me when I try to use the upscaling tab, with pretty much the exact same stack trace as OP. From what I can tell, the Hugging Face repo for stable-diffusion-v1-5/stable-diffusion-v1-5 has changed and no longer includes a top-level config.json file. From some trudging through the InvokeAI codebase (with my limited knowledge of it), this is more or less what happens:
It tries to fetch the diffusers config for a ControlNet checkpoint through fetch_diffusers_config, but for some reason it can't resolve what type the checkpoint is, so it ends up in the final "else" branch of that function, which returns "v1" by default, and then it goes off and tries to fetch SD1.5 files. I've tried three different checkpoints (base Pony v6, RainPonyXLv2, and a Noob spin-off merge) and all fail at this point. I changed single_file_model.py to print mapping_class_name at the start of from_single_file, and it printed "ControlNetModel" on the attempt that goes to Hugging Face and fails.
And this is where my knowledge ends. So either we don't actually need the SD1.5 config at all (which seems likely, since we're using SDXL models) and whatever ControlNetModel needs should be parsed correctly, OR ControlNetModel doesn't actually need anything and is accidentally being misidentified as "v1".
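The fallthrough described above can be illustrated with a simplified sketch. This is hypothetical: the key names and structure below are illustrative, not InvokeAI's actual fetch_diffusers_config implementation.

```python
# Hypothetical sketch of the kind of config-detection fallthrough described
# above: a helper keys off known state-dict entries and silently defaults to
# SD 1.5 ("v1") when nothing matches. Key names here are illustrative.

SD15_KEY = "model.diffusion_model.input_blocks.0.0.weight"
SDXL_KEY = "conditioner.embedders.1.model.ln_final.weight"

def fetch_diffusers_config(state_dict: dict) -> str:
    """Guess which diffusers repo a single-file checkpoint maps to."""
    if SDXL_KEY in state_dict:
        return "stabilityai/stable-diffusion-xl-base-1.0"
    if SD15_KEY in state_dict:
        return "stable-diffusion-v1-5/stable-diffusion-v1-5"
    # A ControlNet checkpoint contains neither key, so it falls through to
    # the "v1" default, and the loader then asks the SD1.5 repo for a
    # top-level config.json that the repo no longer serves -> 404.
    return "stable-diffusion-v1-5/stable-diffusion-v1-5"

# An SDXL ControlNet state dict has ControlNet-specific keys only:
controlnet_sd = {"controlnet_cond_embedding.conv_in.weight": None}
print(fetch_diffusers_config(controlnet_sd))
```

The point of the sketch is just that a default branch like this turns "unrecognized checkpoint" into "silently treat it as SD1.5", which matches the symptom in the stack trace.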
And I figure this only affects new users; older users should already have the files in their Hugging Face cache, and it seems likely they're loaded from there when available.
Hope to see a fix for this soon because right now I can't use the upscaler!
Found the root cause, at least for the upscaler: It tries to load the following controlnet: key='477387a4-2a3d-46f4-a890-366335cd0184' hash='blake3:2aa3205a5e9f25f2f7a52c18f685917206320964f3c9b366dc0be941235c9aa9' name='xinsir-controlnet-tile-sdxl-1.0' base=<BaseModelType.StableDiffusionXL: 'sdxl'> type=<ModelType.ControlNet: 'controlnet'> submodel_type=None
But this checkpoint cannot be identified through the CHECKPOINT_KEY_NAMES lookup, so it always falls back to "v1", when (at least for the xinsir tile model) the config should be loaded from xinsir/controlnet-tile-sdxl-1.0.
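For anyone loading the checkpoint through diffusers directly (outside InvokeAI), from_single_file accepts a config argument that pins the config repo and bypasses the auto-detection entirely. A minimal sketch, where the local checkpoint path is an example and not InvokeAI's actual layout:

```python
# Sketch: pin the config repo explicitly instead of relying on diffusers'
# auto-detection. The repo id comes from the comment above; the checkpoint
# path is a made-up example.
CONFIG_REPO = "xinsir/controlnet-tile-sdxl-1.0"
CHECKPOINT = "models/sdxl/controlnet/xinsir-controlnet-tile-sdxl-1.0.safetensors"

def load_tile_controlnet():
    # Imported lazily so the sketch stays lightweight; requires diffusers
    # installed and (on first run) network access to fetch the config.
    from diffusers import ControlNetModel
    return ControlNetModel.from_single_file(CHECKPOINT, config=CONFIG_REPO)
```

This is a workaround for standalone scripts, not a fix for InvokeAI itself, since InvokeAI makes the from_single_file call internally.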
Wow! Good job!
Is there a way to sort of 'force' it to load from xinsir?
The only way I've found to resolve this issue is to install whatever ControlNet you're trying to use via the model manager using its Hugging Face link; the model manager then properly imports the config.json file and won't try to download it anymore. ControlNets aren't THAT big, but it's still a major annoyance when you've got multiple SD installations that share models.
Exactly. Alright, I'll try that then. Thank you so much.
You can literally use the model manager to download the file, move it, and then try to add it; it will then try to use the file and throw this error.
Server Error
OSError: stable-diffusion-v1-5/stable-diffusion-v1-5 does not appear to have a file named config.json.
I can't resolve this issue either, and it's affecting every one of my ControlNets.
Using version 5.6.0, I have found a workaround which worked for me.
I had the same issue with the "QR Code Monster" ControlNets (both the SD1.5 and SDXL versions).
As someone above said, even downloading through the model manager (URL or Local Path menu) did not work when using a Civitai download URL.
But downloading through the Starter Models menu works, and I believe that's because it pulls from the Hugging Face repo. After downloading from there, both the SD1.5 and the SDXL QR Code Monster ControlNets work normally. Presumably it gets the right config from the repo.
This piece of software is already annoying with its errors. It's easier to use ComfyUI, which runs far more stably even with thousands of installed nodes. It would be okay if the errors were logical, but it breaks on trivial things and offers no solution to the problem. The problem is already half a year old; at the very least put a hint in the console about what to do. I'm fucking sick of this program: if it doesn't work, you can't do anything about it. Python is easier to figure out. It needs a config? Fine, I get it. What next? Where do I put it? There's nothing in the program folder that hints at anything.
Same issue , is there a solution , can't use invoke anymore
Have you tried downloading from Hugging Face?
My ControlNet models were downloaded from Civitai: outfit2outfit, control_v1p_sd15_illumination, and control_v1p_sd15_brightness. They are all safetensors files without config.json files. Even if I upload them to Hugging Face and download them from the Invoke model manager, I always get the same error message. There was no problem with them in the old versions of Invoke (the last one I had was the version without the community launcher), and the models work fine in ComfyUI.
Same problem with one of the controlnets listed above, downloaded from civitai. Tried downloading it via model manager from HF library, no dice. There should be a simple (if wrong) way to work around this. Any other ideas?
Looking a little more: either way I'm just getting a safetensors file, which the model manager deposits in models/sd-1/controlnet, not in its own subdir, and no config.json comes with it. Presumably one could move the file into a subdir, if only we had the config.json to work with.
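A sketch of that manual workaround, assuming you know which Hugging Face repo holds a matching config. The repo id below is only an example; picking the wrong config repo would just produce a different failure.

```python
# Sketch of the manual workaround: give the bare .safetensors checkpoint its
# own subdirectory, then fetch a config.json for it from a Hugging Face repo.
# The repo id passed to fetch_config is whatever repo actually matches your
# checkpoint (an assumption you have to supply yourself).
import shutil
from pathlib import Path

def make_model_dir(checkpoint: Path, models_root: Path) -> Path:
    """Move a bare .safetensors file into its own subdirectory."""
    target_dir = models_root / checkpoint.stem
    target_dir.mkdir(parents=True, exist_ok=True)
    shutil.move(str(checkpoint), target_dir / checkpoint.name)
    return target_dir

def fetch_config(repo_id: str, target_dir: Path) -> Path:
    # Requires network access and huggingface_hub installed.
    from huggingface_hub import hf_hub_download
    return Path(hf_hub_download(repo_id, "config.json", local_dir=target_dir))

# Example (hypothetical paths and repo id):
#   d = make_model_dir(Path("models/sd-1/controlnet/mynet.safetensors"),
#                      Path("models/sd-1/controlnet"))
#   fetch_config("lllyasviel/control_v11p_sd15_canny", d)
```

Whether InvokeAI then picks up the sidecar config depends on its loader, so treat this as an experiment rather than a confirmed fix.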