
[bug]: InvokeAI incorrectly recognizes the model type

Open · DeXtmL opened this issue 1 year ago · 1 comment

Is there an existing issue for this problem?

  • [X] I have searched the existing issues

Operating system

Windows

GPU vendor

Nvidia (CUDA)

GPU model

4090

GPU VRAM

24GB

Version number

4.2.0

Browser

Edge latest

Python dependencies

InvokeAI incorrectly recognizes the model type

What happened

InvokeAI refuses to load some models (they are unselectable), and editing the model type gives an error.

What you expected to happen

Non-standard models are not uncommon, so InvokeAI should at least allow the user to forcibly override the base model type. Better still, InvokeAI could provide a button to revert to the default in case the user makes a mistake.

How to reproduce the problem

Let's take the WildCardX-XL LoRA as an example:

  1. Download and import the LoRA from: https://civitai.com/models/264681?modelVersionId=298448
  2. Note that InvokeAI recognizes the model as an SD-1 LoRA, when it is actually an SDXL LoRA. (screenshot 1)
  3. Trying to edit the model type gives the error "Model Update Failed", and the type is then forcibly reverted to SD-1. (screenshot 2, screenshot 3)

[screenshot 1]

[screenshot 2]

[screenshot 3]
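For context, model managers typically infer the base model by probing tensor shapes inside the checkpoint, and non-standard key layouts can defeat such heuristics. A minimal sketch of that idea, assuming a shape-based probe (the key pattern, dimension thresholds, and `guess_base_model` helper are illustrative assumptions, not InvokeAI's actual detection code):

```python
# Hypothetical sketch of a shape-based base-model heuristic for LoRA files.
# SD-1's UNet cross-attention consumes 768-dim text embeddings, while SDXL
# consumes 2048-dim context, so the input dimension of certain "down" LoRA
# matrices hints at the base model.

def guess_base_model(tensor_shapes: dict) -> str:
    """Guess 'sd-1' or 'sdxl' from a mapping of tensor name -> shape tuple."""
    for name, shape in tensor_shapes.items():
        if "attn2_to_k.lora_down" in name and len(shape) == 2:
            if shape[1] == 768:
                return "sd-1"
            if shape[1] == 2048:
                return "sdxl"
    # Non-standard key layouts fall through, which is how a probe like this
    # can misclassify a model, as in this bug report.
    return "unknown"
```

A probe like this has no user-facing escape hatch, which is why a manual override (as requested above) matters.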

Additional context

No response

Discord username

No response

DeXtmL · May 26 '24 12:05

Error message:

[InvokeAI]::ERROR --> 1 validation error for LoRALyCORISConfig
variant
  Object has no attribute 'variant' [type=no_such_attribute, input_value=<ModelVariantType.Normal: 'normal'>, input_type=ModelVariantType]
    For further information visit https://errors.pydantic.dev/2.6/v/no_such_attribute
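Pydantic raises `no_such_attribute` when assignment validation is enabled and a value is assigned to an attribute that is not a declared field, which matches the `variant` attribute in the error above. A minimal, self-contained reproduction (the `LoRAConfigSketch` class is a hypothetical stand-in for InvokeAI's `LoRALyCORISConfig`):

```python
from pydantic import ConfigDict, ValidationError
from pydantic.dataclasses import dataclass


@dataclass(config=ConfigDict(validate_assignment=True))
class LoRAConfigSketch:
    # Stand-in for LoRALyCORISConfig; note there is no `variant` field.
    base: str


cfg = LoRAConfigSketch(base="sd-1")
try:
    # Assigning a field the config class does not declare fails validation,
    # just like the `variant` assignment in the reported error.
    cfg.variant = "normal"
except ValidationError as exc:
    print(exc.errors()[0]["type"])  # no_such_attribute
```

This suggests the update path is copying `variant` from the old (SD-1) config onto a config class that has no such field.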

DeXtmL · May 26 '24 12:05

[2024-12-11 18:09:12,490]::[ModelInstallService]::ERROR --> Model install error: black-forest-labs/FLUX.1-schnell InvalidModelConfigException: Unknown base model for /Volumes/frankXmini/frankmini/invokeai/models/tmpinstall_7z13gb4r/FLUX.1-schnell

frankvegastudio · Dec 11 '24 23:12