
[bug]: AttributeError: 'NoneType' object has no attribute 'tokens_count_including_eos_bos'

Open nelsonre opened this issue 3 years ago • 2 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues

OS

macOS

GPU

mps

VRAM

64

What happened?

When using the inpainting model on the unified canvas, I get the following error referencing tokens_count_including_eos_bos:

Traceback (most recent call last):
  File "/Users/xxxxxx/Documents/invokeAI-installer/backend/invoke_ai_web_server.py", line 1124, in generate_images
    self.generate.prompt2image(
  File "/Users/xxxxxx/Documents/invokeAI-installer/ldm/generate.py", line 488, in prompt2image
    results = generator.generate(
  File "/Users/xxxxxx/Documents/invokeAI-installer/ldm/invoke/generator/base.py", line 98, in generate
    image = make_image(x_T)
  File "/Users/xxxxxx/Documents/invokeAI-installer/ldm/invoke/generator/omnibus.py", line 123, in make_image
    samples, _ = sampler.sample(
  File "/Users/xxxxxx/Documents/invokeAI-installer/installer_files/env/envs/invokeai/lib/python3.10/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/Users/xxxxxx/Documents/invokeAI-installer/ldm/models/diffusion/ksampler.py", line 211, in sample
    attention_map_token_ids = range(1, extra_conditioning_info.tokens_count_including_eos_bos - 1)
AttributeError: 'NoneType' object has no attribute 'tokens_count_including_eos_bos'

The error appears to happen only with the samplers whose names start with k_*; "ddim" and "plms" don't trigger it.
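
For reference, the failure reduces to a bare attribute access on None; the snippet below is a minimal standalone reproduction, not InvokeAI code.

# Minimal reproduction: the attribute lookup on None is what raises,
# independent of which sampler produced the None.
extra_conditioning_info = None  # what ksampler.py apparently receives here

try:
    attention_map_token_ids = range(
        1, extra_conditioning_info.tokens_count_including_eos_bos - 1
    )
except AttributeError as err:
    print(err)  # 'NoneType' object has no attribute 'tokens_count_including_eos_bos'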

Screenshots

No response

Additional context

No response

Contact Details

No response

nelsonre · Dec 11 '22 00:12

I made small changes to ksampler.py to get the other samplers working, though I'm not certain this is the correct fix.

diff --git a/ldm/models/diffusion/ksampler.py b/ldm/models/diffusion/ksampler.py
index 894be54b..fd827575 100644
--- a/ldm/models/diffusion/ksampler.py
+++ b/ldm/models/diffusion/ksampler.py
@@ -208,7 +208,9 @@ class KSampler(Sampler):
         model_wrap_cfg = CFGDenoiser(self.model, threshold=threshold, warmup=max(0.8*S,S-10))
         model_wrap_cfg.prepare_to_sample(S, extra_conditioning_info=extra_conditioning_info)
 
-        attention_map_token_ids = range(1, extra_conditioning_info.tokens_count_including_eos_bos - 1)
+        if extra_conditioning_info is not None:
+            attention_map_token_ids = range(1, extra_conditioning_info.tokens_count_including_eos_bos - 1)
+            attention_maps_callback = None
         attention_maps_saver = None if attention_maps_callback is None else AttentionMapSaver(token_ids = attention_map_token_ids, latents_shape=x.shape[-2:])
         if attention_maps_callback is not None:
             model_wrap_cfg.invokeai_diffuser.setup_attention_map_saving(attention_maps_saver)
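
For comparison, here is a small standalone sketch of the same null-guard idea; the class and helper names below (ExtraConditioningInfo, token_ids_for_attention_maps) are stand-ins for illustration, not the actual InvokeAI code.

# Standalone sketch of the null-guard pattern; stand-in names, not InvokeAI code.
from typing import Optional, Sequence


class ExtraConditioningInfo:
    # Stand-in for the object ksampler.py expects; only the field used here.
    def __init__(self, tokens_count_including_eos_bos: int):
        self.tokens_count_including_eos_bos = tokens_count_including_eos_bos


def token_ids_for_attention_maps(
    extra_conditioning_info: Optional[ExtraConditioningInfo],
) -> Optional[Sequence[int]]:
    # When no conditioning info is supplied (as seems to happen with the k_*
    # samplers on the unified canvas), skip attention-map bookkeeping instead
    # of dereferencing None.
    if extra_conditioning_info is None:
        return None
    # Drop the BOS/EOS positions, matching the original range(1, count - 1).
    return range(1, extra_conditioning_info.tokens_count_including_eos_bos - 1)


# The guard returns None rather than raising AttributeError:
assert token_ids_for_attention_maps(None) is None
assert list(token_ids_for_attention_maps(ExtraConditioningInfo(77))) == list(range(1, 76))

Whether falling back to None is the right behavior, or whether the conditioning info should never be None at this point in the first place, is probably for the maintainers to decide.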

nelsonre · Dec 11 '22 01:12

I'm having the same issue. OS: Windows, GPU: cuda, VRAM: 6GB

Caio-Sc · Dec 11 '22 08:12