
Analysis of Agent Zero Image Generation Issues

Open keyboardstaff opened this issue 6 months ago • 2 comments

I encountered an interesting situation while using Agent Zero for image generation. When I requested Agent Zero to help generate images, it initially informed me that the current version does not support image generation functionality. However, Agent Zero simultaneously suggested that I could use Python to call image generation libraries such as Stable Diffusion.

So I asked Agent Zero to use Stable Diffusion directly to generate images for me. Surprisingly, it automatically downloaded Stable Diffusion and the related models, and began invoking Stable Diffusion for image generation. During the actual generation process, however, the system encountered an error.

Additional Information: I am currently using the 0.94 test version with the OpenRouter API.

Subsequent Discovery: I also found that two simple images had been generated in the root/generated_images directory.
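For context on the error below: OpenRouter's "No endpoints found that support image input" 404 is returned when a chat request carries an image part but the selected model has no endpoint that accepts images. A minimal sketch of the OpenAI-style multimodal message shape that litellm forwards to OpenRouter (the helper name is mine, not Agent Zero's code):

```python
import base64


def build_image_message(prompt: str, image_bytes: bytes, mime: str = "image/png") -> dict:
    """Build an OpenAI-style multimodal user message: a text part plus a
    base64 data-URL image part. Sending a message shaped like this to a
    text-only model is what triggers OpenRouter's 404."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url", "image_url": {"url": f"data:{mime};base64,{b64}"}},
        ],
    }
```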

litellm.NotFoundError: NotFoundError: OpenrouterException - {"error":{"message":"No endpoints found that support image input","code":404}}
Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 111, in _make_common_async_call
    response = await async_httpx_client.post(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 135, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 324, in post
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 280, in post
    response.raise_for_status()
  File "/opt/venv/lib/python3.12/site-packages/httpx/_models.py", line 829, in raise_for_status
httpx.HTTPStatusError: Client error '404 Not Found' for url 'https://openrouter.ai/api/v1/chat/completions'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 538, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 600, in acompletion_stream_function
    completion_stream, _response_headers = await self.make_async_call_stream_helper(
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 659, in make_async_call_stream_helper
    response = await self._make_common_async_call(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

>>>  8 stack lines skipped <<<

  File "/a0/models.py", line 317, in unified_call
    _completion = await acompletion(
                  ^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1552, in wrapper_async
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1410, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 557, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2293, in exception_type
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2173, in exception_type
    raise NotFoundError(
litellm.exceptions.NotFoundError: litellm.NotFoundError: NotFoundError: OpenrouterException - {"error":{"message":"No endpoints found that support image input","code":404}}


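Since litellm surfaces this 404 as litellm.exceptions.NotFoundError, the caller (unified_call in models.py, per the trace) could in principle catch it and retry with a vision-capable model. A hedged sketch of that retry logic; the exception class is a local stand-in so the snippet is self-contained, and the model names and fake caller are placeholders, not Agent Zero's actual implementation:

```python
class NotFoundError(Exception):
    """Local stand-in for litellm.exceptions.NotFoundError."""


def call_with_fallback(call_fn, messages, models):
    """Try each model in order; skip models whose endpoint rejects the
    request with NotFoundError (e.g. OpenRouter's "No endpoints found that
    support image input") and return the first successful response."""
    last_err = None
    for model in models:
        try:
            return call_fn(model=model, messages=messages)
        except NotFoundError as e:
            last_err = e  # no endpoint for this input type; try the next model
    raise last_err if last_err else ValueError("no models given")


# usage with a fake caller that rejects the first (text-only) model
def fake_call(model, messages):
    if model == "text-only-model":
        raise NotFoundError("No endpoints found that support image input")
    return f"ok from {model}"
```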
keyboardstaff avatar Aug 10 '25 09:08 keyboardstaff

However, I also found that two simple images had been generated in the root/generated_images directory.

(screenshots of the generated images attached)

keyboardstaff avatar Aug 10 '25 09:08 keyboardstaff

Very nice, maybe the model you were using didn't support image input.
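One way to check that up front: OpenRouter's public model listing (GET https://openrouter.ai/api/v1/models) reports each model's supported input modalities, so you can pick a model that advertises "image" input. A small sketch; the architecture/input_modalities field names follow OpenRouter's published schema as I understand it (verify against the live response), and the sample payload is illustrative, not real data:

```python
def vision_models(listing: dict) -> list:
    """Return the IDs of models whose architecture advertises image input.
    `listing` is the parsed JSON body of GET /api/v1/models."""
    ids = []
    for model in listing.get("data", []):
        modalities = model.get("architecture", {}).get("input_modalities", [])
        if "image" in modalities:
            ids.append(model["id"])
    return ids


# illustrative payload shaped like the models-endpoint response
sample = {
    "data": [
        {"id": "some/text-model", "architecture": {"input_modalities": ["text"]}},
        {"id": "some/vision-model", "architecture": {"input_modalities": ["text", "image"]}},
    ]
}
```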

linuztx avatar Aug 17 '25 22:08 linuztx