generative-ai-python

TypeError Could not create `Blob`, expected `Blob`, `dict` or an `Image` type(`PIL.Image.Image` or `IPython.display.Image`)

petrgazarov opened this issue 1 year ago · 4 comments

Description of the bug:

I'm getting this error intermittently. What does it mean? If I rerun the script, the error is not raised every time, only sometimes.

import asyncio

import google.generativeai as genai
from google.generativeai.types import HarmBlockThreshold, HarmCategory

REQUEST_TIMEOUT = 90

# model is `gemini-1.5-flash`
client = genai.GenerativeModel(model, system_instruction=system_instruction)

google_completion = await asyncio.wait_for(
    client.generate_content_async(
        messages,
        safety_settings={
            HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_NONE,
            HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_NONE,
            HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT: HarmBlockThreshold.BLOCK_NONE,
            HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: HarmBlockThreshold.BLOCK_NONE,
        },
    ),
    REQUEST_TIMEOUT,
)

Actual vs expected behavior:

Actual:

TypeError Could not create `Blob`, expected `Blob`, `dict` or an `Image` type(`PIL.Image.Image` or `IPython.display.Image`).
Got a: <class 'NoneType'>
Value: None

Expected: no error to be raised.

Any other information you'd like to share?

No response

petrgazarov avatar Jun 06 '24 06:06 petrgazarov

Right, that's not the best error message.

This happens at the step where the SDK converts your input into the classes the API actually uses.

It means you have a None in your list of contents, or in one of their lists of parts.

I should make these errors clearer. You'll need to track down where that None is coming from.
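To track that down, it can help to scan the contents list before calling generate_content. Here's a minimal sketch (the helper name is my own, not part of the SDK) that reports the index paths of any None entries, assuming contents is a list of strings or of dicts with a "parts" list:

```python
def find_none_parts(contents):
    """Return index paths of None entries in contents or their parts.

    Each path is a tuple: (content_index,) for a None content, or
    (content_index, part_index) for a None part inside a dict content.
    """
    bad = []
    for i, content in enumerate(contents):
        if content is None:
            bad.append((i,))
            continue
        # Dict-style contents carry their pieces under "parts".
        parts = content.get("parts", []) if isinstance(content, dict) else []
        for j, part in enumerate(parts):
            if part is None:
                bad.append((i, j))
    return bad
```

Calling find_none_parts(messages) right before the request will point at the offending element instead of failing inside the SDK's Blob conversion.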

MarkDaoust avatar Jun 10 '24 14:06 MarkDaoust

I am assuming you are using Chainlit here. I faced this error too. When passing the message to Gemini, it worked for me once I passed message.content instead of the message object itself. Check the following example:

# This function handles incoming messages from the user
@cl.on_message
async def on_message(message: cl.Message):
    global chat_session  # Use the chat session initialized in on_chat_start

    try:
        # Send the user message to the Gemini model using the existing chat session
        response = chat_session.send_message(message.content)
        print("reached here")
        print("response : ", response)

        # Get the model's response text (ensure it's plain text)
        response_text = response.text.strip()  # Strip any extra whitespace
        print("response_text : ", response_text)

        # Ensure the response is not empty
        if not response_text:
            response_text = "Sorry, I didn't get a valid response from the model."

        # Send the model's response back to the user in Chainlit
        await cl.Message(content=response_text).send()  # Ensure only plain text is sent

    except Exception as e:
        # Handle errors and display them to the user
        await cl.Message(content=f"Error: {str(e)}").send()

dev02chandan avatar Oct 17 '24 06:10 dev02chandan

I also encountered this error when using the Gemini API in a Dify workflow.

zengqingfu1442 avatar Dec 14 '24 15:12 zengqingfu1442

I ran into the same issue. I was reading the "Google Gen AI SDK" docs, which describe the conversions done when you pass text to generate_content. The docs describe all the types used internally, so I created my content using the Google types and hit this error. As it turns out, the SDK needs to convert the passed-in data itself; it can do that from many types, but not from Google's own internal types. At the end of the day, passing a plain dict with the role and content will do.
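For reference, the dict shape the SDK normalizes itself is roughly {"role": ..., "parts": [...]}, where text parts can be plain strings. A small sketch (the make_content helper is my own, for illustration) that builds contents this way and drops stray Nones up front:

```python
def make_content(role, *parts):
    """Build a plain-dict content entry, filtering out None parts."""
    cleaned = [p for p in parts if p is not None]
    if not cleaned:
        raise ValueError("content must have at least one non-None part")
    return {"role": role, "parts": cleaned}

# Plain dicts like these can be passed to generate_content directly,
# instead of constructing the SDK's internal protobuf types.
messages = [
    make_content("user", "Summarize this document."),
    make_content("model", "Sure, which document?"),
]
```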

dcssi avatar Apr 22 '25 01:04 dcssi