
AttributeError: 'ChatCompletionChunk' object has no attribute 'get'

kowlcode opened this issue on Jul 30, 2024 • 2 comments

Describe the bug

I am encountering an AttributeError while integrating LlamaIndex with Chainlit. I am trying to stream the response generated by the Chat Engine (GPT-4 and GPT-4o) while using the LlamaIndexCallbackHandler, but when the response message is finished, I get the following error:

AttributeError: 'ChatCompletionChunk' object has no attribute 'get'
2024-07-26 16:30:30 - 'ChatCompletionChunk' object has no attribute 'get'
[...]
  File "...\llama_index\core\callbacks\base.py", line 131, in on_event_end
    handler.on_event_end(event_type, payload, event_id=event_id, **kwargs)
  File "...\chainlit\llama_index\callbacks.py", line 163, in on_event_end
    model = raw_response.get("model", None) if raw_response else None
            ^^^^^^^^^^^^^^^^
[...]

Versions:
LlamaIndex: 0.10.58
Chainlit: 1.1.400
LLMs: GPT-4 and GPT-4o over AzureOpenAI
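
For context, the handler assumes a dict-like raw response, but the openai SDK returns Pydantic models, which have no .get(). A minimal sketch of the failure (the field values below are made up just to construct a dummy chunk):

from openai.types.chat.chat_completion_chunk import ChatCompletionChunk

# The openai SDK returns Pydantic models, not dicts, so .get() is not defined.
chunk = ChatCompletionChunk(
    id="chunk-1", choices=[], created=0,
    model="gpt-4o", object="chat.completion.chunk",
)
try:
    chunk.get("model", None)              # what callbacks.py does today
except AttributeError as e:
    print(e)                              # 'ChatCompletionChunk' object has no attribute 'get'
print(getattr(chunk, "model", None))      # attribute access works: gpt-4o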

Workaround

In chainlit/llama_index/callbacks.py, inside LlamaIndexCallbackHandler, I imported:

from openai.types.chat.chat_completion_chunk import ChatCompletionChunk

and added the following in on_event_end:

if raw_response:
    if isinstance(raw_response, ChatCompletionChunk):
        model = raw_response.model
    else:
        model = raw_response.get("model", None)
else:
    model = None

instead of this:

model = raw_response.get("model", None) if raw_response else None
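
A slightly more general variant (just an illustration, not the shipped patch) would cover both dict-style and Pydantic responses without importing the OpenAI types:

if raw_response is None:
    model = None
elif isinstance(raw_response, dict):
    model = raw_response.get("model", None)       # dict-style responses
else:
    model = getattr(raw_response, "model", None)  # Pydantic models (ChatCompletion, ChatCompletionChunk, ...)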

— kowlcode, Jul 30, 2024

@kowlcode I tried the same workaround steps as mentioned, but I'm still getting the same error. Are there any additional steps?

Traceback (most recent call last):
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/chainlit/utils.py", line 44, in wrapper
    return await user_function(**params_values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/chainlit/step.py", line 112, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/ipc_to_bns/app.py", line 328, in start
    res = agent.chat('IPC 510 in BNS')
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 260, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/callbacks/utils.py", line 41, in wrapper
    return func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/agent/runner/base.py", line 646, in chat
    chat_response = self._chat(
                    ^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 260, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/agent/runner/base.py", line 578, in _chat
    cur_step_output = self._run_step(
                      ^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 260, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/agent/runner/base.py", line 412, in _run_step
    cur_step_output = self.agent_worker.run_step(step, task, **kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 260, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/callbacks/utils.py", line 41, in wrapper
    return func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/agent/react/step.py", line 781, in run_step
    return self._run_step(step, task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/agent/react/step.py", line 567, in _run_step
    chat_response = self._llm.chat(input_chat)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 260, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/llms/openai_like/base.py", line 106, in chat
    return super().chat(messages, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 260, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/llms/callbacks.py", line 226, in wrapped_llm_chat
    callback_manager.on_event_end(
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/llama_index/core/callbacks/base.py", line 131, in on_event_end
    handler.on_event_end(event_type, payload, event_id=event_id, **kwargs)
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/chainlit/llama_index/callbacks.py", line 157, in on_event_end
    model = raw_response.get("model", None) if raw_response else None
            ^^^^^^^^^^^^^^^^
  File "/home/azureuser/miniconda3/envs/chainlit/lib/python3.11/site-packages/pydantic/main.py", line 828, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'ChatCompletion' object has no attribute 'get'
2024-08-10 06:41:36 - Translation file for en-IN not found. Using default translation en-US.

— Tejaswgupta, Aug 10, 2024

@Tejaswgupta You probably still ran into the error because the OP's code checks for the ChatCompletionChunk class, while in your case the response is a ChatCompletion.

You might want to try a slightly different workaround version like:

model = raw_response.model if hasattr(raw_response, "model") else None
# Instead of
# model = raw_response.get("model", None) if raw_response else None
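
For what it's worth, that line should cover both shapes seen in this thread (ChatCompletionChunk in the original report, ChatCompletion in the second traceback). A quick check with dummy objects (field values made up):

from openai.types.chat.chat_completion import ChatCompletion
from openai.types.chat.chat_completion_chunk import ChatCompletionChunk

completion = ChatCompletion(id="c-1", choices=[], created=0,
                            model="gpt-4", object="chat.completion")
chunk = ChatCompletionChunk(id="ck-1", choices=[], created=0,
                            model="gpt-4o", object="chat.completion.chunk")

for raw_response in (completion, chunk, None):
    model = raw_response.model if hasattr(raw_response, "model") else None
    print(model)  # gpt-4, gpt-4o, None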

— dvquy13, Aug 11, 2024

Seems like the above fix could be included in a PR, would be cool to get that in

— logan-markewich, Aug 16, 2024

> Seems like the above fix could be included in a PR, would be cool to get that in

@logan-markewich Opened the PR, thanks for the suggestion, Logan!

— dvquy13, Aug 17, 2024