[BUG] OpenAI internal call broken
Bug Description
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'reasoning.effort' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'reasoning.effort', 'code': 'unsupported_parameter'}}
Steps to Reproduce
import os

from graphiti_core import Graphiti
from graphiti_core.llm_client.config import LLMConfig
from graphiti_core.llm_client.openai_client import OpenAIClient

# openai_api_key and the NEO4J_* values are assumed to be defined elsewhere
openai_llm_config = LLMConfig(
    model=os.getenv("OPENAI_MODEL", "gpt-4o"),
    api_key=openai_api_key,
    max_tokens=128000,
    small_model="gpt-4o",
)
openai_llm_client = OpenAIClient(config=openai_llm_config)

# Use OpenAI as default
llm_client = openai_llm_client
print("Using OpenAI client...")

graphiti = Graphiti(NEO4J_URI, NEO4J_USER, NEO4J_PASSWORD, llm_client=llm_client)
Expected Behavior
The call should succeed. This is probably something OpenAI changed that isn't handled when calling the OpenAI client; it shouldn't be an issue at all.
I see the default value is 'gpt-5-mini', which makes sense now, since the other models do not have a "reasoning" attribute. Can we add a fallback without the reasoning argument to enable other OpenAI models again? Thanks.
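The fallback suggested above could be sketched roughly like this: only attach the reasoning parameter when the target model supports it. This is a minimal illustration, not graphiti-core's actual API; the model-name prefixes, the helper names, and the kwargs shape are all my assumptions.

```python
# Hypothetical fallback: omit 'reasoning.effort' for models that reject it.
# The prefix list below is an assumption about which model families accept
# the reasoning parameter; adjust it to match OpenAI's current lineup.
REASONING_MODEL_PREFIXES = ("o1", "o3", "o4", "gpt-5")


def supports_reasoning(model: str) -> bool:
    """Guess whether a model accepts the 'reasoning.effort' parameter."""
    return model.startswith(REASONING_MODEL_PREFIXES)


def build_completion_kwargs(model: str, messages: list, effort: str = "medium") -> dict:
    """Build request kwargs, adding 'reasoning' only for models that support it."""
    kwargs = {"model": model, "messages": messages}
    if supports_reasoning(model):
        kwargs["reasoning"] = {"effort": effort}
    return kwargs
```

With something like this in the client, gpt-4o and gpt-4.1 requests would simply not carry the offending parameter, while gpt-5-family requests would keep it.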
I am having the same error!
Same error with the same package.
I have the same error.
Same error for me too; I'm using gpt-4.1.
graphiti_core.llm_client.openai_base_client - ERROR - Error in generating LLM response: Error code: 400 - {'error': {'message': "Unsupported parameter: 'reasoning.effort' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'reasoning.effort', 'code': 'unsupported_parameter'}}
Same on this side, trying to use gpt-5-mini.
@GioPetro Is this still an issue? Please confirm within 14 days or this issue will be closed.
Same error. What do I need to do?