crewAI
Allow local LLMs to use OpenAI-style tool calls
I'm running an LLM locally, and it exposes an OpenAI-compatible API. I'm also using specific models for function calls to get a better success rate.
But right now the tool-usage path is hard-coded (see the converter excerpt below). Do you think it could be made configurable?
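For context, the setup looks roughly like this. This is a sketch only: the endpoint URL, API key, and model name are placeholders, and the `ChatOpenAI` client from `langchain_openai` is assumed.

```python
from langchain_openai import ChatOpenAI

# Locally served, OpenAI-compatible model; URL, key, and model name are placeholders.
local_llm = ChatOpenAI(
    openai_api_base="http://localhost:8000/v1",  # local OpenAI-compatible server
    openai_api_key="not-needed",                 # most local servers ignore the key
    model="my-function-calling-model",           # model tuned for tool/function calls
)
```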
```python
def to_pydantic(self, current_attempt=1):
    """Convert text to a Pydantic model."""
    try:
        # Hard-coded branch: only "real" OpenAI models take the
        # instructor-based (function-calling) conversion path.
        if self._is_gpt(self.llm):
            return self._create_instructor().to_pydantic()
        else:
            return self._create_chain().invoke({})
    except Exception as e:
        # Retry until the configured number of attempts is exhausted.
        if current_attempt < self.max_attemps:
            return self.to_pydantic(current_attempt + 1)
        return ConverterError(
            f"Failed to convert text into a pydantic model due to the following error: {e}"
        )

def _is_gpt(self, llm) -> bool:
    # Anything with a custom openai_api_base (e.g. a local OpenAI-compatible
    # server) is treated as "not GPT" and never gets the tool-call path.
    return isinstance(llm, ChatOpenAI) and llm.openai_api_base is None
```
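One possible shape for making this configurable, as a minimal sketch only and not crewAI's actual API: let the caller state explicitly that their OpenAI-compatible model supports tool calls, and keep the current heuristic as the fallback. The `should_use_function_calling` helper and `force_function_calling` parameter below are hypothetical names.

```python
from typing import Any, Optional


def should_use_function_calling(llm: Any, force_function_calling: Optional[bool] = None) -> bool:
    """Decide whether the instructor/function-calling path should be used.

    An explicit flag from the caller wins; otherwise fall back to the current
    heuristic (a ChatOpenAI instance pointed at the official OpenAI endpoint).
    """
    if force_function_calling is not None:
        return force_function_calling
    return (
        llm.__class__.__name__ == "ChatOpenAI"
        and getattr(llm, "openai_api_base", None) is None
    )


# Usage: a local OpenAI-compatible model opts in to the tool-call path.
# use_tools = should_use_function_calling(local_llm, force_function_calling=True)
```

`to_pydantic()` could then branch on a predicate like this instead of the hard-coded `_is_gpt()` check.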
Hi, I have the same problem. Have you found a solution yet?
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.