ollama-python
Thinking with Structured Output does not work
I updated to the latest version to use the new thinking mode along with structured outputs and ran into an error. I typically make the call using:
response = client.chat(
    model=model,
    messages=messages,
    format=output_model.model_json_schema(),
    think=True,  # added this recently
    options={"temperature": 0, "num_ctx": 16384},
)
and process the output using:
output = output_model.model_validate_json(response.message.content)
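For context, here is a roughly self-contained version of the above. The VTEResult model, the "qwen3" model name, and the prompt are placeholder assumptions rather than my exact setup; only the format/think/options usage of client.chat mirrors what I actually run.

from ollama import Client
from pydantic import BaseModel

class VTEResult(BaseModel):  # hypothetical stand-in for output_model
    vte_present: str
    reason_for_answer: str

client = Client()  # assumes a local Ollama server on the default port

response = client.chat(
    model="qwen3",  # assumed thinking-capable model; substitute your own
    messages=[{"role": "user", "content": "Does this radiology report show VTE? ..."}],
    format=VTEResult.model_json_schema(),
    think=True,
    options={"temperature": 0, "num_ctx": 16384},
)

print(response.message.thinking)  # expected: a reasoning trace; observed: None
output = VTEResult.model_validate_json(response.message.content)  # fails on the doubled "{"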
This still works when think is omitted; however, with it added and set to True, I get the following output:
role='assistant' content='{\n\n{\n\n"vte_present": "uncertain",\n"reason_for_answer": "Positive study for acute upper extremity nonocclusive DVT in the axillary vein (deep vein) but evidence of occlusive superficial venous thrombus in the cephalic or basilic veins. Since superficial venous thrombosis is not considered VTE, and there\'s a possibility that the axillary vein finding might be chronic or uncertain, it should be classified as uncertain."\n}\n\n ' thinking=None images=None tool_calls=None
Notice the double opening curly brace and that thinking is None. Please let me know if I can provide any other details.
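In the meantime, a rough workaround sketch (not a real fix) is to collapse the duplicated opening brace before validation; the extract_json_object helper below is hypothetical and only handles this specific symptom.

import re

def extract_json_object(text: str) -> str:
    # Collapse a doubled object start like "{\n\n{" into a single "{";
    # a well-formed single object is left unchanged.
    return re.sub(r"^(\{\s*)+\{", "{", text.lstrip())

output = output_model.model_validate_json(extract_json_object(response.message.content))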
I got the same problem, and I think we need to add a test function with the think parameter! Something along the lines of the sketch below.
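This is only a hedged sketch of the kind of test I mean: it assumes a live Ollama server and a pulled thinking-capable model ("qwen3" is just an example name), so it would be an integration-style test rather than one matching the repository's existing mocked tests.

import json

from ollama import Client

SCHEMA = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
}

def test_chat_think_with_structured_format():
    client = Client()
    response = client.chat(
        model="qwen3",  # assumed model name
        messages=[{"role": "user", "content": "Reply with a short answer field."}],
        format=SCHEMA,
        think=True,
        options={"temperature": 0},
    )
    # The content should be exactly one valid JSON object matching the schema...
    parsed = json.loads(response.message.content)
    assert "answer" in parsed
    # ...and the thinking trace should be populated, not None.
    assert response.message.thinking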