Cole McIntosh

Results: 14 comments of Cole McIntosh

@dailydaniel I'd be happy to take this on - will provide updates here 👍

@dailydaniel I *think* Mistral's API is OpenAI-compatible, so you should be able to use the OpenAI provider with your Mistral details:

```
export OPENAI_API_KEY="your-mistral-api-key"
export OPENAI_API_BASE="https://api.mistral.ai/v1"
```

@dailydaniel I just tried it and can confirm it is not compatible; I'll try to create a PR to add Mistral as a provider
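A minimal sketch of what calling Mistral through a native litellm provider could look like once support lands. The `mistral/mistral-large-latest` model name and the prompt are illustrative assumptions, not confirmed values; the API call only runs if `MISTRAL_API_KEY` is set.

```python
import os

# Request parameters for litellm's completion().
# NOTE: the model name below is an illustrative assumption.
params = {
    "model": "mistral/mistral-large-latest",
    "messages": [{"role": "user", "content": "Say hello in one word."}],
}

# Only attempt the network call when credentials are available.
if os.environ.get("MISTRAL_API_KEY"):
    from litellm import completion  # requires `pip install litellm`
    response = completion(**params)
    print(response.choices[0].message.content)
```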

@zhangela you're close, but the issue is with the `thinking` parameter's format (@comadan good catch), plus some required parameters are missing (`tool_choice` and `max_tokens`). The correct format is:

```python
thinking={"type": "enabled",...
```
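A fuller sketch of the call described above, assuming Anthropic's extended-thinking format of `{"type": "enabled", "budget_tokens": N}`; the model name, budget, and prompt are illustrative, and `max_tokens` must exceed the thinking budget. The call only fires if `ANTHROPIC_API_KEY` is set.

```python
import os

# Parameters for litellm's completion() with extended thinking enabled.
# Model name and token values are illustrative assumptions.
params = {
    "model": "anthropic/claude-3-7-sonnet-20250219",
    "messages": [{"role": "user", "content": "What is 27 * 43?"}],
    "max_tokens": 2048,  # must be larger than the thinking budget
    "thinking": {"type": "enabled", "budget_tokens": 1024},
}

# Only attempt the network call when credentials are available.
if os.environ.get("ANTHROPIC_API_KEY"):
    from litellm import completion
    response = completion(**params)
    print(response.choices[0].message.content)
```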

@ishaan-jaff it seems like `response_format` is using tool calling under the hood and forcing a tool call; maybe for thinking models the tool choice should default to auto?
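A sketch of the suggested workaround: pass `tool_choice="auto"` explicitly alongside `response_format` rather than letting the forced tool call conflict with thinking. Whether this combination is accepted is exactly what's under discussion; model name and values are illustrative, and the request only runs if `ANTHROPIC_API_KEY` is set.

```python
import os

# Parameters combining response_format with an explicit auto tool choice.
# All concrete values here are illustrative assumptions.
params = {
    "model": "anthropic/claude-3-7-sonnet-20250219",
    "messages": [{"role": "user", "content": "Return a JSON object with a 'greeting' key."}],
    "response_format": {"type": "json_object"},
    "tool_choice": "auto",  # instead of a forced tool call
    "thinking": {"type": "enabled", "budget_tokens": 1024},
    "max_tokens": 2048,
}

# Only attempt the network call when credentials are available.
if os.environ.get("ANTHROPIC_API_KEY"):
    from litellm import completion
    print(completion(**params).choices[0].message.content)
```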

@a-rbsn the code below is working for me - can you confirm whether you have a different setup?

```python
from litellm import completion

response = completion(
    model="xai/grok-2-latest",
    messages=[
        {"role": "user",...
```
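For reference, a self-contained version of that snippet with the truncated parts filled in by an illustrative prompt (the message content here is an assumption, not the original comment's text); it calls xAI only if `XAI_API_KEY` is set.

```python
import os

# Parameters for the xAI Grok call shown in the comment above;
# the prompt text is an illustrative stand-in for the truncated original.
params = {
    "model": "xai/grok-2-latest",
    "messages": [{"role": "user", "content": "Hello, who are you?"}],
}

# Only attempt the network call when credentials are available.
if os.environ.get("XAI_API_KEY"):
    from litellm import completion
    response = completion(**params)
    print(response.choices[0].message.content)
```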

@yhenon good catch! https://github.com/BerriAI/litellm/pull/9286

@a-rbsn yes, this is a separate issue - can you open an issue for it, along with a snippet of the code that led to it?

@cauchy221 I was able to use Claude 3.7 Sonnet w/ thinking on a proxy server using this code:

```python
import litellm

response = litellm.completion(
    api_key="sk-************",  # The master_key from config...
```
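A sketch of the proxy-call shape described above, under stated assumptions: the `litellm_proxy/` model alias, the local base URL, and the prompt are all hypothetical, and the key (the proxy's `master_key`) is read from an assumed `LITELLM_PROXY_API_KEY` env var rather than hard-coded. The call only runs if that variable is set.

```python
import os

# Parameters for routing a thinking-enabled request through a litellm proxy.
# Model alias, api_base, and prompt are illustrative assumptions.
params = {
    "model": "litellm_proxy/claude-3-7-sonnet",
    "messages": [{"role": "user", "content": "Think step by step: what is 12 + 30?"}],
    "api_base": "http://0.0.0.0:4000",
    "thinking": {"type": "enabled", "budget_tokens": 1024},
    "max_tokens": 2048,
}

# Only attempt the network call when the proxy key is available.
if os.environ.get("LITELLM_PROXY_API_KEY"):
    from litellm import completion
    response = completion(api_key=os.environ["LITELLM_PROXY_API_KEY"], **params)
    print(response.choices[0].message.content)
```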