[FEATURE] Support for Gemini 2.5?
Feature Area
Core functionality
Is your feature request related to an existing bug? Please link it here.
N/A
Describe the solution you'd like
Add support for the Gemini 2.5 model family (for example, gemini-2.5-flash).
Describe alternatives you've considered
No response
Additional context
No response
Willingness to Contribute
Yes, I'd be happy to submit a pull request
@vmsaif @joaomdmoura I'm working on this feature improvement and will also try to integrate the GPT-4.1 models that were released yesterday.
Could you assign this task to me?
Should I add all of the models? Also, the Gemma 3 models are not yet live on Groq.
Hi @LuniaKunal, thank you so much for collaborating.
Take a look at this PR, which you can use as a reference.
That PR creates a new test file, but there is already an llm_test.py. I am adding the test case to the existing llm_test.py instead.
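For context, here is a rough sketch of the kind of test case I have in mind. The model id, the `LLM` constructor usage, and the VCR cassette setup are assumptions on my part and should be aligned with the conventions already used by the Gemini tests in llm_test.py:

```python
# Sketch of a proposed test case for llm_test.py.
# The model id "gemini/gemini-2.5-flash" and the cassette decorator are
# assumptions; they should follow whatever the existing Gemini tests use.
import pytest

from crewai import LLM


@pytest.mark.vcr(filter_headers=["authorization"])
def test_gemini_2_5_flash_call():
    llm = LLM(model="gemini/gemini-2.5-flash")
    result = llm.call("Return exactly the word: hello")
    assert isinstance(result, str)
    assert len(result) > 0
```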
@lucasgomide can you review?
This issue is stale because it has been open for 30 days with no activity. Remove the stale label or leave a comment, or it will be closed in 5 days.
This issue was closed because it was stale for 5 days with no activity.
Are we planning to support gemini-2.5-flash and the other Gemini 2.5 models?
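If it helps scope the request, here is a minimal usage sketch of what I'd expect once support lands, assuming the 2.5 models follow the same `gemini/<model-id>` provider-prefix convention as the Gemini models that are already supported:

```python
# Hypothetical usage once Gemini 2.5 is supported; the model id below is
# an assumption based on the existing "gemini/<model-id>" convention.
from crewai import Agent, LLM

llm = LLM(
    model="gemini/gemini-2.5-flash",  # assumed identifier
    temperature=0.7,
)

agent = Agent(
    role="Researcher",
    goal="Answer questions concisely and accurately",
    backstory="A helpful research assistant.",
    llm=llm,
)
```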