lighteval
Improved stability of litellm models for reasoning models.
Hey @JoelNiklaus, since you have made most of the commits around LiteLLM, I have a few questions/requests:
- Why not expose the entire LiteLLM config as a parameter or something similar?
- I'm actually looking to run open-r1 evals using, for instance, models deployed at DeepInfra with temp=0. Is that possible right now?
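As a sketch of what such a run involves (the model id below is a hypothetical example, not from this thread), these are the kinds of request parameters one would pass to `litellm.completion()` for a DeepInfra-hosted model with greedy decoding:

```python
# Hypothetical request parameters for litellm.completion(); LiteLLM routes
# model names prefixed with "deepinfra/" to the DeepInfra backend.
request = {
    "model": "deepinfra/meta-llama/Meta-Llama-3-70B-Instruct",  # example model id, assumption
    "messages": [{"role": "user", "content": "What is 2 + 2?"}],
    "temperature": 0,    # temp=0 for deterministic, reproducible evals
    "max_tokens": 256,
}

# With litellm installed and DEEPINFRA_API_KEY set in the environment,
# this would be invoked as:
#   import litellm
#   response = litellm.completion(**request)
```

Whether lighteval's litellm model wrapper lets you thread `temperature` (and other sampling parameters) through to this call is exactly the configuration question raised above.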
Hi @satpalsr,
- When I integrated litellm, I kept it very similar to the existing openai_model. But I agree with you that we may want to make more of the config tunable. I am sure the maintainers are open to it if you want to open a PR for that :)
- I am not familiar with DeepInfra, but since it seems to be supported by litellm, it should work out of the box.
@JoelNiklaus Thanks. I'll drop it as a separate issue for anyone to pick up. For the time being, I just modified the OpenAIClient code and got my evals done.