Aaron Mihalik
> hi @adinin, I believe the issue is related to the type of the `bos_token` and `eos_token` in the `tokenizer_config.json`. Currently TGI expects the tokens to be of type string...
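To illustrate the point above: some tokenizers serialize `bos_token`/`eos_token` in `tokenizer_config.json` as objects rather than plain strings. A minimal sketch of a workaround, assuming the object form carries the literal token under a `content` key (the `normalize_special_tokens` helper name is hypothetical, not part of TGI):

```python
import json

def normalize_special_tokens(config: dict) -> dict:
    """Reduce object-form bos_token/eos_token entries to their
    "content" string, the form described as expected above."""
    for key in ("bos_token", "eos_token"):
        value = config.get(key)
        if isinstance(value, dict) and "content" in value:
            config[key] = value["content"]
    return config

# Example config with one object-form token and one string token.
cfg = {
    "bos_token": {"content": "<s>", "lstrip": False},
    "eos_token": "</s>",
}
print(json.dumps(normalize_special_tokens(cfg)))
```

Running this over a loaded `tokenizer_config.json` and writing the result back would be one way to coerce both tokens to strings before serving.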
I'm super excited about this functionality, but I'm new to litellm. Also, I think I have the same use case as @peterz3g: I want to use a non-OpenAI model with...
+1 Thank you!
Same as #820