Nischal Jain
hey @eleksis can you mark this as closed if it's done?
For me, when O1 finishes a task, it can't handle follow-ups because of a similar issue. The first instruction works well.
Hi @marcklingen, I have been liking the platform so far, but the lack of support for base64 images and large events is a big bummer. Any timeline on when these...
any updates on this? the 4B Intern model is killer for its size! would love to see it supported in llama.cpp
Following. Were you able to resolve this?
@Jackwaterveg were you able to figure this out?
Prior discussion #402
[Feature]: Ollama Built-in Support for Structured Output (i.e. for models that don't yet support it)
following this. It would be great to have structured-output support for providers that don't yet offer it built directly into litellm, rather than combining it with libraries like instructor.
That'd be awesome. Thanks!