Add other Ollama visual LLMs
What does this PR do?
Adds bakllava, llama3-llava, llava:13b, etc.
Requirement/Documentation
https://ollama.com/library?q=llava
Type of change
- [x] New feature (non-breaking change which adds functionality)
- [x] This change requires a documentation update
Mandatory Tasks
- [x] Make sure you have self-reviewed the code. A decent-size PR without self-review might be rejected. Before submitting this PR, make sure you run the tests with `evaluate.py`.
@ketsapiwiq this looks like a great PR. Sorry I never merged it. I'm revisiting this project. I've been focused on other priorities.
If you want to resolve the conflicts and confirm you still want this merged in, I can test and merge it in after.
Hi, yes, I can rebase it, but my rudimentary `if "llava" in model` check doesn't work anymore, as a lot of models now do vision without having "llava" in their name, see: https://ollama.com/search?q=vision
Edit: maybe just pull llama3.2-vision by default, as it's currently one of the best; a rough sketch of what I mean is below.
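Something along these lines for the default pull (the `ensure_model` name is made up, assuming the `ollama` CLI is on PATH):

```python
import subprocess

# Suggested default from this thread; any Ollama vision model tag works here.
DEFAULT_VISION_MODEL = "llama3.2-vision"

def ensure_model(model: str = DEFAULT_VISION_MODEL) -> None:
    # `ollama pull` downloads the model if it is missing and returns quickly
    # when it is already available locally.
    subprocess.run(["ollama", "pull", model], check=True)
```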
Chiming in on this PR since I was about to do the same work.
I think it's better if we just let Ollama be the interface for the function by explicitly removing references to llava, so it's easier to use other models without requiring a code change. For example, I'm currently using llama3.2-vision, and I shouldn't have to add it to the if statement just to be able to run it.
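Roughly the direction I mean (the `describe_image` helper is hypothetical, assuming Ollama's standard `/api/generate` endpoint on the default local port):

```python
import base64
import requests

DEFAULT_VISION_MODEL = "llama3.2-vision"  # suggested default from this thread

def describe_image(image_path: str, prompt: str,
                   model: str = DEFAULT_VISION_MODEL) -> str:
    # Base64-encode the image as Ollama's generate API expects.
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    # No name-based "llava" check: Ollama itself decides whether the
    # requested model supports vision, so new models need no code change.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": model,
            "prompt": prompt,
            "images": [image_b64],
            "stream": False,
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```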
Here, so by default it'll try with Ollama :)
@ketsapiwiq, great! Is the PR ready now? I'll take a look this week.
Yes! Thank you :)