Daniel Plominski
Dear Ollama team, please add support for Llama 3-based models such as https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual. Thanks in advance and best regards, Daniel Plominski
Hello Fosowl, first of all, thank you for your work and for the “AgenticSeek” project. We are always interested in on-premise hosted solutions, and I took a closer look at...
### 🥰 Feature Description

```json
{
  "error": {
    "message": "Model incompatible request arguments supplied: frequency_penalty, presence_penalty, temperature, top_p",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
```

### 🧐 Proposed Solution

fix...
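One possible workaround for the error above, sketched here with a hypothetical helper (not part of any library), is to strip the rejected sampling arguments from the request payload before sending it to the model:

```python
# Some models reject optional sampling arguments such as frequency_penalty,
# presence_penalty, temperature, and top_p. This hypothetical helper drops
# those keys from an OpenAI-style request payload before it is sent.

UNSUPPORTED_ARGS = {"frequency_penalty", "presence_penalty", "temperature", "top_p"}

def strip_unsupported(payload: dict) -> dict:
    """Return a copy of the request payload without the rejected arguments."""
    return {k: v for k, v in payload.items() if k not in UNSUPPORTED_ARGS}

request = {
    "model": "some-model",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.7,
    "top_p": 0.9,
}
clean = strip_unsupported(request)
print(sorted(clean))  # only the keys the model accepts remain
```

Whether stripping is acceptable (versus failing loudly) depends on how much the caller relies on those sampling parameters.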
@atupem Since Ollama doesn’t support “supports_function_calling,” we’ve switched to vLLM. However, our current parameters/configuration don’t work with ByteBot. Could you help us? **vLLM Server** (Docker / Config) - Proxmox VM...
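For reference, a minimal sketch of launching vLLM’s OpenAI-compatible server with tool calling enabled; the model name and tool-call parser below are assumptions for a Llama 3 model, so check the vLLM documentation for the parser matching your model:

```shell
# Hypothetical vLLM launch via the official Docker image.
# --enable-auto-tool-choice and --tool-call-parser turn on
# OpenAI-style function calling; the parser must match the model family.
docker run --gpus all -p 8000:8000 vllm/vllm-openai:latest \
  --model meta-llama/Meta-Llama-3-8B-Instruct \
  --enable-auto-tool-choice \
  --tool-call-parser llama3_json
```

A client such as ByteBot would then point at `http://<host>:8000/v1` as an OpenAI-compatible endpoint.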