Ollama forced to use local process and has no API override [ISSUE]
Ollama running remotely via its API should NEVER have that API sitting exposed; it's a security risk. Why wouldn't you have the same options in the settings panel to set it manually?
I don't see this as an issue because ollama and devika aren't built for security; it's counterproductive to spend energy on over-engineering for security issues. You shouldn't be using something like this in a setting that needs to be secured.
That's a ridiculous statement. Forcing local use only means anyone who runs Ollama on a separate machine has to put in extra time when it could just work. It's not as if adding it would have been a sinkhole: the options are already there and were built for the other endpoints. OpenAI-compatible endpoints exist; it would have been less work to code it with this in mind from the start.
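To illustrate how small the ask is, here is a minimal sketch (hypothetical names, not devika's actual settings code) of a settings entry where the base URL defaults to the local daemon but can be overridden, and an API-key input box simply adds a bearer header when filled in:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class OllamaSettings:
    """Hypothetical settings entry: defaults to the local Ollama
    daemon, but can point at any OpenAI-compatible remote endpoint."""
    base_url: str = "http://127.0.0.1:11434"
    api_key: Optional[str] = None  # value from an API-key input box

    def request_headers(self) -> dict:
        # Only attach credentials when the user supplied a key,
        # so the purely local default keeps working unchanged.
        headers = {"Content-Type": "application/json"}
        if self.api_key:
            headers["Authorization"] = f"Bearer {self.api_key}"
        return headers


# Local default: no Authorization header is sent.
local = OllamaSettings()

# Remote override: same code path, just different settings values
# (hostname and key below are placeholders).
remote = OllamaSettings(
    base_url="https://ollama.example.com",
    api_key="sk-example",
)
```

The point of the sketch is that the remote and local cases share one code path; the settings panel only needs two extra fields.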
And finally, just because something isn't primarily focused on security doesn't mean it should just ignore it... There is a reason many projects in this space are adding security features. But by all means, keep running your public-facing APIs with no security.
It's literally just allowing an API key input box... I'm still baffled by that comment.
Ollama running remotely via API should NEVER have the API sitting exposed; it's a security risk.
Use Open WebUI to encapsulate the Ollama API. Open WebUI provides a secure API for remote Ollama access.
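Open WebUI is one way to avoid exposing the raw port. Another minimal sketch of the same idea (hostnames, certificate paths, and the htpasswd file below are placeholders) is a plain reverse proxy with TLS and basic auth in front of Ollama's default port 11434:

```nginx
# Minimal sketch: terminate TLS and require auth in front of Ollama
# instead of exposing port 11434 directly to the network.
server {
    listen 443 ssl;
    server_name ollama.example.com;

    ssl_certificate     /etc/ssl/certs/ollama.example.com.pem;
    ssl_certificate_key /etc/ssl/private/ollama.example.com.key;

    location / {
        auth_basic           "Ollama API";
        auth_basic_user_file /etc/nginx/ollama.htpasswd;
        # Forward authenticated requests to the local Ollama daemon.
        proxy_pass           http://127.0.0.1:11434;
        proxy_set_header     Host $host;
    }
}
```

With a setup like this, the client side only needs a base URL and a credential, which is exactly what a settings-panel override would supply.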
@cafeTechne don't lose time with this. I'm using GPT4FREE + Ollama + Meta-API, all encapsulated with Open WebUI, with user registration, email control, and JWT token renewal... This dude doesn't know anything about LLMs. No local application is exposure-ready; the user needs to implement ALL the safety features.