oxaronick
This would be nice to have for my use case: running Ollama with a single model in production for many users.
I would have said the server value should override the client value. If I'm running Ollama locally with multiple models, I can just not set the server param and then...
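Something like this is the precedence I mean (a minimal sketch; `OLLAMA_PINNED_MODEL` and `resolve_model` are hypothetical names for illustration, not part of Ollama's API):

```python
# Sketch: if the server pins a model, it wins; otherwise the client's
# requested model is honored. Set OLLAMA_PINNED_MODEL on a production
# server to force one model for all users; leave it unset locally.

import os

PINNED_MODEL = os.environ.get("OLLAMA_PINNED_MODEL")  # hypothetical env var

def resolve_model(client_requested: str | None) -> str | None:
    """Server-side value overrides whatever the client sent."""
    return PINNED_MODEL if PINNED_MODEL else client_requested

# With the env var unset this returns "codellama:7b"; with it set,
# every client gets the pinned model regardless of what they asked for.
print(resolve_model("codellama:7b"))
```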
That would also work nicely.
If Zed supported ctrl-tab to switch between recent tabs the way VS Code and Firefox do, that would be great.
@hemangjoshi37a I get where you're coming from, but a button that pulled new code and ran it on the fly wouldn't work for everyone. If that were implemented, there should...
That fixed it for me too. Thanks, @ahmedJaafari.
Ollama support would be great.
Changed my mind on this: Ollama doesn't handle multiple clients well, but Fauxpilot as it stands does a great job at that.
> Can this also be made to work with any local server running an API similar to OpenAI's API? Specifically, I'm interested in using LM Studio.

Same here, I'm using...
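For what it's worth, anything that speaks OpenAI's API can already be pointed at LM Studio by overriding the base URL. A minimal sketch with the official `openai` Python client, assuming LM Studio's local server is running on its default port 1234 (the model name is a placeholder for whatever model is loaded):

```python
# Point an OpenAI-style client at a local LM Studio server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint
    api_key="lm-studio",  # LM Studio ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio serves the currently loaded model
    messages=[{"role": "user", "content": "Hello from a local server!"}],
)
print(response.choices[0].message.content)
```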
> just integrate continue.dev please this will exponentially increase adoption as continue dev solves all of llm worries and works with all possible providers both local and cloud

Yeah, Continue...