Is there any chance to set up the plugin with ollama-copilot?
There are settings that allow it to work in VSCode:
```json
{
    "github.copilot.advanced": {
        "debug.overrideProxyUrl": "http://localhost:11437"
    },
    "http.proxy": "http://localhost:11435",
    "http.proxyStrictSSL": false
}
```
Is there any chance to connect it with your plugin? https://github.com/bernardo-bruning/ollama-copilot
No. It looks like https://github.com/bernardo-bruning/ollama-copilot uses OpenAI-compatible APIs. The only common part is that they both have "copilot" in their names.
https://packagecontrol.io/packages/OpenAI%20completion would be your choice imo.
Actually, isn't the entire point of ollama-copilot to translate Ollama's OpenAI-like API into something that the official/regular Copilot plugins can understand? The repo shows an example with the official VS Code plugin in any case, which wouldn't work if it actually exposed OpenAI APIs. And indeed, completions seem to work fine if I hack LSP-copilot only a little:
diff --git a/plugin/client.py b/plugin/client.py
index c72826f..9e995d4 100644
--- a/plugin/client.py
+++ b/plugin/client.py
@@ -191,12 +191,13 @@ class CopilotPlugin(NpmClientHandler):
if not proxy:
return None
parsed = urlparse(f"http://{proxy}")
+ strict_tls = settings.get("proxy_strict_tls")
return {
"host": parsed.hostname or "",
"port": parsed.port or 80,
"username": parsed.username or "",
"password": parsed.password or "",
- "rejectUnauthorized": True,
+ "rejectUnauthorized": bool(strict_tls) if strict_tls is not None else True
}
super().on_settings_changed(settings)
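For reference, here is what the patched logic produces for a proxy string like the one in my settings: the plugin prefixes a scheme so `urlparse` can split host and port. A minimal standalone sketch mirroring the patched `client.py` (the function name is mine, not the plugin's):

```python
from urllib.parse import urlparse

def proxy_settings(proxy, strict_tls=None):
    # Mirror the patched logic: prefix a scheme so urlparse splits host/port,
    # and fall back to strict TLS verification when the setting is absent.
    parsed = urlparse(f"http://{proxy}")
    return {
        "host": parsed.hostname or "",
        "port": parsed.port or 80,
        "username": parsed.username or "",
        "password": parsed.password or "",
        "rejectUnauthorized": bool(strict_tls) if strict_tls is not None else True,
    }

print(proxy_settings("127.0.0.1:11435", strict_tls=False))
# {'host': '127.0.0.1', 'port': 11435, 'username': '', 'password': '', 'rejectUnauthorized': False}
```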
The corresponding settings in Sublime:
```json
{
    "settings": {
        "completion_style": "phantom",
        "proxy": "127.0.0.1:11435",
        "proxy_strict_tls": false
    }
}
```
I didn't do any testing, but if that works with only tiny changes, feel free to just make a PR.
Unfortunately I cannot run any somewhat sizeable models for higher-quality completions, but it seems to work fairly well. A couple of things I noticed:

- Some models may emit additional markers besides those used for fill-in-middle, like `<file_separator>`. Since I don't think Copilot itself even supports that, LSP-copilot (and/or ollama-copilot) will of course not strip that stuff from the response either. Which is fine; that's not your problem but a user error in selecting the "wrong" model.
- Beyond that, the plugin seems to have a bit of trouble staying "signed in". Usually restarting either just the LSP server or Sublime entirely makes the status bar indicator revert to "Copilot has NOT been signed in", and I actually have to use "Copilot: Check Status" to get it working again. This doesn't happen with the actual Copilot. I'm not entirely sure whether that's a problem with LSP-copilot or just the proxy, though.
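As a workaround for the first point, stray markers could be stripped client-side before a completion is shown. A minimal sketch; the marker list and function name are my own illustrative assumptions, not part of LSP-copilot or ollama-copilot:

```python
# Hypothetical post-processing step: cut a completion at the first
# fill-in-middle control token that a model leaked into its output.
# The token list is an illustrative assumption, not exhaustive.
STRAY_MARKERS = ("<file_separator>", "<|file_separator|>", "<EOT>")

def strip_markers(completion: str) -> str:
    for marker in STRAY_MARKERS:
        idx = completion.find(marker)
        if idx != -1:
            completion = completion[:idx]
    return completion.rstrip()

print(strip_markers("return a + b\n<file_separator>garbage"))  # return a + b
```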
Do you still want me to make a PR, or should we perhaps wait a bit for someone else to try this first?
> packagecontrol.io/packages/OpenAI%20completion would be your choice imo.
Then I would probably just suggest this.
I was able to make it work following this guide: https://alex.kirk.at/2024/11/15/setting-up-a-local-ollama-copilot-via-lsp/
But only with codellama. I have played with the template and added a system prompt, but I'm not able to make it work with qwen2.5-coder or other models. Does anyone have any idea?
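One thing worth checking: codellama and qwen2.5-coder use different fill-in-middle tokens, so a template written for one won't produce usable completions with the other. To my knowledge Qwen2.5-Coder uses `<|fim_prefix|>`, `<|fim_suffix|>`, and `<|fim_middle|>`, so an Ollama Modelfile along these lines might be a starting point. This is an untested assumption on my part, not a verified fix:

```
FROM qwen2.5-coder:7b
# Fill-in-middle template using Qwen2.5-Coder's FIM tokens; {{ .Prompt }} and
# {{ .Suffix }} are filled from Ollama's prompt/suffix request fields.
TEMPLATE "<|fim_prefix|>{{ .Prompt }}<|fim_suffix|>{{ .Suffix }}<|fim_middle|>"
```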