Support for air-gapped installation
I want to run opencode in Kubernetes in an air-gapped environment that only has access to my model endpoint. I installed opencode in my Docker image using:

```dockerfile
RUN curl -fsSL https://opencode.ai/install | bash
```
But when I try to run opencode in my pod after deployment, I get the following error:
```
> OPENCODE_CONFIG=claude.jsonc opencode run "what is 2+2"
INFO 2025-08-24T16:33:14 +0ms service=plugin [email protected] loading plugin
INFO 2025-08-24T16:33:14 +1ms service=bun pkg=opencode-copilot-auth version=0.0.2 installing package using Bun's default registry resolution
INFO 2025-08-24T16:33:14 +0ms service=bun cmd=["/root/.opencode/bin/opencode","add","--force","--exact","--cwd","/root/.cache/opencode","[email protected]"] cwd=/root/.cache/opencode running
INFO 2025-08-24T16:33:14 +26ms service=bun code=1 stdout=bun add v1.2.19 (aad3abea)
stderr=Resolving dependencies
Resolved, downloaded and extracted [6]
error: ConnectionRefused downloading package manifest opencode-copilot-auth
done
ERROR 2025-08-24T16:33:14 +6ms service=default pkg=opencode-copilot-auth version=0.0.2 name=BunInstallFailedError message=BunInstallFailedError cause=Error: Command failed with exit code 1 fatal
```
I would love a config option or environment variable that prevents network access of any kind, except the model API call. Thanks.
This issue might be a duplicate of existing issues. Please check:
- #1962: Config option for disabling autofetch of lsp servers - addresses similar concerns about preventing network access for downloading remote binaries in environments where network access should be restricted
Feel free to ignore if none of these address your specific case.
yeah we should do some environment var for sure
I got a very similar error message, started digging through the code, and found these:

```shell
OPENCODE_DISABLE_DEFAULT_PLUGINS=true
OPENCODE_DISABLE_LSP_DOWNLOAD=true
```
But you still get stuck with opencode trying to install provider sdks. Would love to see a workaround for this.
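For anyone else hitting this, wiring those two variables into the image could look like this (a sketch, assuming both variables are honored as plain environment variables at startup; per the above, they stop the plugin and LSP downloads but not the provider SDK install):

```dockerfile
# disable the two download paths found in the code above
# (assumption: opencode reads these as plain environment variables at startup)
ENV OPENCODE_DISABLE_DEFAULT_PLUGINS=true \
    OPENCODE_DISABLE_LSP_DOWNLOAD=true
```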
it is a little more complicated than just a var depending on the models you use since we don't ship all sdks out of the box
I think for offline use it would make sense to just include the OpenAI-compatible SDK by default.
I managed to get it working by downloading the dependencies during docker build, but I was not able to reproduce that with current versions.
Perhaps an interim solution could be a flag that prevents the download, like the LSP one, since that would let us build the image (or otherwise download the packages) separately from running.
But I agree it would be nice if you could get a standalone airgapped version with all dependencies included.
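In case it helps anyone, the build-time pre-download I had working looked roughly like this (a sketch reconstructed from the `opencode add` invocation and the `pkg`/`version` fields in the error log above; whether current versions still read this cache is exactly what I could not reproduce):

```dockerfile
# run during `docker build`, while the network is still reachable:
# warm opencode's package cache so nothing needs fetching at runtime.
# the binary path, cache path, and plugin name/version are taken from
# the error log above and may differ in your setup.
RUN mkdir -p /root/.cache/opencode && \
    /root/.opencode/bin/opencode add --force --exact \
        --cwd /root/.cache/opencode opencode-copilot-auth@0.0.2
```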
I can live without the other dependencies (LSP, copilot/anthropic auth). But the lack of bundled provider SDKs is legitimately a blocker keeping me from using opencode in certain environments. A shame, since I do enjoy using it where I can.
@rekram1-node do you know if there's interest from the maintainers in potentially baking in the SDKs?
I will ask
@jamestrew which providers do you need? you could install the providers yourself and then install opencode, but perhaps all providers should be bundled out of the box
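If you want to try that route, something like this at image-build time might work (a sketch: the `@ai-sdk/*` names are the public npm packages for these providers, the cache path is borrowed from the error log above, and whether opencode resolves SDKs from a pre-populated cache is an assumption to verify):

```dockerfile
# pre-install the provider SDKs while the build still has network access,
# into the cache directory seen in the error log above (assumption: opencode
# resolves provider SDKs from the same cache it installs plugins into)
RUN mkdir -p /root/.cache/opencode && cd /root/.cache/opencode && \
    bun add --exact @ai-sdk/anthropic @ai-sdk/openai @ai-sdk/openai-compatible

# then install opencode itself
RUN curl -fsSL https://opencode.ai/install | bash
```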
> I managed to get it working by downloading the dependencies during docker build, but I was not able to recreate it in current versions.
> Perhaps an interim solution could be having a flag that prevents the download like the LSPs as it would allow us to build the image or otherwise download the packages separately from running.
> But I agree it would be nice if you could get a standalone airgapped version with all dependencies included.
Trying to do this as well - any pointers?
@rekram1-node I think bundling "@ai-sdk/openai-compatible" into builds would make a lot of sense for an offline-only kind of mode, mostly since in that kind of capacity you're only going to be hitting models that are being served locally.
@jpsullivan yeah I mentioned this to dax, we may just bundle all the providers because only a few are really used:
- anthropic
- openai
- openai compatible (most fall under this category)
that accounts for 99% of usage
That would be perfect!
To be clear... other than trying to download these provider SDKs, is it possible to run fully locally? I'm trying to make sure no telemetry is sent upstream when I'm running a local model.
@dnollshai there is no telemetry sent upstream
+1 I am currently stuck on this issue, where basically every agent I try lately (CLI or plugin) tries to connect and fails due to lack of connectivity.
This is indeed the package I think needs to be there: `@ai-sdk/openai-compatible`. Then copilot and all the models.dev fetches need to be disabled somehow.
I have not gotten there yet, but I'm pretty sure the issue will come back once I start enabling MCPs via npm, though I think those are easier to install globally in the image, as suggested above.
gonna start bundling in the providers by default in next release or two
Another comment, if I may: opencode also gets stuck for quite some time trying to fetch models.dev. Perhaps also add a way to skip this?
I am trying to set the provider list to empty in the JSON config for opencode, anthropic, and openai. Not sure how to disable the models.dev fetch yet.
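For reference, what I'm experimenting with looks roughly like this (a sketch: the key names follow opencode's config shape as I understand it, the `local` provider name, model name, and `baseURL` are placeholders for your own endpoint, and I have not confirmed this skips the models.dev fetch):

```jsonc
// claude.jsonc — point opencode at a local OpenAI-compatible endpoint only
// (assumption: this config shape; all names/URLs below are placeholders)
{
  "provider": {
    "local": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://my-model-endpoint:8080/v1"
      },
      "models": {
        "my-model": {}
      }
    }
  }
}
```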
I believe this behavior needs to be removed:
https://github.com/sst/opencode/blob/598d6d00e41d39767d46359b4238286f360ddc6c/packages/opencode/test/bun.test.ts#L42
> gonna start bundling in the providers by default in next release or two
We'd love to use opencode for our airgapped company setup to have an alternative to claude. Any update on bundling? @thdxr @rekram1-node